Finger Search in the Implicit Model


Finger Search in the Implicit Model

Gerth Stølting Brodal, Jesper Sindahl Nielsen, Jakob Truelsen
MADALGO*, Department of Computer Science, Aarhus University, Denmark. {gerth,jasn,jakobt}@madalgo.au.dk

* MADALGO: Center for Massive Data Algorithmics, a Center of the Danish National Research Foundation.

Abstract. We address the problem of creating a dictionary with the finger search property in the strict implicit model, where no information is stored between operations, except the array of elements. We show that for any implicit dictionary supporting finger searches in q(t) = Ω(log t) time, the time to move the finger to another element is Ω(q^{-1}(log n)), where t is the rank distance between the query element and the finger. We present an optimal implicit static structure matching this lower bound. We furthermore present a near optimal implicit dynamic structure supporting search, change-finger, insert, and delete in times O(q(t)), O(q^{-1}(log n) log n), O(log n), and O(log n), respectively, for any q(t) = Ω(log t). Finally we show that the search operation must take Ω(log n) time for the special case where the finger is always changed to the element returned by the last query.

1 Introduction

We consider the problem of creating an implicit dictionary [4] that supports finger search. A dictionary is a data structure storing a set of elements with distinct comparable keys such that an element can be located efficiently given its key. It may also support predecessor and successor queries, where given a query key k it must return the element with the greatest key less than k or the element with the smallest key greater than k. A dynamic dictionary also supports insertion and deletion of elements. A dictionary has the finger search property if the time for searching is dependent on the rank distance t between a specific element, called the finger, and the query key k. In the static case O(log t) search can be achieved by exponential search on a sorted array of elements starting at the finger. Dynamic finger search data structures have been widely studied; e.g. some of the famous dynamic structures that support finger searches are splay trees, randomized skip lists and level-linked (2-4)-trees. These all support finger search in O(log t) time, respectively in the amortized, expected and worst case sense. For an overview of data structures that support finger search see [3]. We consider two variants of finger search structures. The first variant is the finger search dictionary, where the search operation also changes the finger to the returned element. The second variant is the change-finger dictionary,

where the change-finger operation is separate from the search operation. We consider the two problems in the strict implicit model, where we are only allowed to explicitly store the elements and the number of elements n between operations, as defined in [1, 6]. Note that the static sorted array solution does not fit into this model, since we are not allowed to use additional space to store the index of f between operations. Other papers allow O(1) additional words [4, 5, 8]; we call this the weak implicit model. In both models almost all structure has to be encoded in the order of the elements. In either model the only allowed operations on elements are comparisons and swaps. As there is no agreement on the exact definition of the implicit model, it is interesting to study the limits of the strict model. We show that for a static dictionary in the strict model, if we want a search time of O(log t), then change-finger must take time Ω(n^ε), while in the weak model a sorted array achieves O(1) change-finger time. Much effort has gone into finding a worst case optimal implicit dictionary. Among the first, [7] gave a dictionary supporting insert, delete and search in O(log^2 n) time. In [5] an implicit B-tree is presented, and finally in [4] a worst case optimal and cache-oblivious dictionary is presented. To prove our dynamic upper bounds we use the movable implicit dictionary presented in [2], supporting insert, delete, predecessor, successor, move-left and move-right. The operation move-right moves the dictionary laid out in cells i through j to cells i+1 through j+1, and move-left moves the dictionary in the other direction.

Preliminaries. A common implicit data structure technique is the pair encoding of bits. When we have two distinct consecutive elements x and y, then they encode a 1 if x > y and a 0 otherwise. The running time of the search operation is hereafter denoted by q(t, n).
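The pair-encoding technique can be sketched as follows. This is an illustrative sketch of ours, not code from the paper: a region of 2k distinct elements encodes a k-bit number in the relative order of its pairs, using only comparisons and swaps.

```python
def encode(a, value):
    """Encode `value` into array `a` of 2k distinct elements, in place.
    Pair i encodes bit i: an ascending pair encodes 0, a descending pair 1."""
    k = len(a) // 2
    for i in range(k):
        bit = (value >> i) & 1
        x, y = a[2 * i], a[2 * i + 1]
        # Swap the pair so its order matches the desired bit.
        if (bit == 0 and x > y) or (bit == 1 and x < y):
            a[2 * i], a[2 * i + 1] = y, x

def decode(a):
    """Read the number back out by comparing each pair."""
    k = len(a) // 2
    value = 0
    for i in range(k):
        if a[2 * i] > a[2 * i + 1]:
            value |= 1 << i
    return value

a = [3, 7, 1, 9, 4, 2, 8, 5]   # 8 distinct elements -> 4 encodable bits
encode(a, 11)
assert decode(a) == 11 and sorted(a) == [1, 2, 3, 4, 5, 7, 8, 9]
```

Note that encoding or decoding k bits costs O(k) comparisons and swaps, and the multiset of stored elements is unchanged, which is exactly what the implicit model requires.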
Throughout the paper we require that q(t, n) is non-decreasing in both t and n, that q(t, n) ≥ log t, and that q(0, n) < log(n/2). We define Z_q(n) = min{ t ∈ N | q(t, n) > log(n/2) }, i.e. Z_q(n) is the smallest rank distance t such that q(t, n) > log(n/2). Note that Z_q(n) ≤ n/2 (since by assumption q(t, n) ≥ log t), and if q is a function of only t, then Z_q is essentially equivalent to q^{-1}(log(n/2)). As an example, q(t, n) = (1/ε) log t gives Z_q(n) = (n/2)^ε, for 0 < ε ≤ 1. We require that for a given q, Z_q(n) can be evaluated in constant time, and that Z_q(n+1) − Z_q(n) is bounded by a fixed constant for all n. We will use set notation on a data structure when appropriate, e.g. |X| will denote the number of elements in the structure X, and e ∈ X will denote that the element e is in the structure X. Given two data structures or sets X and Y, we say that X < Y if and only if for all (x, y) ∈ X × Y we have x < y. We use d(e_1, e_2) to denote the rank distance between two elements, that is the difference of the indices of e_1 and e_2 in the sorted key order of all elements in the structure. At any time f will denote the current finger element and t the rank distance between this and the current search key.

Our results. In Section 2 we present a static change-finger implicit dictionary supporting predecessor in time O(q(t, n)) and change-finger in time O(Z_q(n) + log n), for any function q(t, n). Note that by choosing q(t, n) = (1/ε) log t, we get a search time of O(log t) and a change-finger time of O(n^ε) for any 0 < ε ≤ 1. In Section 3 we prove our lower bounds. First we prove (Lemma 1) that for any algorithm A on a strict implicit data structure of size n that runs in time at most τ, whose arguments are keys or elements from the structure, there exists a set X_{A,n} of at most O(2^τ) array entries, such that A touches only array entries from X_{A,n}, no matter the arguments to A or the content of the data structure. We use this to show that for any change-finger implicit dictionary with a search time of q(t, n), change-finger will take time Ω(Z_q(n) + log n) for some t (Theorem 1). We prove that for any change-finger implicit dictionary, search will take time at least log t (Theorem 2); a similar argument applies for predecessor and successor. This means that the requirement q(t, n) ≥ log t is necessary. We show that for any finger-search implicit dictionary, search must take at least log n time as a function of both t and n, i.e. it is impossible to create any meaningful finger-search dictionary in the strict implicit model (Theorem 3). By Theorems 1 and 2 the static data structure presented in Section 2 is optimal w.r.t. the search and change-finger time trade-off, for any function q(t, n) as defined above. In the special case where the restriction q(0, n) < log(n/2) does not hold, [4] provides the optimal trade-off. Finally, in Section 4 we outline a construction for creating a dynamic change-finger implicit dictionary, supporting insert and delete in time O(log n), predecessor and successor in time O(q(t, n)), and change-finger in time O(Z_q(n) log n). Note that by setting q(t, n) = (2/ε) log t, we get a search time of O(log t) and a change-finger time of O(n^{ε/2} log n) = O(n^ε) for any 0 < ε ≤ 1, which is asymptotically optimal in the strict model. It remains an open problem if one can get better bounds in the dynamic case by using O(1) additional words.
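The O(log t) exponential search from a finger in a sorted array, the static baseline mentioned in the introduction, can be sketched as follows (the function name and interface are ours):

```python
from bisect import bisect_left

def finger_search(a, f, k):
    """Locate key k in the sorted list a, starting from finger index f.
    Uses O(log t) comparisons, where t is the rank distance |pos(k) - f|.
    Returns the index of k, or None if k is absent. Illustrative sketch."""
    n = len(a)
    if a[f] == k:
        return f
    step = 1
    if k > a[f]:
        # Double the step until the window [f, f + step] brackets k
        # (or we run off the right end of the array).
        while f + step < n and a[f + step] < k:
            step *= 2
        lo, hi = f + step // 2, min(f + step, n - 1)
    else:
        # Symmetric case: grow the window to the left of the finger.
        while f - step >= 0 and a[f - step] > k:
            step *= 2
        lo, hi = max(f - step, 0), f - step // 2
    # Finish with a binary search inside the bracketing window.
    i = bisect_left(a, k, lo, hi + 1)
    return i if i < n and a[i] == k else None
```

For example, with a = [0, 2, 4, ..., 98], finger_search(a, 10, 40) inspects only a window of size proportional to the rank distance before binary searching inside it.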
2 Static finger search

In this section we present a simple change-finger implicit dictionary, achieving an optimal trade-off between the time for search and change-finger. Given some function q(t, n), as defined in Section 1, we are aiming for a search time of O(q(t, n)). Let Δ = Z_q(n). Note that we are allowed to use O(log n) time searching for elements with rank distance t ≥ Δ from the finger, since q(t, n) = Ω(log n) for t ≥ Δ. Intuitively, we start with a sorted list of elements. We cut the 2Δ+1 elements closest to f (f being in the center) from this list, and swap them with the first 2Δ+1 elements, such that the finger element f is at position Δ+1. The elements that were cut out form the proximity structure P; the rest of the elements are in the overflow structure O (see Figure 1). A search for x is performed by first doing an exponential search for x in the proximity structure, and if x is not found there, by doing binary searches for it in the remaining sorted sequences. The proximity structure consists of the sorted lists XS < S < {f} < L < XL. The list S contains the up to Δ elements smaller than f that are closest to f w.r.t. rank distance. The list L contains the up to Δ elements closest to f, but larger than f.

Fig. 1. Memory layout of the static dictionary: P consists of the sequences XL, S, {f}, L, XS (in that order), and O consists of the sequences l_1, l_2, l_3.

Both S and L are sorted in ascending order. XL contains a possibly empty sorted sequence of elements larger than the elements from L, and XS contains a possibly empty sorted sequence of elements smaller than the elements from S. Here |XL| + |S| = Δ = |L| + |XS|, |S| = min{Δ, rank(f) − 1} and |L| = min{Δ, n − rank(f)}. The overflow structure consists of three sorted sequences l_1, l_2, l_3, each possibly empty, with l_2 < l_1 < {f} < l_3.

To perform a change-finger operation, we first revert the array back to one sorted list, and the index of the new f is found by doing a binary search. Once f is found there are 4 cases to consider, as illustrated in Figure 2. Note that in each case at most 2|P| elements have to be moved. Furthermore the elements can be moved such that at most O(|P|) swaps are needed; in particular cases 2 and 4 can be solved by a constant number of list reversals.

Fig. 2. Cases for the change-finger operation. The left side is the sorted array; the emphasized bar in the array is the 2Δ+1 break point between the proximity structure and the overflow structure. In all cases the horizontally marked segment contains the new finger element and must be moved to the beginning. In the final two cases there are not enough elements around f, so P is padded with what was already there.

For reverting to a sorted array and for doing search, we need to compute the lengths of all sorted sequences. These lengths uniquely determine the case used for construction, and the construction can thus be undone. To find |S| a binary search for the split point between XL and S is done within the first Δ elements of P. This is possible since S < {f} < XL. Similarly |L| and |XS| can be found. The separation between l_2 and l_3 can be found by doing a binary search for f in O, since l_2 < l_1 < {f} < l_3. Finally, if |l_3| < |O|, the separation between l_1 and l_2 can be found by a binary search, comparing candidates against the largest element from l_2, since l_2 < l_1.

When performing the search operation for some key k, we first determine if k < f. If this is the case, an exponential search for k in S is performed. We can detect if we have crossed the boundary to XL, since S < {f} < XL. If the element is found it can be returned. If k > f we do an identical search in L. Otherwise the element is neither located in S nor L, and therefore d(k, f) > Δ. All lengths are reconstructed as above, and the element is searched for using binary search in XL and l_3 if k > f, and otherwise in XS, l_1 and l_2.

Analysis. The change-finger operation first computes the lengths of all lists in O(log n) time. The case used for constructing the current layout is then identified and reversed in O(Δ) time. We locate the new finger by binary search in O(log n) time, and afterwards the O(Δ) elements closest to f are moved to P. We get O(Δ + log n) time for change-finger. For searches there are two cases to consider. If t ≤ Δ, the element will be located by the exponential search in P in O(log t) = O(q(t, n)) time, since by assumption q(t, n) ≥ log t. Otherwise the lengths of the sorted sequences will be recovered in O(log n) time, and a constant number of binary searches will be performed, in O(log n) time total. Since t > Δ implies q(t, n) > log(n/2), we again get a search time of O(q(t, n)).

3 Lower bounds

To prove our lower bounds we use an abstracted version of the strict implicit model. The strict model requires that nothing but the elements and the number of elements are stored between operations, and that during computation elements can only be used for comparisons. With these assumptions a decision tree can be formed for a given n, where nodes correspond to element comparisons and loads, and leaves contain the answers. Note that in the weak model a node could probe a cell containing an integer, giving it a degree of n, which prevents any of our lower bound arguments.

Lemma 1. Let A be an operation on an implicit data structure of length n, running in time τ worst case, that takes any number of keys as arguments. Then there exists a set X_{A,n} of size 2^τ, such that executing A with any arguments will touch only cells from X_{A,n}, no matter the content of the data structure.

Proof.
Before loading any elements from the data structure, A can reach only a single state, which gives rise to a root in a decision tree. When A is in some node s, the next execution step may load some cell in the data structure and transition into another fixed node, or A may compare two previously loaded elements or arguments and, given the result of this comparison, transition into one of two distinct nodes. It follows that the total number of nodes A can enter within its τ steps is Σ_{i=0}^{τ−1} 2^i < 2^τ. Now each node can access at most one cell, so it follows that at most 2^τ different cells can be probed by any execution of A within τ steps. Observe that no matter how many times an operation that takes at most τ time is performed, it will only be able to reach the same set of cells, since the decision tree is the same for all invocations.

Theorem 1. For any change-finger implicit dictionary with a search time of q(t, n) as defined in Section 1, change-finger requires Ω(Z_q(n) + log n) time.

Proof. Let e_1, ..., e_n be a set of elements in sorted order with respect to the keys k_1, ..., k_n. Let t = Z_q(n) − 1. By definition q(t + 1, n) > log(n/2) ≥ q(t, n). Consider the following sequence of operations: for i = 0 ... n/t: change-finger(k_{it}); for j = 0 ... t−1: search(k_{it+j}). Since the rank distance of any query element is at most t from the current finger and q is non-decreasing, each search operation takes time at most q(t, n). By Lemma 1 there exists a set X of size 2^{q(t,n)} such that all queries only touch cells in X. We note that |X| ≤ 2^{q(t,n)} ≤ 2^{log(n/2)} = n/2. Since all n elements were returned by the query set, the change-finger operations must have copied at least n − |X| ≥ n/2 elements into X. We performed n/t change-finger operations, thus on average the change-finger operations must have moved at least t/2 = Ω(Z_q(n)) elements into X.

For the log n term in the lower bound, we consider the sequence of operations change-finger(k_i) followed by search(k_i), for i between 1 and n. Since the rank distance of any search is 0 and q(0, n) < log(n/2) (by assumption), we know from Lemma 1 that there exists a set X_s of size at most 2^{log(n/2)} such that search only touches cells from X_s. Assume that change-finger runs in time c(n); then from Lemma 1 we get a set X_c of size at most 2^{c(n)} such that change-finger only touches cells from X_c. Since every element is returned, the cell initially containing an element must be touched by either change-finger or search at some point, thus |X_c| + |X_s| ≥ n. We see that 2^{c(n)} ≥ |X_c| ≥ n − |X_s| ≥ n − 2^{log(n/2)} = 2^{log(n/2)}, i.e. c(n) ≥ log(n/2).

Theorem 2. For a change-finger implicit dictionary with search time q′(t, n), where q′ is non-decreasing in both t and n, it holds that q′(t, n) ≥ log t.

Proof. Let e_1, ..., e_n be a set of elements with keys k_1, ..., k_n in sorted order. Let t ≤ n be given.
First perform change-finger(k_1), then for i between 1 and t perform search(k_i). From Lemma 1 we know there exists a set X of size at most 2^{q′(t,n)} such that any of the search operations touch only cells from X (since any element searched for has rank distance at most t from the finger). The search operations return t distinct elements, so t ≤ |X| ≤ 2^{q′(t,n)}, and q′(t, n) ≥ log t.

Theorem 3. For any finger-search implicit dictionary, the finger-search operation requires at least g(t, n) ≥ log n time for any rank distance t > 0, where g(t, n) is non-decreasing in both t and n.

Proof. Let e_1, ..., e_n be a set of elements with keys k_1, ..., k_n in sorted order. First perform finger-search(k_1), then perform finger-search(k_i) for i between 1 and n. Now for all queries except the first, the rank distance is t ≤ 1, and by

Lemma 1 there exists a set of memory cells X of size 2^{g(1,n)} such that all these queries only touch cells in X. Since all elements are returned by the queries we have |X| = n, so g(1, n) ≥ log n; since this holds for t = 1, it holds for all t. We can conclude that it is not possible to achieve any form of meaningful finger-search in the strict implicit model.

The static change-finger implicit dictionary from Section 2 is by Theorem 1 optimal within a constant factor with respect to the search to change-finger time trade-off, assuming the running time of change-finger depends only on the size of the structure.

4 A dynamic structure

Fig. 3. Memory layout of the dynamic structure: P consists of the blocks B_1, ..., B_l, where block B_i consists of C_i (size 2^{i+1}) and D_i (size 2^{2^i}), followed by the overflow structure O.

For any function q(t, n), as defined in the introduction, we present a dynamic change-finger implicit dictionary that supports change-finger, search, insert and delete in O(Δ log n), O(q(t, n)), O(log n) and O(log n) time respectively, where Δ = Z_q(n) and n is the number of elements when the operation was started. The data structure consists of two parts: a proximity structure P, which contains the elements near f, and an overflow structure O, which contains elements further from f w.r.t. rank distance. We partition P into several smaller structures B_1, ..., B_l. Elements in B_i are closer to f than elements in B_{i+1}. The overflow structure O is an implicit movable dictionary [2] that supports move-left and move-right as described in Section 1. See Figure 3 for the layout of the data structure. During a change-finger operation the proximity structure is rebuilt such that B_1, ..., B_l correspond to the new finger, and the remaining elements are put in O. The total size of P is 2Δ + 1. The i-th block B_i consists of a counter C_i and an implicit movable dictionary D_i. The counter C_i contains a pair-encoded number c_i, where c_i is the number of elements in D_i smaller than f. The sizes within B_i are |C_i| = 2^{i+1} and |D_i| = 2^{2^i}, except in the final block B_l where they might be smaller (B_l might be empty).
In particular we define

l = min{ l ∈ N | Σ_{i=1}^{l} (2^{i+1} + 2^{2^i}) > 2Δ }.
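Since the block capacities |C_i| + |D_i| grow doubly exponentially, only l = O(log log Δ) blocks are needed before their total capacity exceeds 2Δ. A small sketch of ours (block indices assumed to start at i = 1, matching B_1, ..., B_l above):

```python
def block_sizes(delta):
    """Sizes (|C_i|, |D_i|) of blocks B_1, B_2, ... until their total
    capacity exceeds 2*delta; the last block listed may in practice be
    only partially filled. Illustrative sketch."""
    sizes, total, i = [], 0, 1
    while total <= 2 * delta:
        c, d = 2 ** (i + 1), 2 ** (2 ** i)   # |C_i| = 2^(i+1), |D_i| = 2^(2^i)
        sizes.append((c, d))
        total += c + d
        i += 1
    return sizes  # len(sizes) = l

print(block_sizes(100))  # [(4, 4), (8, 16), (16, 256)]
```

For Δ = 100 three blocks already have total capacity 304 > 2Δ, so l = 3.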

We will maintain the following invariants for the structure:

I.1 ∀ i < j, e_1 ∈ B_i, e_2 ∈ B_j : d(f, e_1) < d(f, e_2)
I.2 ∀ e_1 ∈ B_1 ∪ ... ∪ B_l, e_2 ∈ O : d(f, e_1) ≤ d(f, e_2)
I.3 |P| = 2Δ + 1
I.4 |C_i| ≤ 2^{i+1}
I.5 |D_i| > 0 ⟹ |C_i| = 2^{i+1}
I.6 |D_l| < 2^{2^l} and ∀ i < l : |D_i| = 2^{2^i}
I.7 |D_i| > 0 ⟹ c_i = |{e ∈ D_i | e < f}|

We observe that the above invariants imply:

O.1 ∀ i < l : |B_i| = 2^{i+1} + 2^{2^i} (from I.5 and I.6)
O.2 |B_l| < 2^{l+1} + 2^{2^l} (from I.4 and I.6)
O.3 d(e, f) ≥ 2^{2^{k−1}} ⟹ e ∈ B_j for some j ≥ k (from I.1-I.6)

4.1 Block operations

The following operations operate on a single block and are internal helper functions for the operations described in Section 4.2.

block delete(k, B_i): Removes the element e with key k from the block B_i. This element must be located in B_i. First we scan C_i to find e. If it is not found it must be in D_i, so we delete it from D_i; if e < f we decrement c_i. In the case where e ∈ C_i and D_i is nonempty, an arbitrary element g is deleted from D_i, and if g < f we decrement c_i. We then overwrite e with g, and fix C_i to encode the new number c_i. In the final case where e ∈ C_i and D_i is empty, we overwrite e with the last element from C_i.

block insert(e, B_i): Inserts e into block B_i. If |C_i| < 2^{i+1}, e is inserted into C_i and we return. Else we insert e into D_i; if D_i was empty we set c_i = 0. In either case, if e < f we increment c_i.

block search(k, B_i): Searches for an element e with key k in the block B_i. We scan C_i for e; if it is found we return it. Otherwise, if D_i is nonempty, we perform a search on it to find e and we return it. If the element is not found, nil is returned.

block predecessor(k, B_i): Finds the predecessor element for the key k in B_i. Do a linear scan through C_i and find the element l_1 with the largest key less than k. Afterwards do a predecessor search for the key k on D_i; call the result l_2. Return max(l_1, l_2), or report that no element in B_i has key less than k.
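The block operations can be simulated at a high level as follows. This is an illustrative sketch of ours, not the implicit structure itself: the counter c_i is kept as a plain integer instead of being pair-encoded in C_i, and a sorted Python list stands in for the implicit movable dictionary D_i.

```python
import bisect

class Block:
    """High-level simulation of a single block B_i (not implicit)."""

    def __init__(self, i, finger):
        self.cap_c = 2 ** (i + 1)    # |C_i| <= 2^(i+1)  (invariant I.4)
        self.C = []                  # unsorted buffer, scanned linearly
        self.D = []                  # stand-in for the movable dictionary D_i
        self.c = 0                   # elements of D_i smaller than the finger
        self.f = finger

    def insert(self, e):             # block insert(e, B_i)
        if len(self.C) < self.cap_c:
            self.C.append(e)         # C_i not yet full: e goes into the buffer
        else:
            bisect.insort(self.D, e) # otherwise into D_i, maintaining c_i
            if e < self.f:
                self.c += 1

    def delete(self, k):             # block delete(k, B_i)
        if k in self.C:
            self.C.remove(k)
            if self.D:               # refill C_i from D_i to preserve I.5
                g = self.D.pop()     # an arbitrary element of D_i
                if g < self.f:
                    self.c -= 1
                self.C.append(g)
        else:                        # k must be in D_i
            self.D.remove(k)
            if k < self.f:
                self.c -= 1

    def search(self, k):             # block search(k, B_i)
        if k in self.C:
            return k
        j = bisect.bisect_left(self.D, k)
        return self.D[j] if j < len(self.D) and self.D[j] == k else None
```

In the real structure a scan of C_i costs O(2^i) and each D_i operation costs O(log |D_i|) = O(2^i), so a constant number of operations per block is dominated by the last block.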
4.2 Operations

In order to maintain correct sizes of P and O as the entire structure expands or contracts, a rebalance operation is called at the end of every insert and delete operation. This is an internal operation that does not require I.3 to be valid before invocation.

rebalance(): Balance B_l such that the number of elements in P less than f is as close to the number of elements greater than f as possible. We start by

evaluating Δ = Z_q(n), the new desired proximity size. Let s be the number of elements in B_l less than f, which can be computed as c_l + |{e ∈ C_l | e < f}|. While 2Δ + 1 > |P| we move elements from O to P: we move the predecessor of f from O to B_l if O contains an element smaller than f and either s < |B_l|/2 or O contains no element larger than f; otherwise we move the successor of f from O to B_l. While 2Δ + 1 < |P| we move elements from B_l to O: we move the largest element from B_l to O if s < |B_l|/2, and otherwise we move the smallest element.

change-finger(k): To change the finger of the structure to k, we first insert every element of B_l, ..., B_1 into O. We then remove the element e with key k from O, place it at index 1 as the new f, and finish by performing rebalance.

insert(e): Assume e > f; the case e < f can be handled similarly. Find the first block B_i where e is smaller than the largest element l_i from B_i (which can be found using a predecessor search), or where l_i < f. Now if l_i > f, then for all blocks j ≥ i, block delete the largest element and block insert it into B_{j+1}. In the other case, where l_i < f, for all blocks j ≥ i, block delete the smallest element and block insert it into B_{j+1}. The final element, which does not have a block to go into, is put into O; then we put e into B_i. In the special case where e does not fit in any block, we insert e into O. In all cases we perform rebalance.

delete(k): We perform a block search on all blocks and a search in O to find out which structure the element e with key k is located in. If it is in O we just delete it from O. Otherwise assume k < f (the case k > f can be handled similarly), and assume that e is in B_i; then block delete e from B_i. For each j > i we block delete the predecessor of f in B_j and insert it into B_{j−1} (in the case where there is no predecessor, we block delete the successor of f instead). We also delete the predecessor of f from O and insert it into B_l. The special case where k = f is handled similarly to k < f; we note that after this the predecessor of f will be the new finger element. In all cases we perform a rebalance.
search(k), predecessor(k) and successor(k) all follow the same general pattern. For each block B_i, starting from B_1, we compute the largest and the smallest element in the block. If k is between these two elements we return the result of block search, block predecessor or block successor, respectively, on B_i; otherwise we continue with the next block. In case k is not within the bounds of any block, we return the result of search(k), predecessor(k) or successor(k), respectively, on O.

4.3 Analysis

By the invariants, we see that every C_i and D_i, except the last, have fixed size. Since O is a movable dictionary, it can be moved right or left as this final C_i or D_i expands or contracts. Thus the structure can be maintained in a contiguous memory layout. The correctness of the operations follows from the fact that I.1 and I.2 imply that elements in B_j or O are further away from f than elements from B_i, where i < j. We now argue that search runs in time O(q(t, n)). Let e be the element we are searching for. If e is located in some B_i then at least half

the elements in B_{i−1} will be between f and e by I.1. We know from O.1 that t = d(f, e) ≥ |B_{i−1}|/2 ≥ 2^{2^{i−1}−1}. The time spent searching is O(Σ_{j=1}^{i} log |B_j|) = O(2^i) = O(log t) = O(q(t, n)). If on the other hand e is in O, then by I.3 there are 2Δ + 1 elements in P; of these, at least half are between f and e by I.2, so t ≥ Δ, and the time used for searching is O(log n + Σ_{j=1}^{l} log |B_j|) = O(log n) = O(q(t, n)). The last equality follows by the definition of Z_q. The same arguments work for predecessor and successor.

Before the change-finger operation the number of elements in the proximity structure is, by I.3, 2Δ + 1. During the operation all these elements are inserted into O, and the same number of elements are extracted again by rebalance. Each of these operations is just an insert or delete on a movable dictionary or a block, taking time O(log n). In total we use time O(Δ log n).

Finally, to see that both insert and delete run in O(log n) time, notice that in the proximity structure doing a constant number of queries in every block is asymptotically bounded by the time to do the queries in the last block. This is because their sizes increase doubly exponentially. Since the size of the last block is bounded by n, we can guarantee O(log n) time for doing a constant number of queries on every block (this includes predecessor/successor queries). In the worst case, we need to insert an element in the first block of the proximity structure, bubble elements all the way through the proximity structure, and finally insert an element in the overflow structure. This will take O(log n) time. At this point we might have to rebalance the structure, but this merely requires deleting and inserting a constant number of elements from one structure to the other, since we assumed Z_q(n) and Z_q(n + 1) differ by at most a constant. Deletion works in a similar manner.

References

1. Borodin, A., Fich, F.E., Meyer auf der Heide, F., Upfal, E., Wigderson, A.: A trade-off between search and update time for the implicit dictionary problem. In: Proc. 13th ICALP, LNCS, vol. 226, pp. 50-59. Springer (1986)
2. Brodal, G.S., Kejlberg-Rasmussen, C., Truelsen, J.: A cache-oblivious implicit dictionary with the working set property. In: Proc. 21st ISAAC, Part II, LNCS, vol. 6507, pp. 37-48. Springer (2010)
3. Brodal, G.S.: Finger search trees. In: Mehta, D., Sahni, S. (eds.) Handbook of Data Structures and Applications, chap. 11. CRC Press (2005)
4. Franceschini, G., Grossi, R.: Optimal worst-case operations for implicit cache-oblivious search trees. In: Proc. 8th WADS, LNCS, vol. 2748, pp. 114-126. Springer (2003)
5. Franceschini, G., Grossi, R., Munro, J.I., Pagli, L.: Implicit B-trees: New results for the dictionary problem. In: Proc. 43rd FOCS, pp. 145-154. IEEE (2002)
6. Frederickson, G.N.: Implicit data structures for the dictionary problem. JACM 30(1), 80-94 (1983)
7. Munro, J.I.: An implicit data structure supporting insertion, deletion, and search in O(log^2 n) time. JCSS 33(1), 66-74 (1986)
8. Munro, J.I., Suwanda, H.: Implicit data structures for fast search and update. JCSS 21(2), 236-250 (1980)