Lower Bound Techniques for Data Structures. Mihai Pătrașcu. Committee: Erik Demaine (advisor), Piotr Indyk, Mikkel Thorup.


1 Lower Bound Techniques for Data Structures. Mihai Pătrașcu. Committee: Erik Demaine (advisor), Piotr Indyk, Mikkel Thorup

2 Data Structures. I don't study stacks, queues, and binary search trees! I do study data structure problems (a.k.a. abstract data types). Predecessor search: preprocess T = { n numbers }; pred(q): max { y є T | y < q }. Partial-sums problem: maintain an array A[n] under update(i, Δ): A[i] = Δ; sum(i): return A[0] + … + A[i].
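To pin down the two interfaces, here is a naive linear-time rendering in Python; a baseline sketch only, not from the talk (class names are illustrative):

```python
class PredecessorSearch:
    """pred(q) = max{ y in T | y < q }, by linear scan (baseline)."""
    def __init__(self, T):
        self.T = list(T)
    def pred(self, q):
        cands = [y for y in self.T if y < q]
        return max(cands) if cands else None

class PartialSums:
    """update(i, d): A[i] = d; sum(i): A[0] + ... + A[i] (baseline)."""
    def __init__(self, n):
        self.A = [0] * n
    def update(self, i, delta):
        self.A[i] = delta        # assignment semantics, as on this slide
    def sum(self, i):
        return sum(self.A[:i + 1])
```

The interesting question, of course, is how far below linear time per operation one can provably go.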

3 Motivation? Packet forwarding motivates predecessor search (preprocess T = { n numbers }; pred(q): max { y є T | y < q }). The partial-sums problem (maintain an array A[n] under update(i, Δ): A[i] = Δ; sum(i): return A[0] + … + A[i]) supports both: * list operations (concatenate, split, …) * array operations (index).

4 Binary Search Trees = Upper Bound. "Binary search trees solve predecessor search" => complexity of predecessor ≤ O(lg n)/operation. "Augmented binary search trees solve partial sums" => complexity of partial sums ≤ O(lg n)/operation. My work: are these "≤" upper bounds tight?
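One concrete structure achieving the O(lg n)/operation upper bound for partial sums is the Fenwick (binary indexed) tree; a minimal sketch, not from the talk, using the update(i, Δ): A[i] += Δ variant of the problem (as on slide 13):

```python
class Fenwick:
    """Partial sums in O(lg n) per operation (binary indexed tree)."""
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)   # 1-indexed internally

    def update(self, i, delta):     # A[i] += delta
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)           # jump to the next covering node

    def sum(self, i):               # A[0] + ... + A[i]
        i += 1
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)           # strip the lowest set bit
        return s
```

Both loops touch O(lg n) cells, matching the augmented-BST bound that the rest of the talk proves optimal.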

5 What kind of "lower bound"? Model of computation ≈ real computers: memory words of w > lg n bits (pointers = words); random access to memory; any operation on CPU registers (arithmetic, bitwise, …). Just prove a lower bound on the # of memory accesses: lower bounds you can trust. (Figure: the TM tape vs. a "black box" array Mem[1..S] of w-bit words.)

6 Why Data Structures? Other settings:
* streaming lower bounds: many, but not very "computational" (mostly storage / information theory)
* space-bounded (P vs. L) lower bounds: a few, e.g. Ω(n√lg n), but unnatural questions
* algebraic lower bounds: some; cool, but not real computing (depth-3 circuits with mod-6 gates??)
* NP-completeness lower bounds: one per STOC/FOCS
The gospel: data structure lower bounds: some, and they understand some nontrivial computational phenomena, sitting between efficient algorithms and hard optimization (circuit lower bounds not forthcoming). I want to understand computation.

7 Why Data Structures? Weak as some of the lower bounds may be, it's the area that has gotten farthest towards "understanding computation".

8 History* [Yao, FOCS'78], [Ajtai'88]: predecessor (static). [Fredman, Saks'89]: partial sums, union-find (dynamic). *Omitted: bounds for succinct data structures. Observations: huge influence; the 2nd paper's result is wrong (a better upper bound is now known); no journal version; many claims without proof.

9 History* Static, predecessor: [Yao, FOCS'78], [Ajtai'88], [Bing Xiao, Stanford'92]**, [Miltersen STOC'94], [Miltersen, Nisan, Safra, Wigderson STOC'95], [Beame, Fich STOC'99], [Sen ICALP'01]; (1+ε)-nearest neighbor: [Chakrabarti, Chazelle, Gum, Lvov STOC'99], [Chakrabarti, Regev FOCS'04]. Dynamic: [Fredman, Saks'89] (partial sums, union-find), [Ben-Amram, Galil FOCS'91], [Miltersen, Subramanian, Vitter, Tamassia'93], [Husfeldt, Rauhe, Skyum'96], [Fredman, Henzinger'98] (planar connectivity), [Husfeldt, Rauhe ICALP'98] (nondeterminism), [Alstrup, Husfeldt, Rauhe FOCS'98] (marked ancestor), [Alstrup, Husfeldt, Rauhe SODA'01] (dynamic 2D NN), [Alstrup, Ben-Amram, Rauhe STOC'99] (union-find). Richness lower bounds: [Borodin, Ostrovsky, Rabani STOC'99] (partial match), [Barkol, Rabani STOC'00] (randomized NN), [Jayram, Khot, Kumar, Rabani STOC'03] (partial match), [Liu'04] (deterministic ANN). *Omitted: bounds for succinct data structures.

10 Three Main Ideas (overlaid on the history of slide 9): 1. Epochs; 2. Asymmetric Communication, Rectangles; 3. Round Elimination.


12 Review: Epoch Lower Bounds. The marked-ancestor problem: update = mark/unmark a node [time t_u]; query = # of marked ancestors [time t_q]. Partition the update sequence into epochs, epoch j containing r^j updates (#updates per epoch: r^3, r^2, r^1, r^0; bits written per epoch: t_u·w·r^3, t_u·w·r^2, t_u·w·r, t_u·w). Epochs {0, …, j−1} write O(t_u·w·r^(j−1)) bits. Pick r >> t_u·w: most updates from epoch j are not known outside epoch j, so a random query needs to read a cell from epoch j. Hence t_q = Ω(lg n / lg r) = Ω(lg n / lg(t_u·w)), and max{t_q, t_u} = Ω(lg n / lglg n).

13 Review: Epoch Lower Bounds. "Big Challenges" [Miltersen'99]: prove some ω(lg n / lglg n) bound. Candidate: Ω(lg n) for the partial-sums problem (maintain an array A[n] under update(i, Δ): A[i] += Δ; sum(i): return A[0] + … + A[i]). Also: prove ω(lg n) in the bit-probe model. See also: [Fredman JACM'81] [Fredman JACM'82] [Yao SICOMP'85] [Fredman, Saks STOC'89] [Ben-Amram, Galil FOCS'91] [Hampapuram, Fredman FOCS'93] [Chazelle STOC'95] [Husfeldt, Rauhe, Skyum SWAT'96] [Husfeldt, Rauhe ICALP'98] [Alstrup, Husfeldt, Rauhe FOCS'98]

14 Our contribution. [P., Demaine SODA'04]: Ω(lg n) for partial sums. [P., Demaine STOC'04]: Ω(lg n) for dynamic trees, etc.; * very simple proof; * not based on epochs. [P., Tarniță ICALP'05] (Best Student Paper): Ω(lg n) via an epoch argument!! => Ω(lg²n / lg²lg n) in the bit-probe model.

15 Ω(lg n) via Epoch Arguments? Old: information about epoch j outside epoch j ≤ #cells written by epochs {0, …, j−1} ≤ O(t_u·r^(j−1)).

16 Ω(lg n) via Epoch Arguments? New: information about epoch j outside epoch j ≤ #cells read by epochs {0, …, j−1} from epoch j, which is still ≤ O(t_u·r^(j−1)) in the worst case. Foil the worst case by randomizing the epoch construction!

17 Ω(lg n) via Epoch Arguments? Foil the worst case by randomizing the epoch construction: #cells read by epochs {0, …, j−1} from epoch j ≤ O((t_u / #epochs)·r^(j−1)) on average => max{t_u, t_q} = Ω(lg n).

18 The “Very Simple Ω(lg n) Proof”

19 The hard instance: π = a random permutation; for t = 1 to n: query sum(π(t)); let Δ_t = rand(); update(π(t), Δ_t). (Figure: the timeline of Δ_1, …, Δ_16 laid out by π.) Recall the problem: maintain an array A[n] under update(i, Δ): A[i] = Δ; sum(i): return A[0] + … + A[i].
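The hard instance is easy to generate directly; a sketch, not from the talk, treating rand() as a fresh random word (the function name and the 64-bit word size are illustrative):

```python
import random

def hard_instance(n, seed=0):
    """Interleaved sum/update sequence over a random permutation pi.

    At each step t we first query sum(pi(t)), then overwrite position
    pi(t) with a fresh random word, as described on this slide.
    """
    rng = random.Random(seed)
    pi = list(range(n))
    rng.shuffle(pi)
    ops = []
    for t in range(n):
        ops.append(("sum", pi[t]))
        ops.append(("update", pi[t], rng.getrandbits(64)))
    return ops
```

Every position receives exactly one update, and each query immediately precedes the update to the same position.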

20 (Figure: the timeline of Δ's, with t = 5, …, 8 and t = 9, …, 12 highlighted.) Communication ≈ # memory locations written during t = 5, …, 8 and read during t = 9, …, 12. How can the Mac (running t = 5, …, 8) help the PC (running t = 9, …, 12)?

21 How much information needs to be transferred? (Figure: the later queries return the prefix sums Δ_1+Δ_5+Δ_3, then Δ_1+Δ_5+Δ_3+Δ_7+Δ_2, then Δ_1+Δ_5+Δ_3+Δ_7+Δ_2+Δ_8+Δ_4.) At least Δ_5, Δ_5+Δ_7, Δ_5+Δ_7+Δ_8 => i.e., at least 3 words (random values are incompressible).

22 The general principle: the lower bound = the # of down arrows (writes in one period read in the next). How many down arrows, in expectation, over k operations? (2k − 1) · Pr[write lands in the first half] · Pr[read lands in the second half] = (2k − 1) · ½ · ½ = Ω(k).

23 Recap. Communication between periods of k items = # memory locations written during the yellow period and read during the pink period = Ω(k).

24 Putting it all together: sum the bound over a balanced hierarchy of periods on the timeline: Ω(n/2) at the root, Ω(n/4) per node at the next level, then Ω(n/8), …; each of the lg n levels totals Ω(n/2), so the total is Ω(n lg n). Every load instruction is counted exactly once, at lowest_common_ancestor(write time, read time).

25 Q.E.D. Augmented binary search trees are optimal. First “Ω(lg n)” for any dynamic data structure.

26 Three Main Ideas (overlaid on the history of slide 9): 1. Epochs; 2. Asymmetric Communication, Rectangles; 3. Round Elimination. Up next: 2. Asymmetric Communication, Rectangles.

27 Review: Communication Complexity

28 (Figure: CPU ↔ memory as Alice ↔ Bob: the CPU sends lg S-bit addresses, the memory replies with w-bit words; the database => space S.) Traditional communication complexity proves "total #bits communicated ≥ X" => t_q·(lg S + w) ≥ X => t_q = Ω(X/w). But wait! X ≤ the CPU's input size ≤ O(w) for a query(a, b, c), so this approach proves nothing.

29 Review: Communication Complexity. (Same setup: per round, the CPU sends lg S bits, the memory replies with w bits.) Asymmetric communication complexity: "either Alice sends A bits or Bob sends B bits" => either t_q·lg S ≥ A or t_q·w ≥ B => t_q ≥ min{ A/lg S, B/w }.

30 Richness Lower Bounds. To prove "either Alice sends A bits or Bob sends B bits": assume Alice sends o(A) and Bob sends o(B); then there is a big monochromatic rectangle of output 1 (containing a 1/2^o(A) fraction of Alice's inputs and a 1/2^o(B) fraction of Bob's); then show that any big rectangle must be bichromatic (a standard idea in communication complexity). Example: Alice gets q є {0,1}^d; Bob gets a set S of n points in {0,1}^d; goal: find argmin_{xєS} ||x − q||_2. [Barkol, Rabani]: A = Ω(d), B = Ω(n^(1−ε)) => t_q ≥ min{ d/lg S, n^(1−ε)/w }.

31 Richness Lower Bounds: what does this really mean? The example ([Barkol, Rabani]: A = Ω(d), B = Ω(n^(1−ε)) => t_q ≥ min{ d/lg S, n^(1−ε)/w }) is an "optimal space lower bound for constant query time": the Θ(d/lg n) lower bound meets the upper bound S = 2^Ω(d/t_q). ≈ either exponential space, or near-linear query time. (Figure: t_q vs. S, with S ranging from Θ(n) to 2^Θ(d) and t_q from 1 to n^(1−o(1)).) Also: an optimal lower bound for decision trees.

32 Results. Partial match (database of n strings in {0,1}^d, query є {0,1,*}^d): [Borodin, Ostrovsky, Rabani STOC'99], [Jayram, Khot, Kumar, Rabani STOC'03]: A = Ω(d/lg n); [P. FOCS'08]: A = Ω(d) (simplifying earlier proofs). Nearest neighbor on the hypercube (ℓ_1, ℓ_2): deterministic γ-approximate: [Liu'04]: A = Ω(d/γ²); randomized exact: [Barkol, Rabani STOC'00]: A = Ω(d); randomized (1+ε)-approximate: [Andoni, Indyk, P. FOCS'06]: A = Ω(ε^(−2)·lg n): "Johnson-Lindenstrauss space is optimal!" Approximate nearest neighbor in ℓ_∞: [Andoni, Croitoru, P. FOCS'08]: "[Indyk FOCS'98] is optimal!"

33 Limits of the Communication Approach. No separation between S = O(n) and S = n^O(1): "Alice must send Ω(A) bits" => t_q = Ω(A/lg S), and lg S changes only by a constant factor between linear and polynomial space. Via branching programs => t_q = Ω(A/lg(Sd/n)): a separation of Ω(lg n / lglg n) between S = O(n) and S = n^O(1)! The implication of richness lower bounds had been undervalued. (Figure: the t_q vs. S trade-off, now Θ(d/lg d) at space Θ(n) vs. Θ(d/lg n) at polynomial space.)

34 Richness Gets You More. CPU -> memory communication: one query sends lg S bits per probe; k queries batched send lg(S choose k) = Θ(k·lg(S/k)) bits per probe round.

35 (Figure: the k queries address k independent subproblems Prob.1, …, Prob.k in the same memory; again lg(S choose k) = Θ(k·lg(S/k)) bits per round.)

36 Richness Gets You More (Direct Sum). Any richness lower bound "Alice must send A or Bob must send B" ===> for k independent instances, k·Alice must send k·A or k·Bob must send k·B.

37 Richness Gets You More (Direct Sum). Any richness lower bound "Alice must send A or Bob must send B" ===> k·Alice must send k·A or k·Bob must send k·B, which yields t_q = Ω(A / lg(S/k)).

38 Three Main Ideas, now four (overlaid on the history of slide 9): 1. Epochs; 2. Asymmetric Communication, Rectangles; 3. Round Elimination; 4. Range Queries.

39 Open Hunting Season. Nice trick, but "Ω(lg n / lglg n) with O(n polylg n) space" is not an impressive argument for the "curse of dimensionality". But space n^(1+o(1)) is hugely important in data structures => open hunting season for range queries etc. Example, 2D range counting: SELECT count(*) FROM employees WHERE salary <= AND startdate <= D.

40 Open Hunting Season. 2D range counting (SELECT count(*) FROM employees WHERE salary <= AND startdate <= D): [P. STOC'07]: Ω(lg n / lglg n) with O(n polylg n) space. N.B.: tight! The 1st bound beyond the semigroup model; a question from [Fredman JACM'82], [Chazelle FOCS'86].

41 The Power of Reductions. 2D stabbing: preprocess S = {n rectangles}; stab(x, y): is (x, y) inside some R є S? Applications: routing ACLs, and method dispatching in some OO languages. It reduces to 2D range counting (SELECT count(*) FROM employees WHERE salary <= AND startdate <= D).

42 The Power of Reductions. (Figure: the reduction from 2D stabbing to 2D range counting, each rectangle contributing +1.)

43 The Power of Reductions. Reachability oracles in the butterfly graph (preprocess G = a subgraph of the butterfly; reachable(x, y): is there a path x -> y?) reduce to 2D stabbing (preprocess S = {n rectangles}; stab(x, y): is (x, y) inside some R є S?).


45 The Power of Reductions. Lopsided Set Disjointness (LSD): Alice has a set S, Bob has a set T; "are S and T disjoint?" LSD reduces to reachability oracles in the butterfly graph (preprocess G = a subgraph of the butterfly; reachable(x, y): is there a path x -> y?). Hint: S = {one edge out of every node} => n queries from the 1st to the last level; T = {the deleted edges}; S disjoint from T => all queries answer "yes".

46 Reachability in the Butterfly?? It looks just like the marked-ancestor problem: update(node): (un)mark the node; query(leaf): any marked ancestor?
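The marked-ancestor ADT itself is tiny; a naive O(depth)-per-query sketch, not from the talk (names are illustrative), for the boolean query variant on this slide:

```python
class MarkedAncestor:
    """Marked-ancestor ADT on an explicit parent-pointer tree.

    Naive baseline: update is O(1), query walks to the root. The talk's
    point (slide 12) is that no structure beats Ω(lg n / lglg n) here.
    """
    def __init__(self, parent):      # parent[v] = parent of v; root has -1
        self.parent = parent
        self.marked = set()

    def update(self, v):             # (un)mark node v
        self.marked ^= {v}

    def query(self, v):              # any marked strict ancestor of v?
        v = self.parent[v]
        while v != -1:
            if v in self.marked:
                return True
            v = self.parent[v]
        return False
```

Counting marked ancestors instead of testing for one (as on slide 12) is the same walk with a counter.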

47 [P. FOCS'08]: one source to rule them all. Reductions from lopsided set disjointness (LSD) yield lower bounds for: dyn. marked ancestor, reachability oracles in the butterfly, partial match, (1+ε)-ANN in ℓ_1/ℓ_2, NN in ℓ_1/ℓ_2, 3-ANN in ℓ_∞, worst-case union-find, dyn. trees and graphs, dyn. 1D stabbing, partial sums, 2D stabbing, 4D reporting, 2D counting, dyn. NN in 2D, dyn. 2D reporting.

48 Three Main Ideas, now four (overlaid on the history of slide 9): 1. Epochs; 2. Asymmetric Communication, Rectangles; 3. Round Elimination; 4. Range Queries. Up next: 3. Round Elimination.

49 Packet Forwarding / Predecessor Search. Preprocess n prefixes of ≤ w bits: make a hash table H with all prefixes of the prefixes; |H| = O(n·w), which can be reduced to O(n). Given a w-bit IP, find the longest matching prefix: binary search for the longest ℓ such that IP[0:ℓ] є H; O(lg w) time. [van Emde Boas FOCS'75] [Waldvogel, Varghese, Turner, Plattner SIGCOMM'97] [Degermark, Brodnik, Carlsson, Pink SIGCOMM'97] [Afek, Bremler-Barr, Har-Peled SIGCOMM'99]
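A sketch of this scheme in Python over bit strings (function names are illustrative; as a simplification, each hash entry stores the best matching real prefix, standing in for the marker machinery used in practice):

```python
def build_table(prefixes):
    """Hash table H over all prefixes of the given bit-string prefixes.

    Each entry remembers the longest *stored* prefix of its key, so a
    query can return its answer directly. |H| = O(n*w).
    """
    stored = set(prefixes)
    H = {}
    for p in prefixes:
        best = ""
        for l in range(1, len(p) + 1):
            key = p[:l]
            if key in stored:
                best = key           # longest stored prefix of this key
            H[key] = best
    return H

def longest_match(H, ip):
    """Binary search on prefix length: O(lg w) hash probes.

    Membership of ip[:l] in H is monotone in l, because H contains
    every prefix of every stored prefix.
    """
    lo, hi, best = 1, len(ip), 0
    while lo <= hi:
        mid = (lo + hi) // 2
        if ip[:mid] in H:
            best, lo = mid, mid + 1
        else:
            hi = mid - 1
    return H[ip[:best]] if best else ""
```

The binary search is exactly the O(lg w) predecessor-style search of [van Emde Boas FOCS'75], done with hashing instead of a tree.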

50 Review: Round Elimination. (Figure: binary search with hi/lo halves; Alice sends hash(hi); the 0/1 reply means: 0 = continue searching for pred(hi), 1 = continue searching for pred(lo).) Alice's input splits into k pieces, but her o(k)-bit message ("I want to talk to Alice_i", i є {1, …, k}) has negligible information about the typical i => the message can be eliminated for a fixed i.

51 The Lemma. Observe: it can't work worst-case! The traditional fix: introduce 2-sided error. Think outside the box: an easy proof with a different error model [P., Thorup STOC'06].

52 The Model. Alice and Bob receive inputs; they may reject inputs; if they accept, they start communicating and must produce a correct output. The point: error probability ½ is trivial, but reject probability is still hard. "We regret to inform you that your input has not been accepted for communication. We receive a large number of inputs, many of them of high quality, and scheduling constraints unfortunately make it impossible to accept all of them."

53 The Proof. Build a trie over Alice's input (x_1, …, x_k): leaves = the message sent; each node = the set of messages in its subtree. Say the message size is m = k/2: |leaf| = 1, |root| ≤ 2^(k/2) => on every root-to-leaf path, half the nodes have |node| ≥ ½|parent|. Averaging over node-child pairs => there exists a node where half of its children have |child| > ½|node|; thus there exists a message M with M є child for ¼ of the children. Fix i, x_1, …, x_(i−1), and the fixed message M (eliminated), rejecting ¾ of the inputs x_i. (Figure: a trie with leaf messages m_1, m_2, m_3 and internal sets {m_1, m_2}, {m_1, m_2, m_3}.)

54 Predecessor Search: Timeline. After [van Emde Boas FOCS'75]: … O(lg w) has to be tight! [Beame, Fich STOC'99]: a slightly better bound, but with O(n²) space: … one must improve the algorithm for O(n) space! [P., Thorup STOC'06]: a tight Ω(lg w) for space O(n polylg n)! Idea: * consider multiple queries; * prove round elimination under direct sum.

55 Predecessor Search: Timeline. Idea: * consider multiple queries; * prove round elimination under direct sum. (Figure: k parallel searches; "I want to talk to Alice_1", "I want to talk to Alice_2", …)

56 Round Eliminated !

57 The End … or Champagne? Questions?


59 The Partial Sums Problem. Maintain an array A[n] under update(i, Δ): A[i] += Δ; sum(i): return A[0] + … + A[i]. Textbook solution: "augmented" binary search trees; running time O(lg n) per operation.

