1 Cell Probe Complexity - An Invitation Peter Bro Miltersen University of Aarhus

2 The “Were-you-last?” Game m players wait in separate cubicles. One by one, they are taken to a game room. When a player leaves the game room, he is asked if he was the last of the m players to go there and must answer correctly.

3 Inside the Game Room

4 Inside the Game Room, II The player can open at most t boxes. He can put a pebble in an empty box or remove the pebble from a full box. He must close a box before opening the next one.

5 Winning the Game The players win if all players answer correctly. How small can t be as a function of m to ensure that the players have a winning strategy?

6 A Counter The players make boxes 0,1,..,t-1 represent the number of players having already been in the room. The counter can be incremented and read by opening at most t boxes.
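
A minimal sketch of this strategy, assuming the natural reading that the boxes hold the visitor count in binary (so t ≈ log₂ m boxes suffice); the class and function names are illustrative, not from the talk:

```python
class GameRoom:
    """t boxes, all initially empty (0). Opening a box is the only way to
    read or change it; we count openings to measure the cost per player."""
    def __init__(self, t):
        self.boxes = [0] * t
        self.openings = 0

    def open_box(self, i):        # returns the current content of box i
        self.openings += 1
        return self.boxes[i]

    def write_box(self, i, bit):  # change the content while the box is open
        self.boxes[i] = bit

def player_turn(room, m, t):
    """One player's visit: increment the binary counter and report whether
    the new value equals m. Opens each of the t boxes at most once."""
    carry = 1
    value_so_far = 0
    for i in range(t):            # least significant box first
        bit = room.open_box(i)
        new_bit = bit ^ carry
        carry = bit & carry
        room.write_box(i, new_bit)
        value_so_far |= new_bit << i
    return value_so_far == m      # "was I the last player?"

m = 37
t = m.bit_length()                # enough boxes to count up to m
room = GameRoom(t)
answers = [player_turn(room, m, t) for _ in range(m)]
assert answers == [False] * (m - 1) + [True]
assert room.openings <= m * t     # each player opened at most t boxes
```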

7 Can we do much better?

8 YES! t = 5 log log m.

9 Alternative Counter I The counter value is represented by a short vector such as (1,5; 1,1,3,5,2), recording the type of the first block, the number of blocks, and the size of each block.

10 Alternative Counter, II Each entry in (1,5; 1,1,3,5,2) can be specified using ≈ log log m bits. The counter can be incremented touching only 4 entries, i.e., ≈ 4 log log m bits.

11 Problem: How can a player tell if the value m has been reached?

12 Alternative Counter, III Test for 0: the vector should start (0,1; …). Start the counter at m; each player decrements the counter.

13 Problem: The players can’t make the counter start at m – initially, all boxes must be empty!

14 Alternative Counter, IV The players would have liked some boxes to contain pebbles initially. Patch: they follow their protocol, but maintain every box in the opposite state of what it is supposed to be in, and interpret its contents accordingly. Thus, the players win!

15 Can we do much better?

16 NO! If t = 0.4 log log m, the players lose.

17 Decision Assignment Tree [Diagram: a player's strategy as a tree. Each node names a box to open (e.g., box 7: "if it contains a pebble, remove it and go here"), the two branches correspond to the pebble/no-pebble outcomes, and each edge is labelled with what was read and what was written back, e.g. 1/0, 0/1, 0/0.]
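
For concreteness, a minimal sketch of how such a tree could be represented and executed; the node fields and the example strategy are illustrative, not taken from the slide's figure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """One probe of a decision assignment tree: open box `box`, write
    `write_if_0` / `write_if_1` depending on what was found, then follow
    the corresponding child. A leaf (box=None) carries the final answer."""
    box: Optional[int] = None
    write_if_0: int = 0
    write_if_1: int = 1
    if_0: Optional["Node"] = None
    if_1: Optional["Node"] = None
    answer: Optional[bool] = None

def run(tree: Node, memory: list) -> bool:
    """Execute one operation; the number of boxes opened is the depth used."""
    node = tree
    while node.box is not None:
        if memory[node.box] == 0:
            memory[node.box] = node.write_if_0
            node = node.if_0
        else:
            memory[node.box] = node.write_if_1
            node = node.if_1
    return node.answer

# Example: "open box 7; if it contains a pebble, remove it and answer yes,
# otherwise put a pebble in it and answer no".
strategy = Node(box=7,
                write_if_1=0, if_1=Node(answer=True),
                write_if_0=1, if_0=Node(answer=False))
memory = [0] * 8
print(run(strategy, memory), memory[7])   # False 1
print(run(strategy, memory), memory[7])   # True 0
```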

18 Sunflower A family of sets S_1, …, S_k such that the intersection S_i ∩ S_j is the same for all i ≠ j; this common intersection is the sunflower's center (and the sets minus the center are its petals).

19 Erdős–Rado Sunflower Lemma Every large family of small sets contains a large sunflower.
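
The quantitative form of the lemma (standard statement, not spelled out in the transcript):

$$\textbf{Sunflower Lemma (Erdős–Rado).}\ \text{If every set in } \mathcal{F} \text{ has size at most } s \text{ and } |\mathcal{F}| > s!\,(k-1)^{s},\ \text{then } \mathcal{F} \text{ contains a sunflower with } k \text{ petals.}$$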

20 Why the players lose Consider, for each player, the set of boxes appearing in his decision assignment tree. By the sunflower lemma, a large subfamily of these sets forms a big sunflower; the other sets are set aside.

21 Why the players lose, II Sequence: send in the remaining players first, then the sunflower (petal) players one by one. Center states: record the state of the sunflower's center after each petal player. Pigeonhole principle: for some k < l, the center is in the same state after the k-th and the l-th petal player. New sequence: postpone petal players k+1,…,l; since the petals are disjoint, the later petal players see exactly the same box contents as before, so the final one again answers "yes" even though not everyone has been in the room.

22 The Game = A Dynamic Problem Maintain a subset S of {1,..,m} under Insert (x): Insert x into S. Full (): Is S={1,..,m}?

23 The Room = Cell Probe Model Boxes = memory cells. Pebble or no pebble = 0 or 1 (word size w = 1). Opening a box = accessing a memory cell. Decision assignment tree for a player = implementation of an operation.

24 The Cell Probe Model An information-theoretic model for solutions to dynamic data structure problems. Only memory accesses are counted. Each operation in the solution is assigned a decision assignment tree; the worst case complexity is the depth of the tree.

25 Why study cell probe complexity? Lower bounds in the cell probe model are valid for a unit cost random access machine with the same word size, independent of the instruction set. Fundamental combinatorial complexity measure.

26 This invitation to cell probe complexity Focuses on worst case time per operation (rather than amortized bounds and/or tradeoffs); focuses on dynamic problems (rather than static ones); focuses first on w = 1, i.e., on bit probe complexity.

27 Why Study Bit Probe Complexity? Log cost RAM. Large lower bounds would still be large if divided by w. Coding theoretic interpretation: locally decodable source codes. All known results have easy proofs.

28 Dynamic Graph Problems Dynamic Graph Connectivity: Maintain G =({1,..,n},E) under Insert (e): Insert e into E. Delete (e): Delete e from E. Connected (u,v): Are u and v connected?
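
As a baseline for the bounds quoted on the next slide, here is a deliberately naive sketch of the interface (adjacency sets, Connected answered by breadth-first search); it is not one of the algorithms cited below:

```python
from collections import deque

class NaiveDynamicConnectivity:
    """Trivial baseline: Insert/Delete in O(1), Connected by BFS in O(n + |E|).
    The bounds on the next slide concern far better bit-probe costs."""
    def __init__(self, n):
        self.adj = {v: set() for v in range(1, n + 1)}

    def insert(self, e):
        u, v = e
        self.adj[u].add(v)
        self.adj[v].add(u)

    def delete(self, e):
        u, v = e
        self.adj[u].discard(v)
        self.adj[v].discard(u)

    def connected(self, u, v):
        seen, queue = {u}, deque([u])
        while queue:
            x = queue.popleft()
            if x == v:
                return True
            for y in self.adj[x] - seen:
                seen.add(y)
                queue.append(y)
        return False

g = NaiveDynamicConnectivity(5)
g.insert((1, 2)); g.insert((2, 3))
assert g.connected(1, 3) and not g.connected(1, 4)
g.delete((2, 3))
assert not g.connected(1, 3)
```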

29 DGC, best known bounds Upper bounds: worst case … bit probes per operation [Henzinger and King]; amortized … bit probes per operation [Holm, de Lichtenberg, Thorup]. Lower bound: Ω(log n/log log n) bit probes per operation [Fredman].

30 A lower bound for dynamic connectivity [Diagram: a starting memory State 0 = (0,0,0,…,0), and States 1, 2, 3, 4, … reachable from it.]

31 Lower Bound For DGC, II States 1,2,..,n all have Hamming weight at most t. We can distinguish between the states using a decision tree of depth d = O(t log n).

32 Lower Bound For DGC, III [Diagram: the decision tree's path for one state.] The answer sequence 001… is unique to State 3 and contains at most t ones.

33 Lower bound for DGC, IV n ≤ #{ways to arrange at most t ones in a sequence of length at most d}. Hence t = Ω(log n/log log n).
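
Spelled out, with the constants suppressed, the counting goes roughly as follows:

$$ n \;\le\; \sum_{\ell \le d}\ \sum_{i \le t} \binom{\ell}{i} \;\le\; (d+1)^{\,t+2}, \qquad d = O(t \log n). $$

Taking logarithms, $\log n = O\big(t \log(t\log n)\big) = O\big(t(\log t + \log\log n)\big)$, and since we may assume $t \le \log n$, this forces $t = \Omega(\log n/\log\log n)$.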

34 Dynamic Circuit Value Maintain a Boolean circuit C under Insert (g,h): Insert a wire from the output of gate g to an input of gate h. Switch (g): Switch gate g between AND and OR. Evaluate (): Return the value of the circuit.

35 Dynamic Circuit Value Best known algorithm for DCV: Reevaluate from scratch. Best known lower bound in bit probe model: Ω(log n) bits must be touched in some operation.
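
A minimal sketch of the "reevaluate from scratch" algorithm mentioned above, assuming an acyclic circuit whose gates are listed in topological order; the representation is illustrative:

```python
class DynamicCircuit:
    """Gates are AND/OR over their in-wires; inputs are gates with a fixed value.
    Evaluate() recomputes everything, so its cost is linear in the circuit size --
    the trivial upper bound the slide refers to."""
    def __init__(self, gates):
        # gates: list of ("INPUT", value) or ("AND"/"OR", []) in topological order
        self.kind = [k for k, _ in gates]
        self.value_or_inputs = [v for _, v in gates]

    def insert(self, g, h):            # wire from output of gate g to an input of h
        self.value_or_inputs[h].append(g)

    def switch(self, g):               # toggle gate g between AND and OR
        self.kind[g] = "OR" if self.kind[g] == "AND" else "AND"

    def evaluate(self):                # value of the last gate
        val = [False] * len(self.kind)
        for i, kind in enumerate(self.kind):
            if kind == "INPUT":
                val[i] = self.value_or_inputs[i]
            else:
                ins = [val[j] for j in self.value_or_inputs[i]]
                val[i] = all(ins) if kind == "AND" else any(ins)
        return val[-1]

c = DynamicCircuit([("INPUT", True), ("INPUT", False), ("AND", [])])
c.insert(0, 2); c.insert(1, 2)
assert not c.evaluate()   # AND(True, False) = False
c.switch(2)
assert c.evaluate()       # OR(True, False) = True
```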

36 Dynamic Language Membership Problems Given a Boolean language L, maintain a Boolean string x of length n under Change (i,a): Set the ith bit of x to a. Member (): Is x a member of L? Example: dynamic graph connectivity corresponds to the dynamic language membership problem for UGAP.

37 Completeness of DCV Dynamic circuit value needs Ω(log n) bit probes per operation if the dynamic language membership problem for some language in P needs Ω(log n) bit probes per operation.

38 Element distinctness Element distinctness language: ED = binary strings x consisting of n blocks, each of length 2 log n, all different. Dynamic ED needs Ω(log n) bit probes per operation.

39 Lower bound for dynamic ED By fixing all blocks except the first, we create a solution to a subproblem. #{decision assignment tree systems of depth t} ≥ #{subproblems}, so dynamic ED needs Ω(log n) bit probes per operation.
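
One way the counting can come out; the bound on the number of depth-t systems below is a rough estimate assumed for this sketch (blocks have length 2 log n, so there are n² possible block values, and the n−1 fixed blocks take distinct values):

$$ \#\{\text{subproblems}\} \;\ge\; \binom{n^2}{\,n-1\,} \;\ge\; n^{\,n-1} \;=\; 2^{\Omega(n\log n)}, \qquad \#\{\text{depth-}t\text{ systems}\} \;\le\; 2^{\,2^{O(t)}\operatorname{polylog} n}. $$

Since the first quantity is at most the second, $2^{O(t)}\operatorname{polylog} n \ge \Omega(n\log n)$, and hence $t = \Omega(\log n)$.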

40 A sad situation We don't know any lower bounds for DCV better than Ω(log n) bit probes. Thus we don't know any lower bound for any dynamic language membership problem in P better than Ω(log n) bit probes. We suspect many problems in P need Ω(n) bit probes.

41 The transdichotomous model Problem: a problem over a universe of size m, with problem instances of size n (i.e., containing n elements of the universe). Cell probe solution: word size w = log m. The time bound should be a function of n only.

42 Dynamic Search Problems Maintain a subset S of {1,..,m} under Insert (x): Insert x into S. Delete (x): Delete x from S. and queries like Member (x): Is x in S? Predecessor (x): What is max{y | y ≤ x, y in S}? HammingNeighbor (x): Return Hamming neighbor of x in S.
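
A minimal sketch of the interface behind a deliberately simple comparison-based baseline (a sorted list), taking HammingNeighbor to mean the element of S closest to x in Hamming distance; the transdichotomous structures on the next slide do far better for Member and Predecessor:

```python
import bisect

class SortedSetBaseline:
    """Comparison-based baseline for the dynamic search operations.
    Member/Predecessor take O(log n) comparisons; Insert/Delete O(n) moves."""
    def __init__(self):
        self.keys = []

    def insert(self, x):
        i = bisect.bisect_left(self.keys, x)
        if i == len(self.keys) or self.keys[i] != x:
            self.keys.insert(i, x)

    def delete(self, x):
        i = bisect.bisect_left(self.keys, x)
        if i < len(self.keys) and self.keys[i] == x:
            self.keys.pop(i)

    def member(self, x):
        i = bisect.bisect_left(self.keys, x)
        return i < len(self.keys) and self.keys[i] == x

    def predecessor(self, x):
        """max{y in S | y <= x}, or None if no such y exists."""
        i = bisect.bisect_right(self.keys, x)
        return self.keys[i - 1] if i > 0 else None

    def hamming_neighbor(self, x):
        """Element of S closest to x in Hamming distance: the trivial O(n) scan."""
        return min(self.keys, key=lambda y: bin(x ^ y).count("1"), default=None)

s = SortedSetBaseline()
for x in (5, 9, 12):
    s.insert(x)
assert s.member(9) and not s.member(7)
assert s.predecessor(11) == 9 and s.predecessor(4) is None
assert s.hamming_neighbor(14) == 12   # 14 = 0b1110 is closest to 12 = 0b1100
```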

43 Transdichotomous upper bounds Dynamic member and predecessor: AVL trees: O(log n) cell probes per operation; Andersson/Thorup'99: O(√(log n/log log n)) cell probes per operation. Dynamic Hamming neighbor: trivial bound: O(n) cell probes per operation.

44 Dynamic algebraic problems Dynamic polynomial multiplication: Maintain two polynomials f,g of degree at most n over GF(m) under Change-f (i,a): Let ith coefficient of f be a. Change-g (i,a): Let ith coefficient of g be a. Query (j): What is the jth coefficient of fg?
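
A minimal sketch of the problem together with the obvious baseline (store f and g; Change is O(1) and Query(j) is an O(n) convolution sum); m is taken to be prime here purely for simplicity:

```python
class DynamicPolyMult:
    """Maintain f, g of degree <= n over GF(m) (m prime here);
    query(j) returns the j-th coefficient of the product f*g."""
    def __init__(self, n, m):
        self.n, self.m = n, m
        self.f = [0] * (n + 1)
        self.g = [0] * (n + 1)

    def change_f(self, i, a):
        self.f[i] = a % self.m

    def change_g(self, i, a):
        self.g[i] = a % self.m

    def query(self, j):
        # (fg)_j = sum_i f_i * g_{j-i}; naive O(n) work per query.
        lo, hi = max(0, j - self.n), min(j, self.n)
        return sum(self.f[i] * self.g[j - i] for i in range(lo, hi + 1)) % self.m

# (1 + 2x) * (3 + x) = 3 + 7x + 2x^2 over GF(11)
p = DynamicPolyMult(n=2, m=11)
p.change_f(0, 1); p.change_f(1, 2)
p.change_g(0, 3); p.change_g(1, 1)
assert [p.query(j) for j in range(3)] == [3, 7, 2]
```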

45 Transdichotomous upper bound Dynamic polynomial multiplication: O(√(n log n)) cell probes per operation [Reif and Tate].

46 Transdichotomous lower bounds A lower bound of Ω(f(n)) means: there is no transdichotomous upper bound of o(f(n)), i.e., there are constants c and n₀ so that for every n ≥ n₀, some choice of universe size m forces any solution with parameters n, m to use at least c·f(n) cell probes on some operation.

47 Transdichotomous lower bounds, II We are free to choose the size of the universe as large as we want. This makes large lower bounds possible using current techniques. It also makes the lower bounds somewhat less interesting.

48 Transdichotomous lower bounds, III Dynamic predecessor: Ω(√(log n/log log n)) cell probes per operation are necessary [Beame and Fich]. Dynamic Hamming neighbor: n^Ω(1) cell probes per operation are necessary [Barkol and Rabani]. Dynamic polynomial multiplication: Ω(√n) cell probes per operation are necessary [Frandsen, Hansen, Miltersen]. Dynamic graph connectivity: Ω(log n/log log n) cell probes per operation are necessary [Fredman and Saks].

49 Two main techniques Communication complexity method [Ajtai] Time stamp method [Fredman and Saks]

50 Communication Complexity Technique Given a dynamic problem, construct a two-party communication problem: Alice gets a state reachable within d operations from the initial state; Bob gets a query operation. An upper bound for the dynamic problem implies an upper bound for the communication problem. Hence, a lower bound for the communication problem implies a lower bound for the dynamic problem.

51 Transferring Upper Bounds Alice constructs the data structure corresponding to her state. Alice sends Bob a perfect hash function of the memory cells that have changed since the initial state. Bob simulates the query operation by asking Alice for the values of memory cells: he sends her a hashed address, and Alice sends back an address with this hash value that was changed, together with its new value.
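
A rough accounting of the resulting communication, assuming the perfect hash function maps the at most d·t changed cells into a range of size O(dt), that addresses fit in O(w) bits, and that Bob knows the fixed initial memory contents; ignoring Alice's first message (the description of the hash function), a query of t probes then costs:

$$ \text{Bob sends } O\big(t \log(dt)\big) \text{ bits}, \qquad \text{Alice sends } O(t\,w) \text{ bits}. $$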

52 Challenges for cell probe complexity, I Show a lower bound for a dynamic language membership problem of the form … (for w = 1) or … (for w = log n).

53 Static problem A static problem is specified by a set D of possible data, a set Q of questions to the data, and the answers to the questions, i.e., a function f : D × Q → A.

54 Challenges for cell probe complexity, II Show a lower bound for a static problem of the form: …