On the tightness of Buhrman-Cleve-Wigderson simulation
Shengyu Zhang, The Chinese University of Hong Kong
On the relation between decision tree complexity and communication complexity

Two concrete models
Two concrete models for studying complexity:
– Decision tree complexity
– Communication complexity

Decision Tree Complexity
Task: compute f(x). The input x can be accessed by querying the bits x_i. We only care about the number of queries made.
Query (decision tree) complexity: the minimum number of queries needed.
[Figure: a decision tree for f(x_1, x_2, x_3) = x_1 ∧ (x_2 ∨ x_3): query x_1; if x_1 = 0, output 0; otherwise query x_2; if x_2 = 1, output 1; otherwise query x_3 and output its value.]
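The tree in the figure can be mirrored in code. A minimal Python sketch (the query counter and function names are illustrative, not from the slides):

```python
# Decision tree for f(x1, x2, x3) = x1 AND (x2 OR x3),
# counting how many input bits are queried.

def f_via_queries(x):
    """Evaluate f by querying bits of x; return (value, #queries)."""
    queries = 0

    def query(i):
        nonlocal queries
        queries += 1
        return x[i]

    if query(0) == 0:           # x1 = 0 already forces f = 0
        return 0, queries
    if query(1) == 1:           # x1 = 1 and x2 = 1 force f = 1
        return 1, queries
    return query(2), queries    # otherwise f = x3
```

The worst case queries all three bits (e.g. x = (1, 0, 1)), matching the depth of the tree.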

Randomized/Quantum query models
Randomized query model:
– We can toss a coin to decide the next query.
Quantum query model:
– Instead of coin-tossing, we query all variables in superposition:
|i, a, z⟩ → |i, a ⊕ x_i, z⟩
– i: the position we are interested in
– a: the register holding the queried variable
– z: the other part of the workspace
Σ_{i,a,z} α_{i,a,z} |i, a, z⟩ → Σ_{i,a,z} α_{i,a,z} |i, a ⊕ x_i, z⟩
DT_D(f), DT_R(f), DT_Q(f): deterministic, randomized, and quantum query complexities.

Communication complexity [Yao79]
Two parties, Alice and Bob, jointly compute a function F(x, y), with x known only to Alice and y only to Bob.
Communication complexity: how many bits need to be exchanged? --- CC_D(F)
[Figure: Alice holds x, Bob holds y; they exchange messages to compute F(x, y).]

Various modes
Randomized: Alice and Bob can toss coins, and a small error probability is allowed. --- CC_R(F)
Quantum: Alice and Bob have quantum computers and send quantum messages. --- CC_Q(F)

Applications of CC
Though defined in an information-theoretic setting, communication complexity turns out to provide lower bounds for many computational models:
– Data structures, circuit complexity, streaming algorithms, decision tree complexity, VLSI, algorithmic game theory, optimization, pseudorandomness, …

Question: Any relation between the two well-studied complexity measures?

One simple bound
Composed functions: F(x, y) = (f ∘ g)(x, y) = f(g_1(x^(1), y^(1)), …, g_n(x^(n), y^(n)))
– f is an n-bit function; each g_i is a Boolean function.
– x^(i) is the i-th block of x.
[Thm*1] CC(F) = O(DT(f) · max_i CC(g_i)).
– A log factor is needed in the bounded-error randomized and quantum models.
Proof: Alice runs the DT algorithm for f(z). Whenever she needs z_i, she computes g_i(x^(i), y^(i)) by communicating with Bob.
*1: H. Buhrman, R. Cleve, A. Wigderson. STOC, 1998.
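The proof sketch above can be mirrored in code: Alice runs a decision-tree algorithm for f, and each query z_i is answered by a subprotocol for g_i. A hedged Python sketch (all names, and the flat per-query cost `cc_of_g`, are illustrative assumptions, not the slides' own construction):

```python
# Sketch of the BCW simulation for F(x, y) = f(g_1(x^(1), y^(1)), ...).
# Each decision-tree query z_i costs cc_of_g bits of communication,
# so the total cost is at most DT(f) * max_i CC(g_i).

def bcw_simulate(dt_algorithm, gs, x_blocks, y_blocks, cc_of_g=2):
    """Return (F(x, y), total bits exchanged) under the simulation."""
    bits = 0

    def query_z(i):
        nonlocal bits
        bits += cc_of_g                    # subprotocol evaluating g_i
        return gs[i](x_blocks[i], y_blocks[i])

    return dt_algorithm(query_z), bits

# Toy instance: f = AND of 2 bits, g_i = AND_2, so F = AND of 4 bits.
AND2 = lambda a, b: a & b

def dt_for_and(query):                     # decision tree of depth 2
    if query(0) == 0:
        return 0
    return query(1)

value, cost = bcw_simulate(dt_for_and, [AND2, AND2], [1, 1], [1, 0])
# cost never exceeds DT(f) * cc_of_g = 2 * 2 bits
```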

A lower bound method for DT
Composed functions: F(x, y) = f(g_1(x^(1), y^(1)), …, g_n(x^(n), y^(n)))
[Thm] CC(F) = O(DT(f) · max_i CC(g_i)).
Turning the relation around, we get a lower bound on DT(f) from CC(f(g_1, …, g_n)):
DT(f) = Ω(CC(F) / max_i CC(g_i))
– In particular, if |Domain(g_i)| = O(1), then DT(f) = Ω(CC(f ∘ g)).

How tight is the bound?
Unfortunately, the bound is also known to be loose in general.
f = Parity, g_i = ⊕: F = Parity(x ⊕ y).
Observation: F = Parity(x) ⊕ Parity(y). So CC_D(F) = 1, but DT_Q(f) = Ω(n).
Similar examples:
– f = AND_n, g_i = AND_2;
– f = OR_n, g_i = OR_2.
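The observation behind the looseness is that Parity(x ⊕ y) factors as Parity(x) ⊕ Parity(y), so one message bit suffices for any n. A small sketch (function names are mine):

```python
# F(x, y) = Parity(x XOR y) = Parity(x) XOR Parity(y):
# Alice sends the single bit Parity(x); Bob outputs the answer.

def parity(bits):
    p = 0
    for b in bits:
        p ^= b
    return p

def one_bit_protocol(x, y):
    alice_msg = parity(x)            # the only bit communicated
    return alice_msg ^ parity(y)     # Bob's output equals F(x, y)

x, y = [1, 0, 1, 1], [0, 1, 1, 0]
direct = parity(xi ^ yi for xi, yi in zip(x, y))
assert one_bit_protocol(x, y) == direct
```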

Tightness
Question: Can we choose g_i's s.t. CC(f ∘ g) = Θ(DT(f) · max_i CC(g_i))?
Question: Can we choose g_i's with O(1) input size s.t. CC(f ∘ g) = Θ(DT(f))?
Theorem: ∃ g_i ∈ {∨_2, ∧_2} s.t. CC(f ∘ g) = poly(DT(f)).

More precisely
Theorem 1. For all Boolean functions f,
max_{g_i ∈ {∧,∨}} CC_R(f ∘ g) = Ω(DT_D(f)^{1/3}),
max_{g_i ∈ {∧,∨}} CC_Q(f ∘ g) = Ω(DT_D(f)^{1/6}).
Theorem 2. For all monotone Boolean functions f,
max_{g ∈ {∧^n, ∨^n}} CC_R(f ∘ g) = Ω(DT_D(f)^{1/2}),
max_{g ∈ {∧^n, ∨^n}} CC_Q(f ∘ g) = Ω(DT_D(f)^{1/4}).
– Theorem 2 improves Theorem 1 in both the bounds and the range of the max.

Implications
A fundamental question: Are classical and quantum communication complexities polynomially related?
– Largest known gap: quadratic (by Disjointness).
Corollary: For all Boolean functions f,
max_{g_i ∈ {∧,∨}} CC_D(f ∘ g) = O( (max_{g_i ∈ {∧,∨}} CC_Q(f ∘ g))^6 ).
For all monotone Boolean functions f,
max_{g ∈ {∧^n, ∨^n}} CC_D(f ∘ g) = O( (max_{g ∈ {∧^n, ∨^n}} CC_Q(f ∘ g))^4 ).

Proof
[Block sensitivity]
– f: a function; x: an input.
– x^I (I ⊆ [n]): x with the variables in I flipped.
– bs(f, x): the max number b of disjoint sets I_1, …, I_b, flipping each of which changes the f-value (i.e. f(x) ≠ f(x^{I_j})).
– bs(f) = max_x bs(f, x).
DT_D(f) = O(bs^3(f)) for general Boolean f; DT_D(f) = O(bs^2(f)) for monotone Boolean f.
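For tiny n, block sensitivity at a point can be computed by brute force directly from the definition above. A sketch (exponential-time, purely illustrative; names are mine):

```python
# Brute-force bs(f, x): the largest family of pairwise-disjoint blocks,
# each of which flips the value of f at x.
from itertools import chain, combinations

def flip(x, block):
    return tuple(b ^ (i in block) for i, b in enumerate(x))

def bs_at(f, x):
    n = len(x)
    # collect all nonempty sensitive blocks
    blocks = [frozenset(c)
              for r in range(1, n + 1)
              for c in combinations(range(n), r)
              if f(flip(x, c)) != f(x)]
    # search for a largest pairwise-disjoint family among them
    for r in range(len(blocks), 0, -1):
        for fam in combinations(blocks, r):
            if len(frozenset(chain(*fam))) == sum(len(b) for b in fam):
                return r
    return 0

OR3 = lambda x: int(any(x))
# At x = (0, 0, 0), the three singleton blocks are disjoint and sensitive,
# so bs(OR3, (0, 0, 0)) = 3.
```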

Through block sensitivity
Goal:
max_{g_i ∈ {∧,∨}} CC_R(f ∘ g) = Ω(DT_D(f)^{1/3}),
max_{g_i ∈ {∧,∨}} CC_Q(f ∘ g) = Ω(DT_D(f)^{1/6}).
Known: DT_D(f) = O(bs^3(f)) for general Boolean f. So it is enough to prove
max_{g_i ∈ {∧,∨}} CC_R(f ∘ g) = Ω(bs(f)),
max_{g_i ∈ {∧,∨}} CC_Q(f ∘ g) = Ω(√bs(f)).

Disjointness
Disj(x, y) = OR(x ∧ y). UDisj(x, y): Disj with the promise that |x ∧ y| ≤ 1.
Theorem: CC_R(UDisj) = Θ(n) *1; CC_Q(UDisj) = Θ(√n) *2.
Idea (for our proof): Pick g_i's s.t. f ∘ g embeds an instance of UDisj of size bs(f).
*1: B. Kalyanasundaram and G. Schnitger, SIAM J. Discrete Math.; Z. Bar-Yossef, T. Jayram, R. Kumar, D. Sivakumar, JCSS; A. Razborov, TCS.
*2: A. Razborov, IM; A. Sherstov, SIAM J. Comput.

bs is Unique-OR of flipping blocks
Protocol for f(g_1, …, g_n) → Protocol for UDisj_b, where b = bs(f).
Input (x, y) ∈ {0,1}^{2b} ↦ Input (x', y') ∈ {0,1}^{2n}:
– Suppose bs(f) is achieved by z and blocks I_1, …, I_b.
– i ∉ any block: x'_i = y'_i = z_i, g_i = ∧.
– i ∈ I_j: x'_i = x_j, y'_i = y_j, g_i = ∧, if z_i = 0;
  x'_i = ¬x_j, y'_i = ¬y_j, g_i = ∨, if z_i = 1.
Then x_j ∧ y_j = 1 ⇔ g_i(x'_i, y'_i) = ¬z_i for all i ∈ I_j, and x_j ∧ y_j = 0 ⇔ g_i(x'_i, y'_i) = z_i for all i ∈ I_j.
Hence ∃! j s.t. g(x', y') = z^{I_j} ⇔ ∃! j s.t. x_j ∧ y_j = 1.
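The mapping above can be sketched directly in code: given z and its disjoint sensitive blocks, a UDisj_b instance (x, y) is embedded into an input of f ∘ g so that block I_j flips exactly when x_j ∧ y_j = 1. The helper names `embed` and `apply_g` are mine, not the slides':

```python
# Embedding a UDisj_b instance (x, y) into an input of
# f(g_1(x'_1, y'_1), ..., g_n(x'_n, y'_n)), following the slide's cases.

def embed(z, blocks, x, y):
    x2, y2, g = [], [], []
    owner = {i: j for j, blk in enumerate(blocks) for i in blk}
    for i in range(len(z)):
        if i not in owner:               # outside every block: pin to z_i
            x2.append(z[i]); y2.append(z[i]); g.append('AND')
        elif z[i] == 0:                  # flips to 1 iff x_j AND y_j = 1
            j = owner[i]
            x2.append(x[j]); y2.append(y[j]); g.append('AND')
        else:                            # z_i = 1: OR with negated inputs
            j = owner[i]
            x2.append(1 - x[j]); y2.append(1 - y[j]); g.append('OR')
    return x2, y2, g

def apply_g(x2, y2, g):
    return [a & b if op == 'AND' else a | b
            for a, b, op in zip(x2, y2, g)]

# Example: z = (0, 0, 0) with singleton blocks; only x_1 AND y_1 = 1,
# so exactly block I_1 is flipped in the output.
z, blocks = (0, 0, 0), [(0,), (1,), (2,)]
out = apply_g(*embed(z, blocks, [0, 1, 0], [1, 1, 0]))
```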

Concluding remarks
For monotone functions, observe that each sensitive block contains all-0 or all-1 variables.
Using the pattern matrix method*1 and its extension*2, one can show that CC_Q(f ∘ g) = Ω(deg_ε(f)) for some constant-size functions g.
– This improves the previous bound, since deg_ε(f) = Ω(bs(f)^{1/2}).
*1: A. Sherstov, SIAM J. Comput., 2009.
*2: T. Lee, S. Zhang, manuscript, 2008.

About the embedding idea
Theorem*1: CC_R(NAND-formula ∘ NAND) = Ω(n/8^d).
The simple idea of embedding a Disj instance was later applied to show depth-independent lower bounds:
– CC_R = Ω(n^{1/2}),
– CC_Q = Ω(n^{1/4}).
(arXiv preprint, with Jain and Klauck.)
*1: Leonardos and Saks, CCC 2009; Jayram, Kopparty and Raghavendra, CCC 2009.

Question: Can we choose g i ’s s.t. CC(f ∘ g) = Θ(DT(f) max i CC(g i ))?