Imperfectly Shared Randomness

Presentation transcript:

Imperfectly Shared Randomness in Communication Madhu Sudan Microsoft Research Joint work with Clément Canonne (Columbia), Venkatesan Guruswami (CMU) and Raghu Meka (UCLA). 03/26/2015 MSR Theory Day: ISR in Communication

Communication Complexity: the model (with shared randomness). Alice holds x, Bob holds y, and they share randomness R; they must output f(x,y) ∈ Σ correctly with probability 2/3. CC(f) = number of bits exchanged by the best protocol.

Communication Complexity - Motivation. Standard use: lower bounds for circuit complexity, streaming, data structures, extended formulations, … This talk: communication complexity as a model for communication. Food for thought: Shannon vs. Yao, which is the right model? Typical (human-human or computer-computer) communication involves large context and brief communication, and the contexts are only imperfectly shared.

This talk: Example problems with low communication complexity. A new form of uncertainty and how to overcome it.

Aside: Easy CC Problems
Equality testing: EQ(x,y) = 1 ⇔ x = y; CC(EQ) = O(1). Protocol: fix an error-correcting code E: {0,1}^n → {0,1}^N; shared randomness picks i ← [N]; exchange E(x)_i, E(y)_i; accept iff E(x)_i = E(y)_i.
Hamming distance: H_k(x,y) = 1 ⇔ Δ(x,y) ≤ k; CC(H_k) = O(k log k) [Huang et al.].
Small set intersection: ∩_k(x,y) = 1 ⇔ wt(x), wt(y) ≤ k and ∃i s.t. x_i = y_i = 1; CC(∩_k) = O(k) [Håstad, Wigderson]. Protocol: use common randomness to hash [n] → [k^2]; poly(k) communication.
Gap (Real) Inner Product: x, y ∈ ℝ^n with ‖x‖_2 = ‖y‖_2 = 1, where ⟨x,y⟩ ≜ Σ_i x_i y_i; GIP_{c,s}(x,y) = 1 if ⟨x,y⟩ ≥ c, = 0 if ⟨x,y⟩ ≤ s; CC(GIP_{c,s}) = O(1/(c-s)^2) [Alon, Matias, Szegedy]. Main insight: if G ← N(0,1)^n, then 𝔼[⟨G,x⟩⋅⟨G,y⟩] = ⟨x,y⟩.
(Thanks to Badih Ghazi and Pritish Kamath.)
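
The equality protocol above can be made concrete with a short simulation. A minimal sketch, assuming Python, using random parities (coordinates of a Hadamard-style encoding) as a stand-in for the code E; the function name and repetition count are illustrative choices, not part of the original protocol statement.

```python
import random

def equality_test(x, y, shared_seed, reps=4):
    """One-bit-per-round equality test with (perfectly) shared randomness.

    Each round compares one random-parity coordinate <r, x> mod 2 of the two
    inputs.  If x == y the bits always agree; if x != y they disagree with
    probability 1/2 per round, so `reps` rounds err with probability 2**-reps.
    """
    rng = random.Random(shared_seed)              # the shared randomness R
    for _ in range(reps):
        r = [rng.randint(0, 1) for _ in range(len(x))]
        alice_bit = sum(ri * xi for ri, xi in zip(r, x)) % 2   # Alice's 1-bit message
        bob_bit = sum(ri * yi for ri, yi in zip(r, y)) % 2     # Bob's local bit
        if alice_bit != bob_bit:
            return False                          # certainly x != y
    return True                                   # wrong only with prob. <= 2**-reps
```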

Uncertainty in Communication. Overarching question: are there communication mechanisms that can overcome uncertainty? What is uncertainty? This talk: Alice and Bob don't share randomness perfectly, only approximately.

Rest of this talk. Model: imperfectly shared randomness. Positive results: coping with imperfectly shared randomness. Negative results: analyzing the weakness of imperfectly shared randomness.

Model: Imperfectly Shared Randomness. Alice gets r and Bob gets s, where (r,s) is an i.i.d. sequence of correlated pairs (r_i, s_i): r_i, s_i ∈ {-1,+1}, 𝔼[r_i] = 𝔼[s_i] = 0, 𝔼[r_i s_i] = ρ ≥ 0. Notation: isr_ρ(f) = cc of f with ρ-correlated bits; cc(f) = isr_1(f) is the cc with perfectly shared randomness; priv(f) = isr_0(f) is the cc with private randomness. Note that ρ ≤ τ ⇒ isr_ρ(f) ≥ isr_τ(f). Starting point: for Boolean functions f, cc(f) ≤ isr_ρ(f) ≤ priv(f) ≤ cc(f) + log n. What if cc(f) ≪ log n? E.g., cc(f) = O(1)?
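
A minimal sketch of this randomness source, assuming Python; the function name and interface are illustrative. Bob's bit agrees with Alice's with probability (1+ρ)/2, which gives 𝔼[r_i s_i] = ρ as required.

```python
import random

def correlated_bits(n, rho, seed=None):
    """Sample n i.i.d. pairs (r_i, s_i) in {-1,+1} with E[r_i] = E[s_i] = 0
    and E[r_i s_i] = rho (rho >= 0): a sketch of the ISR randomness source."""
    rng = random.Random(seed)
    r, s = [], []
    for _ in range(n):
        ri = rng.choice((-1, 1))
        # s_i agrees with r_i w.p. (1+rho)/2 and disagrees otherwise
        si = ri if rng.random() < (1 + rho) / 2 else -ri
        r.append(ri)
        s.append(si)
    return r, s
```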

Results. The model was first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication and general models of correlation; they show isr(Equality) = O(1), among other things. Our results: in general, cc(f) ≤ k ⇒ isr(f) ≤ 2^k. Converse: ∃f with cc(f) ≤ k and isr(f) ≥ 2^k.

Equality Testing (our proof). Key idea: think inner products. Encode x ↦ X = E(x) and y ↦ Y = E(y) with X, Y ∈ {-1,+1}^N, so that x = y ⇒ ⟨X,Y⟩ = N and x ≠ y ⇒ ⟨X,Y⟩ ≤ N/2. Estimating inner products (building on sketching protocols), the "Gaussian protocol": Alice picks Gaussians G_1, …, G_t ∈ ℝ^N and sends the index i ∈ [t] maximizing ⟨G_i, X⟩ to Bob; Bob accepts iff ⟨G'_i, Y⟩ ≥ 0. Analysis: O_ρ(1) bits suffice if G ≈_ρ G'.
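
A minimal simulation sketch of this Gaussian protocol, assuming Python with numpy. For simplicity it draws Bob's Gaussians directly as ρ-correlated copies of Alice's (rather than deriving them from ρ-correlated bits as in the actual model), and the choice t = 16 is illustrative, not the constant from the analysis.

```python
import numpy as np

def gaussian_protocol(X, Y, rho, t=16, seed=0):
    """One-way equality test under imperfectly shared Gaussians.

    X, Y are +/-1 encodings E(x), E(y).  Alice sends the index (log t bits) of
    her Gaussian with the largest inner product with X; Bob accepts iff his
    correlated copy of that Gaussian has nonnegative inner product with Y.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((t, X.size))              # Alice's Gaussians
    noise = rng.standard_normal((t, X.size))
    G_bob = rho * G + np.sqrt(1 - rho**2) * noise     # Bob's rho-correlated copies
    i = int(np.argmax(G @ X))                         # Alice's message: log t bits
    return bool(G_bob[i] @ Y >= 0)                    # Bob's decision
```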

General One-Way Communication. Idea: all communication reduces to inner products. (For now, assume one-way-cc(f) ≤ k.) For each random string R: Alice's message is i_R ∈ [2^k]; Bob's output is f_R(i_R), where f_R: [2^k] → {0,1}. With probability ≥ 2/3 over R, f_R(i_R) is the right answer.

General One-Way Communication. For each random string R: Alice's message is i_R ∈ [2^k]; Bob's output is f_R(i_R), where f_R: [2^k] → {0,1}; with probability ≥ 2/3, f_R(i_R) is the right answer. Vector representation: i_R ↦ x_R ∈ {0,1}^{2^k} (unit coordinate vector) and f_R ↦ y_R ∈ {0,1}^{2^k} (truth table of f_R). Then f_R(i_R) = ⟨x_R, y_R⟩, and the acceptance probability ∝ ⟨X,Y⟩, where X = (x_R)_R and Y = (y_R)_R. The Gaussian protocol estimates inner products of unit vectors to within ±ε with O_ρ(1/ε^2) communication.
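
The translation from a one-way protocol to an inner product can be spelled out in a few lines. A minimal sketch, assuming Python with numpy; `messages` and `tables` are hypothetical stand-ins for the protocol's behavior on one fixed input pair (one entry per shared random string R).

```python
import numpy as np

def acceptance_as_inner_product(messages, tables):
    """messages[R] = Alice's message i_R in range(2**k);
    tables[R]   = 0/1 truth table of Bob's function f_R (length 2**k).
    Returns the protocol's acceptance probability, computed as the
    normalized inner product <X, Y> of the two stacked vectors."""
    num_R = len(messages)
    X = np.zeros((num_R, len(tables[0])))
    Y = np.array(tables, dtype=float)       # rows are the truth tables y_R
    for R, i_R in enumerate(messages):
        X[R, i_R] = 1.0                     # unit coordinate vector x_R
    return float((X * Y).sum()) / num_R     # = average over R of f_R(i_R)
```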

Two-way communication. Still decided by inner products. Simple lemma: there exist convex sets K_A^k, K_B^k ⊆ ℝ^{2^k} that describe private-coin k-bit communication strategies for Alice and Bob, such that the acceptance probability of π_A ∈ K_A^k against π_B ∈ K_B^k equals ⟨π_A, π_B⟩. Putting things together, Theorem: cc(f) ≤ k ⇒ isr(f) ≤ O_ρ(2^k).

Main Technical Result: Matching Lower Bound. Theorem: there exists a (promise) problem f such that cc(f) ≤ k but isr_ρ(f) ≥ exp(k). The problem: Gap Sparse Inner Product (G-Sparse-IP). Alice gets a sparse x ∈ {0,1}^n with wt(x) ≈ 2^{-k}·n; Bob gets y ∈ {0,1}^n. Promise: either ⟨x,y⟩ ≥ 0.9·2^{-k}·n or ⟨x,y⟩ ≤ 0.6·2^{-k}·n; decide which.

psr Protocol for G-Sparse-IP. Note: the Gaussian protocol takes O(2^k) bits; we need to do exponentially better. Idea: x_i ≠ 0 ⇒ y_i is correlated with the answer, so use (perfectly) shared randomness to find a random index i with x_i ≠ 0. Shared randomness: indices i_1, i_2, i_3, … uniform over [n]. Alice → Bob: the smallest index j such that x_{i_j} ≠ 0. Bob: accept iff y_{i_j} = 1. We expect j ≈ 2^k, so cc ≤ k.
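
A minimal sketch of this psr protocol, assuming Python; the function name and the probe cap are illustrative choices rather than part of the protocol.

```python
import random

def g_sparse_ip_protocol(x, y, shared_seed, max_probes=10**6):
    """Perfectly-shared-randomness protocol for G-Sparse-IP.

    The shared seed defines a common stream of indices i_1, i_2, ... uniform
    over [n].  Alice sends the smallest j with x[i_j] != 0 (about k bits,
    since the first hit has index ~2^k in expectation when wt(x) ~ 2^-k * n);
    Bob accepts iff y[i_j] = 1.  The acceptance probability is <x,y> / wt(x),
    which separates the 0.9 and 0.6 promise cases.
    """
    rng = random.Random(shared_seed)      # shared randomness
    n = len(x)
    for _ in range(max_probes):
        i = rng.randrange(n)              # next shared index i_j
        if x[i] != 0:                     # Alice's side: first nonzero hit
            return y[i] == 1              # Bob's side: check his bit there
    return False                          # give up (e.g. x was all-zero)
```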

Lower Bound Idea. Catch: for every distribution there is a deterministic protocol with communication ≤ k, so we must fix the strategy first and construct the distribution later. Main idea, as a case analysis: the protocol can look like the G-Sparse-IP protocol above, but that implies the players can agree on a common index to focus on, and agreement is hard; the protocol can ignore the sparsity, but then it requires 2^k bits; and the protocol can look like anything, which is handled via the invariance principle [MOO, Mossel …].

Summarizing: k bits of communication with perfect sharing can be simulated with 2^k bits under imperfect sharing, and this is tight. Ingredients of the lower bound: an invariance principle for communication, agreement distillation, and low-influence strategies.

Conclusions. Imperfect agreement of context is important: it is a new layer of uncertainty to deal with, with its own notion of scale (the context is LARGE). Many open directions and questions for imperfectly shared randomness: one-sided error? Does interaction ever help? How much randomness is needed? More general forms of correlation?

Thank You!