Imperfectly Shared Randomness


1 Imperfectly Shared Randomness in Communication
Madhu Sudan, Microsoft Research. Joint work with Clément Canonne (Columbia), Venkatesan Guruswami (CMU) and Raghu Meka (UCLA).
03/26/2015 MSR Theory Day: ISR in Communication

2 Communication Complexity
The model (with shared randomness): Alice holds x, Bob holds y, and they share randomness R = $$$. They wish to compute f: (x,y) ↦ Σ, outputting f(x,y) w.p. 2/3.
CC(f) = # bits exchanged by the best protocol.

3 Communication Complexity - Motivation
Standard use: lower bounds for circuit complexity, streaming, data structures, extended formulations …
This talk: communication complexity as a model for communication.
Food for thought: Shannon vs. Yao? Which is the right model?
Typical (human-human / computer-computer) communication involves large context and brief communication, and the contexts are imperfectly shared.

4 This talk
Example problems with low communication complexity.
A new form of uncertainty, and how to overcome it.

5 Aside: Easy CC Problems
Equality testing: EQ(x,y) = 1 ⇔ x = y; CC(EQ) = O(1).
Protocol: fix an ECC E: {0,1}^n → {0,1}^N; shared randomness i ← [N]; exchange E(x)_i, E(y)_i; accept iff E(x)_i = E(y)_i.
Hamming distance: H_k(x,y) = 1 ⇔ Δ(x,y) ≤ k; CC(H_k) = O(k log k) [Huang et al.].
Protocol: use common randomness to hash [n] → [k^2]; poly(k) communication.
Small set intersection: ∩_k(x,y) = 1 ⇔ wt(x), wt(y) ≤ k and ∃i s.t. x_i = y_i = 1; CC(∩_k) = O(k) [Håstad, Wigderson].
Gap (Real) Inner Product: x, y ∈ ℝ^n with ‖x‖_2 = ‖y‖_2 = 1; GIP_{c,s}(x,y) = 1 if ⟨x,y⟩ ≥ c, = 0 if ⟨x,y⟩ ≤ s, where ⟨x,y⟩ ≜ Σ_i x_i y_i; CC(GIP_{c,s}) = O(1/(c−s)^2) [Alon, Matias, Szegedy].
Main insight: if G ← N(0,1)^n, then E[⟨G,x⟩ · ⟨G,y⟩] = ⟨x,y⟩.
Thanks to Badih Ghazi and Pritish Kamath.
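As a concrete illustration of the equality-testing protocol on this slide, here is a minimal Python sketch. The function name and the specific choice of code are mine, not from the talk: rather than fixing one ECC coordinate, each round uses a shared random string r to compare one parity bit ⟨x,r⟩ mod 2 (a coordinate of the Hadamard encoding), which matches whenever x = y and differs w.p. 1/2 per round otherwise.

```python
import random

def equality_protocol(x, y, reps=20, seed=0):
    """Sketch of randomized equality testing with shared randomness.

    Each shared random 0/1 string r defines one parity bit <x, r> mod 2.
    If x == y the parties' bits always agree; if x != y they disagree
    with probability 1/2 per round, so the error after `reps` rounds
    is at most 2**(-reps).
    """
    rng = random.Random(seed)  # stands in for the shared random string
    n = len(x)
    for _ in range(reps):
        r = [rng.randint(0, 1) for _ in range(n)]
        alice_bit = sum(xi * ri for xi, ri in zip(x, r)) % 2  # Alice sends this
        bob_bit = sum(yi * ri for yi, ri in zip(y, r)) % 2    # Bob compares
        if alice_bit != bob_bit:
            return False  # inputs are certainly unequal
    return True  # equal with probability >= 1 - 2**(-reps)
```

Note the total communication is O(reps) bits regardless of n, which is the point of the O(1) bound (for constant error).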

6 Uncertainty in Communication
Overarching question: are there communication mechanisms that can overcome uncertainty? And what is uncertainty?
This talk: Alice and Bob don't share randomness perfectly, only approximately.

7 Rest of this talk
Model: imperfectly shared randomness.
Positive results: coping with imperfectly shared randomness.
Negative results: analyzing the weakness of imperfectly shared randomness.

8 Model: Imperfectly Shared Randomness
Alice ← r and Bob ← s, where (r,s) is an i.i.d. sequence of correlated pairs (r_i, s_i)_i with r_i, s_i ∈ {−1,+1}, E[r_i] = E[s_i] = 0, and E[r_i s_i] = ρ ≥ 0.
Notation:
isr_ρ(f): cc of f with ρ-correlated bits.
cc(f): perfectly shared randomness cc (= isr_1(f)).
priv(f): cc with PRIVate randomness (= isr_0(f)).
Monotonicity: ρ ≤ τ ⇒ isr_ρ(f) ≥ isr_τ(f).
Starting point: for Boolean functions f,
cc(f) ≤ isr_ρ(f) ≤ priv(f) ≤ cc(f) + log n.
What if cc(f) ≪ log n? E.g. cc(f) = O(1)?
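The ρ-correlated source in this model is easy to simulate, which is handy for sanity-checking protocols. A minimal sketch (function name mine): s_i copies r_i with probability (1+ρ)/2 and flips it otherwise, giving E[r_i s_i] = (1+ρ)/2 − (1−ρ)/2 = ρ.

```python
import random

def correlated_bits(n, rho, seed=0):
    """Sample n pairs (r_i, s_i) in {-1,+1} with E[r_i] = E[s_i] = 0
    and E[r_i s_i] = rho, as in the ISR model on this slide.
    """
    rng = random.Random(seed)
    r, s = [], []
    for _ in range(n):
        ri = rng.choice([-1, 1])                       # Alice's bit: uniform
        si = ri if rng.random() < (1 + rho) / 2 else -ri  # Bob's noisy copy
        r.append(ri)
        s.append(si)
    return r, s
```

ρ = 1 recovers perfectly shared randomness and ρ = 0 gives independent (effectively private) randomness, matching isr_1(f) = cc(f) and isr_0(f) = priv(f).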

9 Results
Model first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication and general models of correlation. They show isr(Equality) = O(1) (among other things).
Our results:
Generally: cc(f) ≤ k ⇒ isr(f) ≤ 2^k.
Converse: ∃f with cc(f) ≤ k and isr(f) ≥ 2^k.

10 Equality Testing (our proof)
Key idea: think inner products.
Encode x ↦ X = E(x), y ↦ Y = E(y), with X, Y ∈ {−1,+1}^N:
x = y ⇒ ⟨X,Y⟩ = N; x ≠ y ⇒ ⟨X,Y⟩ ≤ N/2.
Estimating inner products: building on sketching protocols …
The Gaussian protocol:
Alice: picks Gaussians G_1, …, G_t ∈ ℝ^N and sends the i ∈ [t] maximizing ⟨G_i, X⟩ to Bob.
Bob: accepts iff ⟨G'_i, Y⟩ ≥ 0.
Analysis: O_ρ(1) bits suffice if G ≈_ρ G'.
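A minimal sketch of the Gaussian protocol in the perfectly shared case (G' = G, i.e. ρ = 1); the function name and parameter choices are illustrative, and the imperfectly shared case would draw Bob's Gaussians ρ-correlated with Alice's instead of identical.

```python
import random

def gaussian_protocol(X, Y, t=16, seed=0):
    """One round of the Gaussian protocol with shared Gaussians.

    Alice sends the index i maximizing <G_i, X>  (log t bits);
    Bob accepts iff <G_i, Y> >= 0.  The acceptance probability is
    increasing in <X, Y>, which is what lets the parties estimate it.
    """
    rng = random.Random(seed)  # stands in for the shared randomness
    N = len(X)
    G = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(t)]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    i = max(range(t), key=lambda j: dot(G[j], X))  # Alice's message
    return dot(G[i], Y) >= 0                       # Bob's decision
```

When Y = X the chosen projection ⟨G_i, X⟩ is the maximum of t Gaussians, hence positive with overwhelming probability; when Y is anti-correlated with X the same projection is negative, so repeating the round separates the two cases.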

11 General One-Way Communication
Idea: all communication reduces to inner products. (For now, assume one-way-cc(f) ≤ k.)
For each random string R:
Alice's message = i_R ∈ [2^k].
Bob's output = f_R(i_R), where f_R: [2^k] → {0,1}.
W.p. ≥ 2/3 over R, f_R(i_R) is the right answer.

12 General One-Way Communication
For each random string R:
Alice's message = i_R ∈ [2^k].
Bob's output = f_R(i_R), where f_R: [2^k] → {0,1}.
W.p. ≥ 2/3, f_R(i_R) is the right answer.
Vector representation:
i_R ↦ x_R ∈ {0,1}^{2^k} (unit coordinate vector);
f_R ↦ y_R ∈ {0,1}^{2^k} (truth table of f_R).
Then f_R(i_R) = ⟨x_R, y_R⟩, so the acceptance probability ∝ ⟨X,Y⟩ with X = (x_R)_R and Y = (y_R)_R.
The Gaussian protocol estimates inner products of unit vectors to within ±ε with O_ρ(1/ε^2) communication.
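The vector representation above can be spelled out in a few lines; the helper below (hypothetical name, written for one fixed random string R) builds the unit coordinate vector and the truth table and confirms that their inner product recovers Bob's output f_R(i_R).

```python
def accept_via_inner_product(i_R, f_R, k):
    """Evaluate f_R(i_R) through the slide's vector representation.

    i_R in [2^k] becomes the unit coordinate vector x_R, and the
    decision function f_R becomes its truth table y_R; then
    f_R(i_R) = <x_R, y_R>.
    """
    size = 2 ** k
    x_R = [1 if j == i_R else 0 for j in range(size)]  # unit coordinate vector
    y_R = [f_R(j) for j in range(size)]                # truth table of f_R
    return sum(a * b for a, b in zip(x_R, y_R))        # equals f_R(i_R)
```

Stacking these vectors over all random strings R gives the long unit vectors X and Y whose inner product is the protocol's acceptance probability, which the Gaussian protocol then estimates.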

13 Two-way communication
Still decided by inner products.
Simple lemma: ∃ convex sets K_A^k, K_B^k ⊆ ℝ^{2^k} that describe private-coin k-bit communication strategies for Alice and Bob, s.t. the acceptance probability of π_A ∈ K_A^k, π_B ∈ K_B^k equals ⟨π_A, π_B⟩.
Putting things together:
Theorem: cc(f) ≤ k ⇒ isr(f) ≤ O_ρ(2^k).

14 Main Technical Result: Matching lower bound
Theorem: there exists a (promise) problem f s.t. cc(f) ≤ k but isr_ρ(f) ≥ exp(k).
The problem: Gap Sparse Inner Product (G-Sparse-IP).
Alice gets a sparse x ∈ {0,1}^n with wt(x) ≈ 2^{−k}·n; Bob gets y ∈ {0,1}^n.
Promise: ⟨x,y⟩ ≥ (.9)·2^{−k}·n or ⟨x,y⟩ ≤ (.6)·2^{−k}·n. Decide which.

15 psr Protocol for G-Sparse-IP
Note: the Gaussian protocol takes O(2^k) bits; we need to do exponentially better.
Idea: x_i ≠ 0 ⇒ y_i is correlated with the answer. Use (perfectly) shared randomness to find a random index i s.t. x_i ≠ 0:
Shared randomness: i_1, i_2, i_3, … uniform over [n].
Alice → Bob: the smallest index j s.t. x_{i_j} ≠ 0.
Bob: accept iff y_{i_j} = 1.
Expect j ≈ 2^k, so cc ≤ k.
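The psr protocol on this slide is short enough to sketch directly; the function name is mine, and a seeded Python RNG stands in for the shared index sequence i_1, i_2, …

```python
import random

def gsparse_ip_round(x, y, seed=0):
    """One round of the psr protocol for G-Sparse-IP.

    Both parties scan the shared uniform indices i_1, i_2, ... over [n];
    Alice sends the smallest j with x[i_j] != 0 (about k bits in
    expectation, since j ~ 2^k when wt(x) ~ 2^-k * n), and Bob
    accepts iff y[i_j] = 1.
    """
    rng = random.Random(seed)  # stands in for the shared index stream
    n = len(x)
    while True:
        i = rng.randrange(n)   # next shared index i_j
        if x[i] != 0:
            return y[i] == 1   # Bob's decision on the sampled coordinate
```

Since the sampled index is uniform over the support of x, the acceptance probability equals ⟨x,y⟩/wt(x), so repeating over independent rounds and comparing the acceptance frequency against the .9 vs .6 thresholds decides the promise.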

16 Lower Bound Idea
Catch: for every distribution there is a deterministic protocol with communication ≤ k, so we must fix the strategy first and construct the distribution later.
Main idea:
The protocol can look like G-Sparse-IP, but that implies the players can agree on a common index to focus on … and agreement is hard.
The protocol can ignore sparsity, but then it requires 2^k bits.
The protocol can look like anything! → Invariance principle [MOO, Mossel …].

17 Summarizing
k bits of communication with perfect sharing → 2^k bits with imperfect sharing, and this is tight.
Ingredients: an invariance principle for communication, agreement distillation, low-influence strategies.

18 Conclusions
Imperfect agreement of context is important: it is a new layer of uncertainty to deal with, and it brings a notion of scale (the context is LARGE).
Many open directions and questions for imperfectly shared randomness: One-sided error? Does interaction ever help? How much randomness is needed? More general forms of correlation?

19 Thank You!

