
1
Interactive Channel Capacity
Gillat Kol (IAS), joint work with Ran Raz (Weizmann + IAS)

2
"A Mathematical Theory of Communication"
Claude Shannon, 1948
An exact formula for the channel capacity of any noisy channel

3
Shannon: Channel Capacity
ε-noisy channel: each bit is flipped with probability ε (independently)
Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so that Bob can retrieve the message w.p. 1-o(1)?
– Is the blow-up even constant?
[Figure: Alice sends n bits to Bob over the noiseless channel vs. ? bits over the ε-noisy channel; each bit arrives intact w.p. 1-ε]

4
Shannon: Channel Capacity
ε-noisy channel: each bit is flipped with probability ε (independently)
Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so that Bob can retrieve the message w.p. 1-o(1)?
[Shannon '48]: # bits ≈ n / (1 - H(ε))
– Entropy function H(ε) = -ε log(ε) - (1-ε) log(1-ε)
– Matching upper and lower bounds
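The capacity formula is easy to sanity-check numerically. A minimal sketch (the function names `H` and `shannon_bits` are ours):

```python
import math

def H(eps: float) -> float:
    """Binary entropy: H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps)."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def shannon_bits(n: int, eps: float) -> float:
    """Bits needed over the eps-noisy channel to transmit an n-bit
    message, per Shannon '48: n / (1 - H(eps))."""
    return n / (1 - H(eps))
```

At ε ≈ 0.11 the binary entropy is about 1/2, so the message roughly doubles in length; as ε approaches 1/2, H(ε) approaches 1 and the cost blows up.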

5
Us: Interactive Channel Capacity
Alice and Bob want to have an n-bit-long conversation. How many bits do they need to send over the ε-noisy channel, so that both can retrieve the transcript w.p. 1-o(1)?
[Figure: an n-bit conversation over the noiseless channel vs. ? bits over the ε-noisy channel]

6
Communication Complexity
Setting: Alice has input x, Bob has input y. They want to compute f(x,y) (f is publicly known)
Communication Complexity of f: the least number of bits they need to communicate
– Deterministic, CC(f): ∀x,y, compute f(x,y) w.p. 1
– Randomized, RCC(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players share a random string
– Noisy, CC_ε(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players communicate over the ε-noisy channel and share a random string

7
Def: Interactive Channel Capacity
– RCC(f) = randomized CC (over the noiseless channel)
– CC_ε(f) = noisy CC (over the ε-noisy channel)
* Results hold when we use CC(f) instead of RCC(f)
* Results hold for worst-case & average-case RCC(f), CC_ε(f)

8
Def: Interactive Channel Capacity
– RCC(f) = randomized CC (over the noiseless channel)
– CC_ε(f) = noisy CC (over the ε-noisy channel)
For f(x,y) = x (message transmission), we get the standard channel capacity
– Interactive channel capacity ≤ channel capacity
In the interactive case, an error in the first bit may cause the whole conversation to be meaningless; we may need to "encode" every bit separately.

9
Previous Works
[Schulman '92]:
– Theorem: if RCC(f) = n then CC_ε(f) ≤ O(n)
– Corollary: C(ε) > 0
– Open question: is the interactive channel capacity equal to the channel capacity?
Many other works [Sch, BR, B, GMS, BK, BN, FGOS, …]:
– Simulation of any communication protocol with adversarial noise
– Large constants, never made explicit

10
Our Results


12
Channel Types
Synchronous channel (this work): exactly one player sends a bit at each time step
– The order of turns in a protocol is pre-determined (independent of the inputs, randomness, and noise); otherwise the players may send bits at the same time
– Alternating turns is a special case
Asynchronous channel: if both players send bits at the same time, these bits are lost
Two channels: each player can send a bit at any time

14
Pointer Jumping
Example f with CC_ε > RCC: the 2^k-ary Pointer Jumping Game
Parameters:
– 2^k-ary tree, depth d
– k = O(1), d → ∞
– ε = log k / k^2
Alice owns the odd layers, Bob owns the even layers
Pointer Jumping Game:
– Inputs: each player gets an edge going out of every node he owns
– Goal: find the leaf reached
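To make the game concrete, here is a toy noiseless implementation (our own sketch, not from the talk). Each player's input maps every node he owns, identified by its path from the root, to one of its 2^k children:

```python
import random

def play_pjg(k: int, d: int, alice: dict, bob: dict) -> tuple:
    """Noiseless pointer jumping: starting at the root, the player who
    owns the current layer reveals the outgoing edge (k bits naming one
    of 2**k children); after d jumps the reached leaf is known."""
    node = ()  # path from the root, as a tuple of child indices
    for layer in range(1, d + 1):
        owner = alice if layer % 2 == 1 else bob  # Alice owns odd layers
        node = node + (owner[node],)              # k bits communicated
    return node

def random_inputs(k: int, d: int, seed: int = 0):
    """Random edge choices for every node on each player's layers."""
    rng = random.Random(seed)
    alice, bob = {}, {}
    def fill(node, layer):
        if layer > d:
            return
        (alice if layer % 2 == 1 else bob)[node] = rng.randrange(2 ** k)
        for child in range(2 ** k):
            fill(node + (child,), layer + 1)
    fill((), 1)
    return alice, bob
```

On the noiseless channel the cost is exactly kd bits; the slides ask how much this grows over the ε-noisy channel.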

15
Pointer Jumping
[Figure: a 2^k-ary tree of depth d]

16
Bounding CC_ε(PJG) - The Idea (ε = log k / k^2)
"Any good PJG protocol does the following:"
Alice starts by sending the first edge (k bits)
– W.p. ≈ εk, a bit was flipped
Case 1: Alice sends additional bits to correct the first edge
– Even if only a single error occurred and Alice knows its index, she needs to send the index: a log k bit waste
Case 2: Bob sends the next edge (k bits)
– W.p. ≈ εk these k bits are wasted, as Bob had the wrong first edge
– In expectation: εk · k = εk^2 = log k bits wasted
In both cases, sending the first edge costs k + Ω(log k)!
– ε was chosen to balance the two losses
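The choice ε = log k / k^2 makes the two expected losses coincide; a quick numeric check (our own sketch, the function name is ours):

```python
import math

def expected_waste(k: int):
    """Expected extra cost of each case, with eps = log2(k) / k**2."""
    eps = math.log2(k) / k ** 2
    case1 = math.log2(k)   # Alice resends the flipped bit's index: log k bits
    case2 = (eps * k) * k  # w.p. ~eps*k Bob's k bits are wasted
    return case1, case2
```

For any k the two quantities are both log k; e.g. `expected_waste(64)` gives (6.0, 6.0).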

17
Bounding CC_ε(PJG) - More Formally (ε = log k / k^2)
Let the players exchange the first 1.25k bits of the protocol.
t_1 = # bits, out of the first 1.25k bits, sent by Alice (well defined due to the pre-determined order of turns)
Case 1 (Alice sends additional bits to correct the first edge) corresponds to t_1 ≥ k + 0.5 log k
Case 2 (Bob sends the next edge) corresponds to t_1 < k + 0.5 log k

18
Bounding CC_ε(PJG) - Why is the actual proof challenging?
After the exchange of the first 1.25k bits, we "voluntarily" reveal the first edge to Bob. The players now play a new PJG of depth d-1. We need to show that sending the first edge of the new PJG also costs k + Ω(log k).
Challenge: in the new PJG, some information about the players' inputs may already be known
– How do we measure the players' progress?

20
Simulation
Parameters (same):
– k = O(1)
– ε = log k / k^2
Given a communication protocol P, we simulate P over the ε-noisy channel using a recursive protocol:
– The basic step simulates k steps of P
– The i-th inductive step simulates k^(i+1) steps of P
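Since level i covers k^(i+1) steps of P, the recursion depth needed for an n-step protocol is only logarithmic in n; a small sketch (the function name is ours):

```python
def levels_needed(n: int, k: int) -> int:
    """Smallest inductive level i whose step covers at least n steps of P
    (the i-th inductive step simulates k**(i+1) steps)."""
    i = 0
    while k ** (i + 1) < n:
        i += 1
    return i
```

For example, with k = 10 a million-step protocol needs only the fifth inductive level.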

21
Simulating Protocol - Basic Step
– Players run k steps of P. Alice observes transcript T_a, and Bob transcript T_b
– Players run an O(log k)-bit consistency check of T_a, T_b using hash functions, each bit sent many times
– A player that finds an inconsistency starts over and removes this step's bits from his transcript
[Figure: k bits of protocol P, then an O(log k)-bit consistency check; on inconsistency, start over]
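The consistency check only needs a short fingerprint of each transcript: equal transcripts always agree, and differing ones collide with probability about 2^-bits. A sketch using a fixed cryptographic hash in place of the protocol's shared-randomness hash family (an assumption for illustration; the names are ours):

```python
import hashlib

def transcript_hash(transcript: str, salt: str, bits: int) -> int:
    """A bits-long fingerprint of the transcript, keyed by a shared salt."""
    digest = hashlib.sha256((salt + ":" + transcript).encode()).digest()
    return int.from_bytes(digest, "big") % (2 ** bits)

def consistent(t_a: str, t_b: str, salt: str, bits: int = 64) -> bool:
    """Do Alice's and Bob's transcripts hash to the same value?"""
    return transcript_hash(t_a, salt, bits) == transcript_hash(t_b, salt, bits)
```

In the actual protocol each hash bit is itself sent many times so that the check survives the channel noise.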

22
Simulating Protocol - Inductive Step
– Players run the Basic Step k consecutive times. Alice observes transcript T_a, and Bob transcript T_b. (Players may go out of sync, but due to the alternating turns they know who should speak next)
– Players run an O(log^2 k)-bit consistency check of T_a, T_b using hash functions, each bit sent many times
– A player that finds an inconsistency starts over and removes this step's bits from his transcript

23
Analysis: Correctness
The final protocol simulates P with probability 1-o(1):
– If an error occurred or the players went out of sync, they will eventually fix it, as the consistency check covers the whole transcript so far and is run with larger and larger parameters

24
Analysis: Waste in Basic Step (ε = log k / k^2)
[Figure: k bits of protocol P followed by an O(log k)-bit consistency check; on inconsistency, start over]

25
Analysis: Waste in First Inductive Step (ε = log k / k^2)
– Length of consistency check: O(log^2 k) bits
– Probability to start over: ≪ O(1/k^10) (the probability of an undetected error in one of the k Basic Steps)
– Total waste (in expectation): O(log^2 k) + O(1/k^10) · O(k^2) = O(log^2 k) bits
– Fraction of bits wasted: O(log^2 k / k^2) ≪ O(log k / k), negligible compared to the basic step!
– Waste in the next inductive steps is even smaller
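The arithmetic on this slide can be checked numerically (our own sketch, the function name is ours):

```python
import math

def waste_fractions(k: int):
    """Fraction of bits spent on consistency checks at each level."""
    basic = math.log2(k) / k                # O(log k) check per k-bit block
    inductive = math.log2(k) ** 2 / k ** 2  # O(log^2 k) check per k^2-bit block
    return basic, inductive
```

For k = 1024 the basic-step fraction is about 1%, while the first inductive step wastes under 0.01% of its bits, so the higher levels are indeed negligible.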

26
Thank You!
