CS 154, Lecture 6: Communication Complexity



Communication Complexity A model capturing one aspect of distributed computing. Here we focus on two parties: Alice and Bob. Function f : {0,1}* × {0,1}* → {0,1} Two inputs, 𝑥 ∈ {0,1}* and 𝑦 ∈ {0,1}*. We assume |𝑥| = |𝑦| = 𝑛; think of 𝑛 as HUGE. Alice only knows 𝑥, Bob only knows 𝑦. Goal: Compute f(𝑥, 𝑦) while communicating as few bits as possible between Alice and Bob. We do not count computation cost; we only care about the number of bits communicated.

Alice and Bob Have a Conversation Example run: A(x, ε) = 0; B(y, 0) = 1; A(x, 01) = 1; B(y, 011) = 0; A(x, 0110) = STOP, so f(x,y) = 0. In every step: a bit is sent, which is a function of the party’s input and all the bits communicated so far. Communication cost = number of bits communicated = 4 (in the example). We assume Alice and Bob alternate in communicating, and the last bit sent is 𝑓(x,y). More sophisticated models separate the number of rounds from the number of bits.

Def. A protocol for a function f is a pair of functions A, B : {0,1}* × {0,1}* → {0, 1, STOP} with the semantics: On input (𝑥, 𝑦), let 𝑟 := 0, b_0 = ε. While (b_r ≠ STOP), 𝑟++: If 𝑟 is odd, Alice sends b_r = A(𝑥, b_1 ⋯ b_{r−1}); else Bob sends b_r = B(𝑦, b_1 ⋯ b_{r−1}). Output b_{r−1}. Number of rounds = 𝑟 − 1.
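These semantics can be sketched as a tiny simulator (a hedged sketch, not from the lecture): A and B map (own input, transcript so far) to "0", "1", or "STOP", and the parties alternate until STOP.

```python
def run_protocol(A, B, x, y):
    """Alternate Alice (odd rounds) and Bob (even rounds) until STOP.
    Returns (output bit, number of bits communicated)."""
    transcript = ""
    r = 0
    while True:
        r += 1
        if r % 2 == 1:
            b = A(x, transcript)   # odd round: Alice speaks
        else:
            b = B(y, transcript)   # even round: Bob speaks
        if b == "STOP":
            # output is b_{r-1}, the last bit sent before STOP
            return transcript[-1], len(transcript)
        transcript += b
```

For instance, a two-round protocol where Alice sends her parity, Bob replies with the answer, and Alice then stops, costs 2 bits under this simulator.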

Def. The cost of a protocol P for f on 𝑛-bit strings is max over 𝑥, 𝑦 ∈ {0,1}^𝑛 of [number of rounds in P to compute f(𝑥,𝑦)]. The communication complexity of f on 𝑛-bit strings is the minimum cost over all protocols for f on 𝑛-bit strings = the minimum number of rounds used by any protocol that computes f(𝑥,𝑦), over all 𝑛-bit 𝑥, 𝑦.

Example. Let f : {0,1}* × {0,1}* → {0,1} be arbitrary. There is always a “trivial” protocol: Alice sends the bits of her 𝑥 in odd rounds; Bob sends the bits of his 𝑦 in even rounds. After 2𝑛 rounds, they both know each other’s input! The communication complexity of every f is at most 2𝑛.
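A sketch of the trivial protocol, under the assumption that each party can evaluate f locally once it knows both inputs:

```python
def trivial_protocol(f, x, y):
    """Exchange everything: works for any f, at cost 2n bits."""
    transcript = []
    for i in range(len(x)):
        transcript.append(x[i])  # odd round: Alice sends x_i
        transcript.append(y[i])  # even round: Bob sends y_i
    # After 2n rounds both sides have seen the full transcript,
    # hence both know (x, y) and can evaluate f themselves.
    return f(x, y), len(transcript)
```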

Example: PARITY(𝑥,𝑦) = (Σ_i 𝑥_i + Σ_i 𝑦_i) mod 2. What’s a good protocol for computing PARITY? Alice sends b_1 = (Σ_i 𝑥_i mod 2). Bob sends b_2 = (b_1 + Σ_i 𝑦_i) mod 2. Alice stops. The communication complexity of PARITY is 2.
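The two-bit protocol can be written out directly (a sketch; inputs taken as lists of bits):

```python
def parity_protocol(x, y):
    b1 = sum(x) % 2           # Alice sends the parity of her bits
    b2 = (b1 + sum(y)) % 2    # Bob folds in his parity; b2 = PARITY(x, y)
    return b2, 2              # output, and the 2 bits communicated
```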

Example: MAJORITY(𝑥,𝑦) = most frequent bit in 𝑥𝑦. What’s a good protocol for computing MAJORITY? Alice sends N_x = the number of 1s in 𝑥, using O(log 𝑛) bits. Bob computes N_y = the number of 1s in 𝑦, and sends 1 iff N_x + N_y is greater than (|x|+|y|)/2 = 𝑛. Communication complexity of MAJORITY is O(log 𝑛).
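A sketch of this counting protocol, assuming the count N_x is encoded in binary (hence the ceil(log2(n+1)) bits for a number in [0, n]):

```python
from math import ceil, log2

def majority_protocol(x, y):
    n = len(x)                               # |x| = |y| = n
    Nx = sum(x)                              # Alice's count of 1s
    count_bits = max(1, ceil(log2(n + 1)))   # bits to send a number in [0, n]
    Ny = sum(y)                              # Bob's count of 1s
    out = 1 if Nx + Ny > n else 0            # majority bit of the 2n bits of xy
    return out, count_bits + 1               # Alice's count + Bob's answer bit
```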

Example: EQUALS(𝑥,𝑦) = 1 ⇔ 𝑥 = 𝑦 What’s a good protocol for computing EQUALS ??? Communication complexity of EQUALS is at most 2𝑛

Connection to Streaming and DFAs Let 𝐿 ⊆ {0,1}*. Def. 𝑓_𝐿 : {0,1}* × {0,1}* → {0,1} for 𝑥,𝑦 with |𝑥|=|𝑦| as: 𝑓_𝐿(𝑥,𝑦) = 1 ⇔ 𝑥𝑦 ∈ 𝐿. Examples: 𝐿 = { x | x has an odd number of 1s } ⇒ 𝑓_𝐿(𝑥,𝑦) = PARITY(x,y) = (Σ_i 𝑥_i + Σ_i 𝑦_i) mod 2. 𝐿 = { x | x has more 1s than 0s } ⇒ 𝑓_𝐿(𝑥,𝑦) = MAJORITY(x,y). 𝐿 = { xx | x ∈ {0,1}* } ⇒ 𝑓_𝐿(𝑥,𝑦) = EQUALS(x,y).

Connection to Streaming and DFAs Let 𝐿 ⊆ {0,1}*. Def. 𝑓_𝐿 : {0,1}* × {0,1}* → {0,1} for 𝑥,𝑦 with |𝑥|=|𝑦| as: 𝑓_𝐿(𝑥,𝑦) = 1 ⇔ 𝑥𝑦 ∈ 𝐿. Theorem: If 𝐿 has a streaming algorithm using ≤ 𝑠 space, then the comm. complexity of 𝑓_𝐿 is at most O(𝑠). Proof: Alice runs the streaming algorithm A on 𝑥, then sends the memory content of A: this is 𝑠 bits. Bob starts up A with that memory content and runs A on 𝑦. He gets an output bit and sends it to Alice.
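The proof is just “ship the memory.” A sketch, using a one-bit running-parity automaton as a stand-in streaming algorithm (my choice for illustration, not from the lecture):

```python
def parity_step(state, bit):
    return state ^ bit                 # 1-bit memory: running parity

def split_stream_protocol(step, init, x, y):
    state = init
    for b in x:                        # Alice streams her half
        state = step(state, b)
    # --- Alice sends `state` to Bob: O(s) bits for an s-bit memory ---
    for b in y:                        # Bob resumes on his half
        state = step(state, b)
    return state                       # Bob sends the output bit back
```

Any streaming algorithm drops in for `step`; the communication cost is the memory size plus one output bit.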

Connection to Streaming and DFAs Let 𝐿 ⊆ {0,1}*. Def. 𝑓_𝐿(𝑥,𝑦) = 1 ⇔ 𝑥𝑦 ∈ 𝐿. Theorem: If 𝐿 has a streaming algorithm using ≤ 𝑠 space, then the comm. complexity of 𝑓_𝐿 is at most O(𝑠). Corollary: For every regular language 𝐿, the comm. complexity of 𝑓_𝐿 is O(1). Example: the comm. complexity of PARITY is O(1). Corollary: the comm. complexity of MAJORITY is O(log n), because there’s a streaming algorithm for {x : x has more 1s than 0s} with O(log n) space. What about the comm. complexity of EQUALS?

Communication Complexity of EQUALS Theorem: The comm. complexity of EQUALS is Θ(𝑛). In particular, every protocol for EQUALS needs ≥ 𝑛 bits of communication. No communication protocol can do much better than “send your whole input”! Corollary: 𝐿 = {ww | w ∈ {0,1}*} is not regular. Moreover, every streaming algorithm for 𝐿 needs ≥ 𝑐𝑛 bits of memory, for some constant 𝑐 > 0.

Communication Complexity of EQUALS Theorem: The comm. complexity of EQUALS is Θ(𝑛). In particular, every protocol for EQUALS needs ≥ 𝑛 bits of communication. Idea: Consider all possible ways A & B can communicate. Definition: The communication pattern of a protocol on inputs (𝑥,𝑦) is the sequence of bits that Alice & Bob send. (In the earlier example conversation, the pattern was 0110.)

Communication Complexity of EQUALS Theorem: The communication complexity of EQUALS is Θ(𝑛). In particular, every protocol for EQUALS needs ≥ 𝑛 bits of communication. Proof: By contradiction. Suppose the CC of EQUALS is ≤ 𝑛−1. Then there are ≤ 2^{𝑛−1} possible communication patterns of that protocol, over all pairs of inputs (𝑥,𝑦) with 𝑛 bits each. Claim: There are 𝑥 ≠ 𝑦 such that on (𝑥,𝑥) and on (𝑦,𝑦), the protocol uses the same pattern 𝑃. (There are 2^𝑛 inputs of the form (𝑧,𝑧) but at most 2^{𝑛−1} patterns, so by pigeonhole two of them share a pattern.) Now, what is the communication pattern on (𝑥,𝑦)? This pattern is also 𝑃: Alice’s messages depend only on 𝑥 and the bits sent so far, so she speaks exactly as on (𝑥,𝑥); Bob’s depend only on 𝑦, so he speaks exactly as on (𝑦,𝑦). So Alice & Bob output the same bit on (𝑥,𝑦) and (𝑥,𝑥). But EQUALS(𝑥,𝑦) = 0 and EQUALS(𝑥,𝑥) = 1. Contradiction!

Randomized Protocols Help! EQUALS needs ≥ 𝑐𝑛 bits of communication, but… Theorem: For all 𝑥,𝑦 of 𝑛 bits each, there is a randomized protocol for EQUALS(𝑥,𝑦) using only O(log 𝑛) bits of communication, which works with probability 99.9%! Use Error Correcting Codes … E.g.: Alice picks a random prime number p between 2 and n². She sends p along with her string x, read as an n-bit integer, modulo p. Each is a number less than n², so this takes O(log n) bits to send. Bob checks whether y ≡ x (mod p) and sends the output bit. Why does it work (with high probability)?
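A hedged sketch of this fingerprinting protocol (the trial-division primality test is just for illustration). It errs only when the random prime p happens to divide x − y, and a number below 2^n has fewer than n prime factors, while there are many more primes below n².

```python
import random

def is_prime(m):
    return m > 1 and all(m % d for d in range(2, int(m ** 0.5) + 1))

def randomized_equals(x_bits, y_bits):
    n = len(x_bits)
    # Alice picks a random prime p between 2 and n^2 ...
    p = random.choice([m for m in range(2, n * n + 1) if is_prime(m)])
    # ... and sends p together with (x mod p); both are < n^2,
    # so O(log n) bits suffice.
    fingerprint = int(x_bits, 2) % p
    # Bob checks his y against the fingerprint and sends the answer bit.
    return int(y_bits, 2) % p == fingerprint
```

Equal strings are always accepted; unequal strings are rejected with high probability (and always, in the special case below, since no prime divides x − y = 1).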

Summary Communication complexity is a powerful tool (we have seen just a tiny demonstration). Communication complexity, streaming algorithms, and regular languages are connected. Randomness can be a useful resource for computation. Questions?