Information Complexity Lower Bounds


Information Complexity Lower Bounds
Rotem Oshman, Princeton CCI
Based on: Bar-Yossef, Jayram, Kumar, Srinivasan ’04; Braverman, Barak, Chen, Rao ’10

Communication Complexity
Alice holds X, Bob holds Y, and they want to compute f(X, Y) = ?
[Yao ’79, “Some complexity questions related to distributive computing”]

Communication Complexity
Applications: circuit complexity, streaming algorithms, data structures, distributed computing, property testing, …

Deterministic Protocols
A protocol Π specifies, at each point:
which player speaks next;
what that player should say;
when to halt and what to output.
Formally, Π: {0,1}* → {A, B, ⊥} × {0,1}: the argument is what has been said so far; the first component is who speaks next (A = Alice, B = Bob, ⊥ = halt); the second is what to say (or what to output).

Randomized Protocols
The players can use randomness to decide what to say.
Private randomness: each player has a separate source of random bits.
Public randomness: both players can use the same random bits.
Goal: for every (X, Y), compute f(X, Y) correctly with probability ≥ 1 − ε.
Communication complexity: the worst-case length of the transcript, over all executions.

Randomness Can Help a Lot
Example: Equality(X, Y).
Input: X, Y ∈ {0,1}^n. Output: is X = Y?
Trivial protocol: Alice sends X to Bob (n bits). For deterministic protocols, this is optimal!

Equality Lower Bound
The communication matrix of Equality has rows and columns indexed by 0^n, …, 1^n, with 1s exactly on the diagonal X = Y. A deterministic protocol partitions the matrix into at most 2^|transcript| monochromatic rectangles, and no rectangle can contain two diagonal 1s, so the transcript must have length ≥ n.

Randomized Protocol
Protocol for Equality with public randomness:
Select a random Z ∈ {0,1}^n.
Alice sends ⟨X, Z⟩ = Σ_{i=1}^n X_i Z_i mod 2.
Bob accepts iff ⟨Y, Z⟩ = ⟨X, Z⟩.
If X = Y: always accept.
If X ≠ Y: ⟨X, Z⟩ + ⟨Y, Z⟩ mod 2 = ⟨X + Y, Z⟩ mod 2, and X + Y is a non-zero vector, so the players reject with probability 1/2.
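The protocol is easy to simulate. A minimal Python sketch (a local simulation, not an actual two-party implementation; the function name and the repetition count are mine; repeating the one-bit test drives the one-sided error down to 2^(−reps)):

```python
import random

def equality_protocol(x, y, reps=40):
    """Public-randomness Equality test on bit vectors x, y of length n.

    Each repetition draws a public random z in {0,1}^n; Alice's one-bit
    message is <x,z> mod 2, and Bob compares it with <y,z> mod 2.
    If x == y the bits always agree; if x != y they disagree with
    probability 1/2 per repetition (x + y is a non-zero vector).
    """
    n = len(x)
    for _ in range(reps):
        z = [random.randrange(2) for _ in range(n)]
        alice_bit = sum(xi * zi for xi, zi in zip(x, z)) % 2
        bob_bit = sum(yi * zi for yi, zi in zip(y, z)) % 2
        if alice_bit != bob_bit:
            return False  # certainly x != y
    return True  # x == y, except with probability 2**(-reps)
```

Total communication is one bit per repetition, versus the n bits of the optimal deterministic protocol.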

Set Disjointness
Input: X, Y ⊆ {1, …, n}. Output: is X ∩ Y = ∅?
Theorem [Kalyanasundaram-Schnitger ’92, Razborov ’92]: the randomized communication complexity is Ω(n). (For deterministic protocols this is easy to see.)
Today we’ll see a proof by Bar-Yossef, Jayram, Kumar, Srinivasan ’04.

Application: Streaming Lower Bounds
A streaming algorithm sees the data in one pass: how much space is required to approximate f(data)?
Example: how many distinct items are in the data?
Reduction from Disjointness [Alon, Matias, Szegedy ’99].

Reduction from Disjointness:
Fix a streaming algorithm for Distinct Elements with space S and universe size n. Construct a protocol for Disjointness on n elements:
Alice feeds X = {x_1, …, x_k} into the algorithm, then sends its state and |X| to Bob (S + log n bits); Bob continues the stream with Y = {y_1, …, y_ℓ}.
Then X ∩ Y = ∅ ⇔ the number of distinct elements in X ∪ Y is |X| + |Y|.
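In code, the reduction looks as follows (a sketch under stated assumptions: `ExactDistinct` is a toy stand-in for a real distinct-elements sketch, and the names `make_sketch`, `process`, and `count` are hypothetical, not an API from the slides):

```python
class ExactDistinct:
    """Toy exact distinct-count 'sketch' (large space; any real
    distinct-elements streaming algorithm with space S would do)."""
    def __init__(self):
        self.seen = set()
    def process(self, item):
        self.seen.add(item)
    def count(self):
        return len(self.seen)

def disjointness_via_distinct_elements(X, Y, make_sketch=ExactDistinct):
    """Protocol for Disjointness from a distinct-elements algorithm:
    Alice streams her set X, sends the algorithm's state plus |X| to Bob
    (S + log n bits); Bob continues the stream with Y and checks whether
    the number of distinct elements in X u Y equals |X| + |Y|."""
    sketch = make_sketch()
    for x in X:            # Alice's pass
        sketch.process(x)
    # (the sketch's state and |X| are "sent" to Bob at this point)
    for y in Y:            # Bob's pass
        sketch.process(y)
    return sketch.count() == len(X) + len(Y)   # True iff X, Y are disjoint
```

So a space-S streaming algorithm yields an (S + log n)-bit protocol, and the Ω(n) bound on Disjointness transfers to the streaming algorithm's space.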

Application 2: KW Games
Circuit depth lower bounds: given a circuit of ∧/∨ gates over inputs x_1, …, x_n computing f(x_1, …, x_n), how deep does the circuit need to be?

Application 2: KW Games
Karchmer-Wigderson ’93, Karchmer-Raz-Wigderson ’94: Alice gets X with f(X) = 0, Bob gets Y with f(Y) = 1; the goal is to find an i such that X_i ≠ Y_i.

Application 2: KW Games
Claim: if KW_f has deterministic CC ≥ d, then f requires circuit depth ≥ d.
A circuit of depth d gives a protocol of length d: walk down from the output gate, maintaining a gate that is 0 on X and 1 on Y; at an ∨ gate Bob names a child that is 1 on Y, at an ∧ gate Alice names a child that is 0 on X; the leaf reached is a variable x_i with X_i ≠ Y_i.

Information-Theoretic Lower Bound on Set Disjointness

Some Basic Concepts from Info Theory
Entropy of a random variable: H(X) = −Σ_x Pr[X = x] log Pr[X = x].
Important properties:
H(X) ≥ 0;
H(X) = 0 ⇒ X is deterministic;
H(X) = expected number of bits needed to encode X.
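The definition translates directly into a few lines of Python (the helper name is mine; distributions are dicts mapping outcomes to probabilities, and the 0·log 0 = 0 convention is handled by skipping zero-probability outcomes):

```python
from math import log2

def entropy(dist):
    """H(X) = -sum_x Pr[X=x] * log2 Pr[X=x], for a distribution
    given as a dict {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)
```

For example, entropy({0: 0.5, 1: 0.5}) is 1.0 (a fair coin takes one bit to encode), while entropy({0: 1.0}) is 0.0 (a deterministic variable).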

Some Basic Concepts from Info Theory
Conditional entropy: H(X | Y) = E_y[H(X | Y = y)].
Important properties:
H(X | Y) ≤ H(X);
H(X | Y) = H(X) ⇒ X, Y are independent.
Example: X, Z ∼ Bernoulli(1/2), independent; if Z = 0 then Y = X, if Z = 1 then Y = 1 − X.
H(X) = −(1/2) log(1/2) − (1/2) log(1/2) = 1;
H(X | Y, Z) = (1/2) H(X | X) + (1/2) H(X | 1 − X) = 0;
H(X | Y) = 1.

Some Basic Concepts from Info Theory
Mutual information: I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X).
Conditional mutual information: I(X; Y | Z) = H(X | Z) − H(X | Y, Z) = H(Y | Z) − H(Y | X, Z).
Important properties:
I(X; Y) ≥ 0;
I(X; Y) = 0 ⇒ X, Y are independent.
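These identities give the equivalent form I(X;Y) = H(X) + H(Y) − H(X,Y), which is the easiest to compute. A small sketch (the function name is mine; joint distributions are dicts keyed by (x, y) pairs), checked on the one-time-pad example from the previous slide:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), for a joint distribution
    given as a dict {(x, y): probability}."""
    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return H(px) + H(py) - H(joint)

# One-time pad: X, Z iid Bernoulli(1/2), Y = X xor Z.
# Y alone reveals nothing about X, but (Y, Z) together reveal X completely.
pad_only = {(x, x ^ z): 0.25 for x in (0, 1) for z in (0, 1)}
pad_with_key = {(x, (x ^ z, z)): 0.25 for x in (0, 1) for z in (0, 1)}
```

Here mutual_information(pad_only) is 0 and mutual_information(pad_with_key) is 1, matching H(X | Y) = 1 and H(X | Y, Z) = 0 above.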

Some Basic Concepts from Info Theory
Chain rule for mutual information: I(X_1, X_2; Y) = I(X_1; Y) + I(X_2; Y | X_1).
More generally, I(X_1, …, X_k; Y) = Σ_{i=1}^k I(X_i; Y | X_1, …, X_{i−1}).
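The chain rule is easy to verify numerically. Below is a sketch (the helper names `H` and `I` are mine): a generic conditional-mutual-information routine over joint distributions on tuples, used to check I(X1,X2;Y) = I(X1;Y) + I(X2;Y | X1) on a random joint distribution:

```python
from math import log2
import itertools
import random

def H(dist):
    """Shannon entropy of a distribution {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def I(joint, a, b, c=()):
    """Conditional mutual information I(A; B | C), where `joint` is a
    distribution over tuples and a, b, c are tuples of coordinate
    indices.  Uses I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)."""
    def marginal(idx):
        m = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in idx)
            m[key] = m.get(key, 0.0) + p
        return m
    a, b, c = tuple(a), tuple(b), tuple(c)
    return (H(marginal(a + c)) + H(marginal(b + c))
            - H(marginal(a + b + c)) - H(marginal(c)))

# Chain rule on a random joint distribution over (X1, X2, Y):
random.seed(0)
outcomes = list(itertools.product((0, 1), repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}
lhs = I(joint, (0, 1), (2,))                             # I(X1,X2; Y)
rhs = I(joint, (0,), (2,)) + I(joint, (1,), (2,), (0,))  # I(X1;Y) + I(X2;Y|X1)
assert abs(lhs - rhs) < 1e-9
```

The same routine handles any grouping of coordinates, which is exactly how the chain rule is used in the direct-sum argument later.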

Information Cost of Protocols
Fix an input distribution μ on (X, Y). Given a protocol Π, let Π also denote the distribution of Π’s transcript.
Information cost of a protocol: IC(Π) = I(Π; Y | X) + I(Π; X | Y).
Information cost of a function f: IC_ε(f) = inf over protocols Π that solve f with error ≤ ε of IC(Π).

Information Cost of Protocols
Important property: IC(Π) ≤ |Π|.
Proof: by induction. Write Π = Π_1 … Π_d; we show for all r ≤ d: I(Π_{≤r}; Y | X) + I(Π_{≤r}; X | Y) ≤ r.
By the chain rule,
I(Π_{≤r}; Y | X) + I(Π_{≤r}; X | Y)   [what we know after r rounds]
= I(Π_{<r}; Y | X) + I(Π_{<r}; X | Y)   [what we knew after r−1 rounds]
+ I(Π_r; Y | X, Π_{<r}) + I(Π_r; X | Y, Π_{<r})   [what we learn in round r, given what we already know]

Information vs. Communication
Want: I(Π_r; Y | X, Π_{<r}) + I(Π_r; X | Y, Π_{<r}) ≤ 1.
Suppose Π_r is sent by Alice.
What does Alice learn? Π_r is a function of Π_{<r} and X (and Alice’s private randomness, which is independent of Y), so I(Π_r; Y | X, Π_{<r}) = 0.
What does Bob learn? I(Π_r; X | Y, Π_{<r}) ≤ H(Π_r) ≤ |Π_r| = 1.

Information vs. Communication
Important property: IC(Π) ≤ |Π|.
So a lower bound on information cost gives a lower bound on communication complexity.
In fact, IC lower bounds are the most powerful technique we know.

Information Complexity of Disj.
Disjointness: is X ∩ Y = ∅? Disj(X, Y) = ⋁_{i=1}^n (X_i ∧ Y_i).
Strategy: for some “hard distribution” μ,
1. Direct sum: IC_{μ^n}(Disj) ≥ n · IC_μ(And).
2. Prove that IC_μ(And) ≥ Ω(1).

Hard Distribution for Disjointness
For each coordinate i ∈ [n], independently: (X_i, Y_i) is uniform over {(0,0), (0,1), (1,0)}, each with probability 1/3; the pair (1,1) has probability 0.

𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛 Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– and run Ξ  For each 𝑗≠𝑖, 𝑋 𝑗 ∧ π‘Œ 𝑗 =0 β‡’ Disj 𝑋,π‘Œ =And 𝑋 𝑖 , π‘Œ 𝑖 𝑋 π‘Œ π‘ˆ 𝑉

𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛 Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Bad idea: publicly sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– 𝑋 Suppose in Ξ , Alice sends 𝑋 1 βŠ•β€¦βŠ• 𝑋 𝑛 . In Ξ , Bob learns one bit β‡’ in Ξ  β€² he should learn 1/𝑛 bit But if 𝑋 βˆ’π‘– is public Bob learns 1 bit about π‘ˆ! π‘ˆ π‘Œ 𝑉

𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛 Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Another bad idea: publicly sample 𝑋 βˆ’π‘– , Bob privately samples π‘Œ βˆ’π‘– given 𝑋 βˆ’π‘– But the players can’t sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– independently…

𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛 Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Publicly sample 𝑋 1 ,…, 𝑋 π‘–βˆ’1 Privately sample 𝑋 (𝑖+1) ,…, 𝑋 𝑛 𝑋 π‘ˆ Privately sample π‘Œ 1 ,…, π‘Œ π‘–βˆ’1 Publicly sample π‘Œ 𝑖+1 ,…, π‘Œ 𝑛 π‘Œ 𝑉

Direct Sum Theorem
Transcript of Π′ = (i, X_{<i}, Y_{>i}, Π). Need to show:
I_μ(Π′; V | U) + I_μ(Π′; U | V) ≤ [I_{μ^n}(Π; Y | X) + I_{μ^n}(Π; X | Y)] / n.
For the first term:
I_μ(Π′; V | U) = I_{μ^n}(i, X_{<i}, Y_{>i}, Π; Y_i | X_i)
= I_{μ^n}(Π; Y_i | X_{≤i}, Y_{>i}, i) + I_{μ^n}(i, X_{<i}, Y_{>i}; Y_i | X_i)   [second term = 0]
≤ I(Π, X_{>i}; Y_i | X_{≤i}, Y_{>i}, i)
= I(X_{>i}; Y_i | X_{≤i}, Y_{>i}, i) + I(Π; Y_i | X, Y_{>i}, i)   [first term = 0]
= (1/n) Σ_{i=1}^n I(Π; Y_i | X, Y_{>i})   [averaging over the public i]
= I(Π; Y | X) / n   [chain rule].

Information Complexity of Disj.
Disjointness: is X ∩ Y = ∅? Disj(X, Y) = ⋁_{i=1}^n (X_i ∧ Y_i).
Strategy: for some “hard distribution” μ,
1. Direct sum: IC_{μ^n}(Disj) ≥ n · IC_μ(And). ✓
2. Prove that IC_μ(And) ≥ Ω(1).

Hardness of And
IC(And) = I(Π; Y | X) + I(Π; X | Y)
= (2/3) I(Π; Y | X=0) + (1/3) I(Π; Y | X=1) + (2/3) I(Π; X | Y=0) + (1/3) I(Π; X | Y=1).
Under μ, X=1 forces Y=0 and Y=1 forces X=0, so the two (1/3)-terms are 0.
Intuitively, the transcript on input (1,1) should be “very different” from the transcripts on the three support inputs (0,0), (0,1), (1,0), each of probability 1/3.

Hellinger Distance
h²(P, Q) = 1 − Σ_ω √(P(ω) Q(ω)).
Examples:
h²(P, P) = 1 − Σ_ω (√P(ω))² = 1 − Σ_ω P(ω) = 0.
If P, Q have disjoint support, h(P, Q) = 1.
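A direct transcription of the definition (the function name is mine; distributions as dicts, returning h rather than h²; the `max(0.0, …)` only guards against floating-point rounding):

```python
from math import sqrt

def hellinger(P, Q):
    """h(P,Q), where h^2(P,Q) = 1 - sum_w sqrt(P(w) * Q(w)), for
    distributions given as dicts {outcome: probability}."""
    bc = sum(sqrt(P.get(w, 0.0) * Q.get(w, 0.0)) for w in set(P) | set(Q))
    return sqrt(max(0.0, 1.0 - bc))  # clamp tiny negatives from rounding
```

As in the examples above, hellinger(P, P) is 0, and distributions with disjoint support are at distance 1.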

Hellinger Distance
Hellinger distance is a metric:
h(P, Q) ≥ 0, with equality iff P = Q;
h(P, Q) = h(Q, P);
triangle inequality: h(P, Q) ≤ h(P, R) + h(R, Q).

Hellinger Distance
If for some ω we have |P(ω) − Q(ω)| = δ, then h²(P, Q) ≥ δ²/2.
In particular, for a protocol for And with error ε, the probability of accepting differs by δ ≥ 1 − 2ε between the transcript distributions Π_00 and Π_11, so h(Π_00, Π_11) ≥ (1 − 2ε)/√2; e.g., h ≥ 2/(3√2) for ε = 1/6.

Hellinger Distance vs. Mutual Info
Let P_0, P_1 be two distributions. Select Z by choosing J ∼ Bernoulli(1/2), then drawing Z ∼ P_J. Then I(Z; J) ≥ h²(P_0, P_1).
Applied to And under μ (inputs 00, 01, 10 with probability 1/3 each):
I(Π; Y | X=0) ≥ h²(Π_00, Π_01) and I(Π; X | Y=0) ≥ h²(Π_00, Π_10).
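The inequality I(Z; J) ≥ h²(P_0, P_1) can be sanity-checked numerically. A sketch (the function name is mine) that returns both sides for given P_0, P_1:

```python
from math import log2, sqrt

def info_and_hellinger2(P0, P1):
    """Return (I(Z;J), h^2(P0,P1)), where J ~ Bernoulli(1/2) and Z ~ P_J.
    P0, P1 are distributions given as dicts {outcome: probability}."""
    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)
    # Joint distribution of (J, Z) and the marginal of Z:
    joint = {(j, z): 0.5 * P[z] for j, P in ((0, P0), (1, P1)) for z in P}
    pz = {}
    for (_, z), p in joint.items():
        pz[z] = pz.get(z, 0.0) + p
    info = H(pz) + 1.0 - H(joint)  # I(Z;J) = H(Z) + H(J) - H(Z,J), H(J) = 1
    h2 = 1.0 - sum(sqrt(P0.get(z, 0.0) * P1.get(z, 0.0)) for z in pz)
    return info, h2
```

On P0 = {0: 0.9, 1: 0.1} and P1 = {0: 0.1, 1: 0.9} this gives I(Z; J) ≈ 0.53 versus h² = 0.4, consistent with the bound.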

Hardness of And
(Figure: the input square 00, 01, 10, 11; the inputs 00, 01, 10 each have probability 1/3, and h(Π_00, Π_11) ≥ 2/(3√2).)
Inputs 01 and 11 differ only in Alice’s input: the transcript is the same for Bob until Alice acts differently. Inputs 10 and 11 differ only in Bob’s input: the transcript is the same for Alice until Bob acts differently.

“Cut-n-Paste Lemma”
h(Π_00, Π_11) = h(Π_01, Π_10).
Recall: h²(Π_XY, Π_X′Y′) = 1 − Σ_t √(Π_XY(t) Π_X′Y′(t)).
Enough to show: we can write Π_XY(t) = q_A(t, X) · q_B(t, Y). Then
h²(Π_00, Π_11) = 1 − Σ_t √(q_A(t,0) q_B(t,0) q_A(t,1) q_B(t,1))
= 1 − Σ_t √(q_A(t,0) q_B(t,1) q_A(t,1) q_B(t,0)) = h²(Π_01, Π_10).

“Cut-n-Paste Lemma”
Claim: we can write Π_XY(t) = q_A(t, X) · q_B(t, Y).
Proof: Π induces a distribution on “partial transcripts” of each length k:
Π_XY^k(t) = probability that the first k bits of the transcript are t.
By induction on k: Π_XY^k(t) = q_A^k(t, X) · q_B^k(t, Y).
Base case (k = 0, empty transcript ε): Π_XY^0(ε) = 1; set q_A^0(ε, X) = q_B^0(ε, Y) = 1.

“Cut-n-Paste Lemma”
Step: Π_XY^{k+1}(t) = Π_XY^k(t_{≤k}) · Pr[next bit = t_{k+1}].
Suppose after t_{≤k} it is Alice’s turn to speak. What Alice says depends only on: her input, her private randomness, and the transcript so far, t_{≤k}.
So Pr[next bit = t_{k+1}] = f(t_{≤k}, X, t_{k+1}) = f(t, X).
Set q_A^{k+1}(t, X) = q_A^k(t_{≤k}, X) · f(t, X) and q_B^{k+1}(t, Y) = q_B^k(t_{≤k}, Y).

Hardness of And
IC(And) = I(Π; Y | X) + I(Π; X | Y) = (2/3) I(Π; Y | X=0) + (2/3) I(Π; X | Y=0)
≥ const · (h²(Π_00, Π_01) + h²(Π_00, Π_10))
≥ const′ · (h(Π_00, Π_01) + h(Π_00, Π_10))²
≥ const′ · h²(Π_01, Π_10)   [triangle inequality]
= const′ · h²(Π_00, Π_11)   [cut-and-paste lemma]
≥ Ω(1)   [since h(Π_00, Π_11) ≥ 2/(3√2)].

Multi-Player Communication Complexity

The Coordinator Model
f(X_1, …, X_k) = ?
There are k sites, each holding an n-bit input X_1, X_2, …, X_k; the sites communicate with a coordinator.

Multi-Party Set Disjointness
Input: X_1, …, X_k ⊆ [n]. Output: is ⋂ X_i = ∅?
Braverman, Ellen, O., Pitassi, Vaikuntanathan ’13: lower bound of Ω(nk) bits.

Reduction from Disj to graph connectivity
Given X_1, …, X_k, we want to choose vertices V and design inputs E_1, …, E_k such that the graph G = (V, E_1 ∪ … ∪ E_k) is connected iff ⋂ X_i = ∅.

Reduction from Disj to graph connectivity
(Figure: vertices for the players p_1, …, p_k and for the elements 1, …, n; player p_i’s edges are determined by X_i, and the elements of [n] ∖ ⋃ X_i are marked.)
Input graph connected ⇔ ⋃ X_i ≠ [n]; applied to the complements of the sets X_i, this becomes the condition ⋂ X_i ≠ ∅.

Other Stuff Distributed computing

Other Stuff Compressing down to information cost Number-on-forehead lower bounds Open questions in communication complexity