Information complexity and exact communication bounds. April 26, 2013. Mark Braverman, Princeton University. Based on joint work with Ankit Garg, Denis Pankratov, and Omri Weinstein.

Overview: information complexity. Information complexity is to communication complexity as Shannon’s entropy is to transmission cost.

Background – information theory. Shannon (1948) introduced information theory as a tool for studying the communication cost of transmission tasks. [Diagram: Alice and Bob connected by a communication channel.]

Shannon’s entropy. [Diagram: X sent over a communication channel.]

Shannon’s noiseless coding
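The noiseless coding theorem can be checked numerically. A minimal sketch, not from the slides: an optimal prefix-free code for a source X has expected length between H(X) and H(X)+1, and a Huffman code achieves it. The function names `entropy` and `huffman_expected_length` are illustrative choices here.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as probabilities."""
    return -sum(q * log2(q) for q in p if q > 0)

def huffman_expected_length(p):
    """Expected codeword length of a Huffman code for distribution p."""
    # Heap entries: (probability, unique tiebreak id, expected length accumulated so far).
    heap = [(q, i, 0.0) for i, q in enumerate(p)]
    heapq.heapify(heap)
    while len(heap) > 1:
        qa, ia, la = heapq.heappop(heap)
        qb, ib, lb = heapq.heappop(heap)
        # Merging two subtrees adds one bit of depth to every leaf beneath them,
        # increasing the expected length by exactly qa + qb.
        heapq.heappush(heap, (qa + qb, min(ia, ib), la + lb + qa + qb))
    return heap[0][2]

p = [0.5, 0.25, 0.125, 0.125]        # dyadic distribution: Huffman meets entropy exactly
print(entropy(p))                    # 1.75
print(huffman_expected_length(p))    # 1.75
```

For non-dyadic distributions the Huffman length sits strictly inside the [H, H+1) window, which is the content of the theorem.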

Shannon’s entropy – cont’d. [Diagram: communication channel with X and Y.]

A simple example. Easy and complete!

Communication complexity [Yao]. Meanwhile, in a galaxy far far away… Focus on the two-party randomized setting. [Diagram: players A and B with inputs X and Y and shared randomness R compute F(X,Y).]

Communication complexity. The players alternate messages m1(X,R), m2(Y,m1,R), m3(X,m1,m2,R), … Communication cost = number of bits exchanged. [Diagram: A and B with inputs X and Y and shared randomness R compute F(X,Y).]
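A classic concrete instance of such a protocol, sketched here as an illustration rather than taken from the slides: with shared randomness, equality of n-bit strings can be tested with O(1) bits of communication using inner-product hashes. The names `equality_protocol` and the repetition count `k` are assumptions of this sketch.

```python
import random

def inner_product_hash(x_bits, r_bits):
    """One-bit hash: inner product of x and r over GF(2)."""
    return sum(xb & rb for xb, rb in zip(x_bits, r_bits)) % 2

def equality_protocol(x_bits, y_bits, k=20, seed=0):
    """Randomized protocol for EQ: Alice sends k hash bits m_i(X, R);
    Bob compares them with his own hashes and announces the verdict.
    If X != Y, each hash bit disagrees with probability 1/2, so the
    protocol errs with probability 2^-k."""
    rng = random.Random(seed)  # stands in for the shared random string R
    for _ in range(k):
        r = [rng.randint(0, 1) for _ in x_bits]
        if inner_product_hash(x_bits, r) != inner_product_hash(y_bits, r):
            return False  # Bob announces "not equal"
    return True  # all k hashes agreed

x = [1, 0, 1, 1, 0, 0, 1, 0]
print(equality_protocol(x, x))      # True
print(equality_protocol(x, [1]*8))  # False, except with probability 2^-20
```

The point of the example: communication cost k+1 bits is independent of the input length n, which deterministic protocols cannot achieve for EQ.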

Communication complexity. Numerous applications and potential applications (streaming, data structures, circuit lower bounds…). Lower bounds are considerably more difficult to obtain than for transmission (though still much easier than in other models of computation). Many lower-bound techniques exist. Exact bounds??

Communication complexity

Set disjointness and intersection

Information complexity

Basic definition 1: the information cost of a protocol. A and B run protocol π on inputs X and Y. Information cost = (what Alice learns about Y) + (what Bob learns about X); formally, over inputs (X,Y) ~ μ, IC_μ(π) = I(Y; Π | X) + I(X; Π | Y).

Mutual information. [Diagram: Venn diagram of H(A) and H(B), overlapping in I(A;B).]
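The Venn-diagram identity I(A;B) = H(A) + H(B) − H(A,B) is easy to compute directly. A small sketch (the helper names are illustrative):

```python
from math import log2

def H(dist):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * log2(p) for p in dist if p > 0)

def mutual_information(joint):
    """I(A;B) = H(A) + H(B) - H(A,B) for a joint distribution
    given as a dict {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():          # marginalize out each variable
        pa[a] = pa.get(a, 0) + p
        pb[b] = pb.get(b, 0) + p
    return H(pa.values()) + H(pb.values()) - H(joint.values())

# Perfectly correlated bits share one full bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
# Independent bits share none.
print(mutual_information({(a, b): 0.25 for a in (0, 1) for b in (0, 1)}))  # 0.0
```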

Example. A protocol for equality: Alice sends MD5(X) [128 bits]; Bob replies with X=Y? [1 bit]. Its information cost, what Alice learns about Y plus what Bob learns about X, is far below its 129 bits of communication.

Information complexity

Prior-free information complexity

Connection to privacy

Information equals amortized communication

Without priors

Intersection

The two-bit AND

The optimal protocol for AND. X ∈ {0,1}, Y ∈ {0,1}. If X=1, Alice sets A=1; if X=0, she picks A = U[0,1]. If Y=1, Bob sets B=1; if Y=0, he picks B = U[0,1]. A counter rises continuously from 0 to 1: “raise your hand when your number is reached.”
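A discrete simulation of the “raise your hand” protocol, correctness only; it does not measure the information cost, which is the protocol’s actual point. The function name `and_protocol` is an assumption of this sketch.

```python
import random

def and_protocol(x, y, rng):
    """One run of the 'raise your hand' protocol for AND.
    A party with input 1 uses the number 1; a party with input 0 picks a
    uniform number in [0,1). A clock rises from 0 to 1, and the first party
    whose number is reached raises a hand, ending the protocol.
    AND(x, y) = 1 exactly when nobody raises a hand before the clock hits 1."""
    a = 1.0 if x == 1 else rng.random()  # rng.random() is uniform on [0, 1)
    b = 1.0 if y == 1 else rng.random()
    first_raise = min(a, b)
    return 1 if first_raise >= 1.0 else 0

rng = random.Random(42)
for x in (0, 1):
    for y in (0, 1):
        outputs = {and_protocol(x, y, rng) for _ in range(1000)}
        print(x, y, outputs)  # always {x & y}: the protocol has zero error
```

Zero error follows because a 0-input party’s number lands strictly below 1, so a hand is always raised before the clock finishes unless both inputs are 1.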

Analysis

The analytical view. A message is just a mapping from the current prior to a distribution of posteriors (new priors). Example: Alice sends her bit.

Prior:            Y=0   Y=1
      X=0         0.4   0.2
      X=1         0.3   0.1

“0” (prob 0.6):   Y=0   Y=1
      X=0         2/3   1/3
      X=1          0     0

“1” (prob 0.4):   Y=0   Y=1
      X=0          0     0
      X=1         3/4   1/4

The analytical view. Now Alice sends her bit w.p. ½ and a uniformly random bit w.p. ½:

Prior:            Y=0    Y=1
      X=0         0.4    0.2
      X=1         0.3    0.1

“0” (prob 0.55):  Y=0    Y=1
      X=0         6/11   3/11
      X=1         3/22   1/22

“1” (prob 0.45):  Y=0    Y=1
      X=0         2/9    1/9
      X=1         1/2    1/6
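The posterior tables on these slides are mechanical Bayes updates of the prior. A sketch reproducing them with exact rationals, assuming the example’s prior (X=0 row 0.4, 0.2; X=1 row 0.3, 0.1, consistent with the posteriors shown); `posteriors` is an illustrative name.

```python
from fractions import Fraction as F

def posteriors(prior, msg_given_x):
    """Bayes update of a prior on (X, Y) for a message whose distribution
    depends only on Alice's input X.
    prior: dict {(x, y): prob}; msg_given_x: dict {x: {msg: prob}}.
    Returns {msg: (prob_of_msg, posterior_dict)}."""
    out = {}
    msgs = {m for d in msg_given_x.values() for m in d}
    for m in msgs:
        pm = sum(p * msg_given_x[x].get(m, F(0)) for (x, y), p in prior.items())
        post = {(x, y): p * msg_given_x[x].get(m, F(0)) / pm
                for (x, y), p in prior.items()}
        out[m] = (pm, post)
    return out

prior = {(0, 0): F(2, 5), (0, 1): F(1, 5), (1, 0): F(3, 10), (1, 1): F(1, 10)}

# Alice sends her true bit w.p. 1/2 and a uniform bit w.p. 1/2,
# so overall she transmits her own bit with probability 3/4.
noisy = {0: {"0": F(3, 4), "1": F(1, 4)}, 1: {"0": F(1, 4), "1": F(3, 4)}}
pm, post = posteriors(prior, noisy)["1"]
print(pm)            # 9/20  (i.e. 0.45)
print(post[(0, 0)])  # 2/9
print(post[(1, 0)])  # 1/2
```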

Analytical view – cont’d

IC of AND

*Not a real protocol

Previous numerical evidence. [Ma, Ishwar ’09] – numerical calculation results.

Applications: communication complexity of intersection

Applications 2: set disjointness

A hard distribution?

         Y=0   Y=1
   X=0   1/4   1/4
   X=1   1/4   1/4

Very easy!

A hard distribution

         Y=0   Y=1
   X=0   1/3   1/3
   X=1   1/3    0

At most one (1,1) location!
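Sampling coordinates i.i.d. from {(0,0), (0,1), (1,0)} always yields disjoint sets; one common construction (sketched here as an assumption, not necessarily the slides’ exact one) additionally plants a single (1,1) coordinate to realize the “at most one intersection” promise:

```python
import random

def sample_hard_instance(n, rng, plant_intersection=False):
    """Sample (X, Y) with each coordinate drawn uniformly from
    {(0,0), (0,1), (1,0)}, so the sets X and Y never intersect,
    optionally overwriting one random coordinate with (1,1)."""
    pairs = [rng.choice([(0, 0), (0, 1), (1, 0)]) for _ in range(n)]
    if plant_intersection:
        pairs[rng.randrange(n)] = (1, 1)
    x, y = zip(*pairs)
    return list(x), list(y)

rng = random.Random(7)
x, y = sample_hard_instance(10, rng)
print(sum(a & b for a, b in zip(x, y)))  # 0: disjoint by construction
x, y = sample_hard_instance(10, rng, plant_intersection=True)
print(sum(a & b for a, b in zip(x, y)))  # 1: exactly one planted intersection
```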

Communication complexity of Disjointness

Small-set Disjointness

Using information complexity

         Y=0      Y=1
   X=0   1-2k/n   k/n
   X=1   k/n       0

Overview: information complexity. Information complexity is to communication complexity as Shannon’s entropy is to transmission cost. Today: focused on exact bounds using IC.

Selected open problems 1

Interactive compression?

Selected open problems 2

External information cost. [Diagram: A and B with inputs X and Y run protocol π, observed by an external party C.]

External information complexity

Thank You!