The Complexity of Agreement: A 100% Quantum-Free Talk
Scott Aaronson, MIT


People disagree. Why? Because some people are idiots? Because people care about things other than truth (winning debates, not admitting error, etc.)? Because even given the same information, people would reach different honest conclusions? Can rational agents ever agree to disagree solely because of differing information?

Aumann 1976: No. Suppose Alice and Bob are Bayesians with a common prior but different knowledge. Let E_A be Alice's estimate of (say) the chance of rain tomorrow, conditioned on all her knowledge. Let E_B be Bob's estimate, conditioned on all his knowledge. Suppose E_A and E_B are common knowledge (Alice and Bob both know their values, both know that they both know them, etc.). Theorem: E_A = E_B.

Standard Protocol: Alice's initial expectation E_{A,0} → Bob's new expectation E_{B,1} → Alice's new expectation E_{A,2} → Bob's new expectation E_{B,3} → … Geanakoplos & Polemarchakis 1982: Provided the state space is finite, Alice and Bob will agree after a finite number of rounds.

What's Going On? Example: Alice knows the first of 2 bits; Bob knows their sum. We can represent an agent's knowledge by a partition of the state space. If Alice and Bob don't agree with certainty, then one of them can learn something from the other's message, meaning that agent's partition gets refined. But the partitions can get refined only finitely many times.
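The 2-bit example can be simulated directly. A minimal sketch in Python (my choice of estimated quantity, f = Pr[both bits are 1], and all function names are for illustration only): each agent's knowledge is a partition, and hearing the other's announced expectation refines it, until the expectations coincide.

```python
from fractions import Fraction

# Toy run of the partition-refinement picture: two uniform bits; Alice knows
# the first bit, Bob knows their sum. They estimate f = Pr[both bits are 1]
# (my choice of f, for illustration).
states = [(a, b) for a in (0, 1) for b in (0, 1)]
f = {s: Fraction(int(s == (1, 1))) for s in states}

alice = [[s for s in states if s[0] == v] for v in (0, 1)]     # partition by first bit
bob = [[s for s in states if sum(s) == v] for v in (0, 1, 2)]  # partition by sum

def expect(partition, s):
    cell = next(c for c in partition if s in c)   # the agent's knowledge cell
    return sum(f[x] for x in cell) / len(cell)    # uniform prior within the cell

def refine(partition, speaker):
    # Hearing the speaker's expectation splits each cell by that expectation's value.
    new = []
    for cell in partition:
        groups = {}
        for s in cell:
            groups.setdefault(expect(speaker, s), []).append(s)
        new.extend(groups.values())
    return new

true_state, rounds = (1, 1), 0
while expect(alice, true_state) != expect(bob, true_state):
    if rounds % 2 == 0:
        bob = refine(bob, alice)      # Alice speaks
    else:
        alice = refine(alice, bob)    # Bob speaks
    rounds += 1

print(rounds, expect(alice, true_state))   # agreement after finitely many messages
```

On this example the loop halts after two messages, with both agents' expectations equal, matching the slide's claim that finitely many refinements force agreement.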

Problems: A huge number of messages might be needed before Alice and Bob agree, and the messages are real numbers. Conjecture: For some function f of Alice's input x ∈ {0,1}^n and Bob's input y ∈ {0,1}^n, and some distribution over (x,y), Alice and Bob will need to exchange Ω(n) bits even to probably approximately agree about E[f(x,y)]. The intuition comes from communication complexity.

Main Result: Conjecture Is False. Definition: Say Alice and Bob (ε,δ)-agree if they agree within ε with probability at least 1−δ over their prior. For all f : {0,1}^n × {0,1}^n → [0,1], and all prior distributions over (x,y) pairs, Alice and Bob can (ε,δ)-agree about the expectation of f by exchanging only O(1/(δε²)) bits, a bound independent of n. Moral: Agreeing to disagree is problematic even for agents subject to realistic communication constraints.

Intuition: In the standard protocol, as long as Alice and Bob disagree by ≥ ε, their expectations E_A and E_B follow an unbiased random walk with step size ε. But since E_A, E_B ∈ [0,1], such a walk hits an absorbing barrier after ~1/ε² steps. To prove this, use (E_A)² and (E_B)² as progress measures.
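A quick Monte Carlo check of this intuition (a sketch of the random-walk picture only, not of the protocol; the grid model and parameter choices are mine): an unbiased walk with step ε on [0,1], started in the middle, is absorbed after roughly 1/ε² steps.

```python
import random

random.seed(0)

def absorption_time(eps):
    # Model [0,1] as an integer grid with spacing eps; absorbing barriers
    # sit at both ends, and the walk starts in the middle.
    n = round(1 / eps)
    pos, steps = n // 2, 0
    while 0 < pos < n:
        pos += random.choice((-1, 1))
        steps += 1
    return steps

for eps in (0.1, 0.05):
    avg = sum(absorption_time(eps) for _ in range(2000)) / 2000
    print(f"eps={eps}: avg steps {avg:.0f}, 1/eps^2 = {1 / eps**2:.0f}")
```

For a walk started at the midpoint the exact expected absorption time is 1/(4ε²), so the printed averages track the 1/ε² scaling the slide appeals to.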

A bit more formally… Let Ω = {0,1}^n × {0,1}^n be the state space, and let D be the shared prior distribution. Let E_{A,t}(ω) and E_{B,t}(ω) be Alice's and Bob's expectations at time t, assuming the true state of the world is ω. For any function F : Ω → [0,1], let ||F|| = E_{ω~D}[F(ω)²].

Lemma: Suppose Alice sends the t-th message. Then ||E_{B,t}|| − ||E_{A,t−1}|| = ||E_{B,t} − E_{A,t−1}||. Proof: The crucial observation is that if Alice has just sent Bob a message, then her expectation of Bob's expectation equals her expectation; since E_{A,t−1} is now known to Bob, this gives E_{ω~D}[E_{B,t} E_{A,t−1}] = ||E_{A,t−1}||. Hence ||E_{B,t} − E_{A,t−1}|| = ||E_{B,t}|| − 2 E_{ω~D}[E_{B,t} E_{A,t−1}] + ||E_{A,t−1}|| = ||E_{B,t}|| − ||E_{A,t−1}||.
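The lemma can be checked exactly on a small instance. A sketch under my own choices (uniform prior over two bits, Alice knowing the first bit, Bob the sum, and a random f); exact rational arithmetic makes the identity hold with equality rather than up to rounding:

```python
from fractions import Fraction
import random

# Exact check of the lemma on a random 2-bit example (all choices mine).
random.seed(1)
states = [(a, b) for a in (0, 1) for b in (0, 1)]
f = {s: Fraction(random.randrange(11), 10) for s in states}  # random f into [0,1]

alice = [[s for s in states if s[0] == v] for v in (0, 1)]
bob = [[s for s in states if sum(s) == v] for v in (0, 1, 2)]

def expect(partition, s):
    cell = next(c for c in partition if s in c)
    return sum(f[x] for x in cell) / len(cell)   # uniform prior

def norm(g):
    # ||g|| = E_D[g(w)^2] with D uniform over the state space.
    return sum(g(s) ** 2 for s in states) / len(states)

# Alice sends her expectation; Bob splits each of his cells by its value,
# so E_{A,t-1} becomes constant on each of Bob's refined cells.
bob_t = []
for cell in bob:
    groups = {}
    for s in cell:
        groups.setdefault(expect(alice, s), []).append(s)
    bob_t.extend(groups.values())

E_A = lambda s: expect(alice, s)    # E_{A,t-1}
E_B = lambda s: expect(bob_t, s)    # E_{B,t}
diff = lambda s: E_B(s) - E_A(s)

lhs = norm(E_B) - norm(E_A)
rhs = norm(diff)
print(lhs, rhs, lhs == rhs)   # the two sides agree exactly
```

The equality is exact because, after the refinement, the cross term E_D[E_B·E_A] collapses to ||E_A||, which is precisely the "crucial observation" in the proof.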

Theorem: The standard protocol causes Alice and Bob to (ε,δ)-agree after 1/(δε²) messages. Proof: Suppose Alice sends the t-th message, and suppose Alice and Bob still disagree by more than ε with probability more than δ. Then ||E_{B,t} − E_{A,t−1}|| > δε², so by the Lemma ||E_{B,t}|| > ||E_{A,t−1}|| + δε². Likewise, after Bob sends Alice the (t+1)-st message, ||E_{A,t+1}|| > ||E_{B,t}|| + δε². But max{||E_{A,t}||, ||E_{B,t}||} is initially 0 and can never exceed 1.

Weren't we cheating, since messages in the standard protocol are unbounded-precision real numbers? Discretized standard protocol: Let Charlie (C) be a "monkey in the middle" who sees Alice and Bob's messages but doesn't know their inputs. At the t-th step, Alice tells Bob whether E_{A,t} > E_{C,t} + ε/4, E_{A,t} < E_{C,t} − ε/4, or neither. Bob does likewise. Theorem: The discretized standard protocol causes Alice and Bob to (ε,δ)-agree after at most 3072/(δε²) messages.
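The three-valued message rule is easy to state in code; a minimal sketch (the function and label names are mine):

```python
def quantized_message(e_self, e_charlie, eps):
    # Compare one's own expectation against Charlie's, to precision eps/4,
    # and report only which of the three bands it falls in.
    if e_self > e_charlie + eps / 4:
        return "high"
    if e_self < e_charlie - eps / 4:
        return "low"
    return "medium"

print(quantized_message(0.80, 0.50, 0.2))   # 0.80 > 0.50 + 0.05
print(quantized_message(0.52, 0.50, 0.2))   # within the eps/4 band
```

Each message now carries under two bits, which is what lets the protocol's total communication stay bounded.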

Two interesting questions: 1. Does the standard protocol ever need ~1/ε² messages? 2. Is there ever a protocol that does better? An example answering both questions: let Alice's input x = x_1…x_n and Bob's input y = y_1…y_n be uniform over {−1,1}^n. Then let f(x,y) = …

Theorem: The standard protocol requires ~1/ε² messages before Alice and Bob's expectations of f agree within ε with constant probability. On the other hand, there's an "attenuated" protocol that causes them to agree within ε with constant probability after only O(1) messages.

Three or More Agents: Could a weak-willed Charlie fail to bring Alice and Bob into agreement with each other? Theorem: On a strongly-connected graph with N agents and diameter d, a number of messages depending only on N, d, ε, and δ (not on n) suffices for every pair of agents to (ε,δ)-agree.

Computational Complexity: Fine, few messages suffice for agreement, but what if it took Alice and Bob billions of years to calculate their messages? Result: Provided the agents can sample from their initial partitions, they can simulate Bayesian rationality using a number of samples independent of n (but, alas, exponential in 1/ε⁶). Meaning: by examining their messages, you couldn't distinguish these "Bayesian wannabes" from true Bayesians with non-negligible bias.

Computational Complexity (cont.): Problem: an agent's expectation could lie on a knife-edge between two messages. Solution: have agents smooth their messages by adding random noise to them.
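A sketch of the smoothing idea (the message grid, noise width, and names are all mine): jitter the expectation slightly before rounding it to the nearest allowed message, so a knife-edge value no longer falls deterministically on one side.

```python
import random

random.seed(0)

def smoothed_message(expectation, messages, noise=0.01):
    # Add small random noise, clamp to [0,1], then round to the nearest message.
    jittered = min(1.0, max(0.0, expectation + random.uniform(-noise, noise)))
    return min(messages, key=lambda m: abs(m - jittered))

grid = [i / 10 for i in range(11)]          # allowed messages 0.0, 0.1, ..., 1.0
counts = {m: 0 for m in grid}
for _ in range(1000):
    counts[smoothed_message(0.65, grid)] += 1   # 0.65 is a knife-edge value
print(counts[0.6], counts[0.7])                 # both neighboring messages occur
```

The knife-edge expectation 0.65 now maps to each neighboring message about half the time, instead of a hair's width of input deciding the message outright.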

Open Problems: Do Alice and Bob need to exchange Ω(1/ε²) bits to agree within ε with high probability? The best lower bound I can show is the trivial Ω(log 1/ε). Is there a scenario where Alice and Bob must exchange Ω(n) bits to (ε,0)-agree? Can Bayesian wannabes agree within ε with high probability using poly(1/ε) samples, rather than 2^poly(1/ε)?

Open Problems (cont.): Suppose Alice and Bob (ε,δ)-agree about the expectation of f : Ω → [0,1]. Then do they also have approximate common knowledge? (Alice is pretty sure of Bob's expectation, Alice is pretty sure Bob's pretty sure of her expectation, Alice is pretty sure Bob's pretty sure she's pretty sure of his expectation, etc.)