Tony Short, University of Cambridge (with Sabri Al-Safi – PRA 84, 042323 (2011))


Overview
• Comparing the information-theoretic capabilities of quantum theory with other possible theories can help us:
  – Understand why nature is quantum
  – Hone our intuitions about quantum applications
• Surprisingly, despite entanglement, quantum theory is no better than classical for some non-local tasks... Why?
  – Non-local computation [Linden et al., 2007]
  – Guess your neighbour's input [Almeida et al., 2010]
  – Information causality [Pawlowski et al., 2009]

The CHSH game
• What correlations P(a,b|x,y) are achievable given certain resources?
• What is the maximum success probability p in the CHSH game, where the players win if a ⊕ b = x·y?
[Figure: Alice receives random x ∈ {0,1} and outputs a ∈ {0,1}; Bob receives random y ∈ {0,1} and outputs b ∈ {0,1}; they share resources but cannot communicate.]

• Local (classical): P(a,b|x,y) = Σ_λ q_λ P_λ(a|x) P_λ(b|y), giving p_C ≤ 3/4 (Bell's theorem – the CHSH inequality)
• Quantum: P(a,b|x,y) = Tr((P_x^a ⊗ P_y^b) ρ), giving p_Q ≤ (2+√2)/4 (Tsirelson's bound)
• General (box-world): only the non-signalling conditions, Σ_a P(a,b|x,y) independent of x and Σ_b P(a,b|x,y) independent of y, giving p_G ≤ 1 (PR-boxes)
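As a sanity check on these bounds, here is a minimal Python sketch (assuming numpy; the measurement angles are the standard textbook choice, not taken from the slides) that enumerates all local deterministic strategies to recover p_C = 3/4, and evaluates the usual optimal quantum strategy on a maximally entangled state to recover p_Q = (2+√2)/4.

```python
# Brute-force check of the classical CHSH bound and numerical evaluation of
# the standard quantum strategy attaining Tsirelson's bound.
import itertools
import numpy as np

def chsh_win(a, b, x, y):
    # The CHSH winning condition: a XOR b = x AND y.
    return (a ^ b) == (x & y)

# Classical: every local deterministic strategy is a pair of functions
# a(x), b(y); shared randomness cannot beat the best deterministic strategy.
best_classical = max(
    sum(chsh_win(fa[x], fb[y], x, y) for x in (0, 1) for y in (0, 1)) / 4
    for fa in itertools.product((0, 1), repeat=2)
    for fb in itertools.product((0, 1), repeat=2)
)
print("classical optimum:", best_classical)           # 0.75

# Quantum: measurements on the Phi+ state at the standard CHSH angles.
def observable(theta):
    # +/-1-valued qubit observable cos(theta) Z + sin(theta) X.
    return np.cos(theta) * np.diag([1, -1]) + np.sin(theta) * np.array([[0, 1], [1, 0]])

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
A = [observable(0), observable(np.pi / 2)]            # Alice's settings x = 0, 1
B = [observable(np.pi / 4), observable(-np.pi / 4)]   # Bob's settings y = 0, 1

# Winning probability averaged over uniform x, y: p = 1/2 + <CHSH>/8.
chsh_value = sum((-1) ** (x * y) * phi_plus @ np.kron(A[x], B[y]) @ phi_plus
                 for x in (0, 1) for y in (0, 1))
print("quantum success:", 0.5 + chsh_value / 8)       # (2+sqrt(2))/4 ≈ 0.8536
```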

• PR-box correlations [Popescu, Rohrlich (1994)]: a ⊕ b = x·y, with each output locally uniform. These are the optimal non-signalling correlations (p = 1).
• Problem: Is there a good, physical intuition behind p_Q ≤ (2+√2)/4?
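A quick simulation sketch (assuming numpy; not from the slides) of a PR-box: it wins the CHSH game with certainty, yet each party's marginal outcome stays uniform regardless of the other's input, so no signalling is possible.

```python
# Sample a PR-box: perfect CHSH success, non-signalling marginals.
import numpy as np

rng = np.random.default_rng(0)

def pr_box(x, y):
    # PR-box: a and b individually uniform, but a XOR b = x AND y.
    a = rng.integers(2)
    return a, a ^ (x & y)

wins = 0
marginals = {(x, y): [] for x in (0, 1) for y in (0, 1)}
for _ in range(100000):
    x, y = rng.integers(2), rng.integers(2)
    a, b = pr_box(x, y)
    wins += (a ^ b) == (x & y)
    marginals[(x, y)].append(a)

print("success probability:", wins / 100000)  # exactly 1
# Non-signalling: Alice's marginal P(a=1|x,y) is ~1/2, independent of y.
for (x, y), outs in marginals.items():
    print(f"P(a=1 | x={x}, y={y}) ≈ {np.mean(outs):.3f}")
```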

Information Causality
• Information causality relates to a particular communication task [Pawlowski et al., Nature 461, 1101 (2009)]:
  – Alice receives N random bits x_1 ... x_N and sends Bob m classical bits
  – Bob receives a random y ∈ {1,...,N} and outputs b_y, his best guess of x_y
• Task: maximize J = Σ_y I(x_y : b_y)

• I(x:y) is the classical mutual information
• The Information Causality principle states that J ≤ m
• Physical intuition: the total information that Bob can extract about Alice's N bits must be no greater than the m bits Alice sends him
• However, note that Bob only guesses 1 bit in each game
• The bound on J can easily be saturated: Alice simply sends Bob the first m bits of x_1 ... x_N

• Information Causality is obeyed in quantum theory and classical theory, and in any theory in which a 'good' measure of mutual information can be defined (see later)
• Information Causality can be violated by general non-signalling correlations: e.g. one can achieve J = N >> m = 1 using PR-boxes (see the sketch below)
• Information Causality can be violated using any correlations which violate Tsirelson's bound for the CHSH game (when N = 2^n, m = 1)
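The PR-box violation can be made concrete. Below is a sketch (assuming numpy; a standard van Dam-style protocol reconstructed from the literature, not copied from the slides) for N = 2, m = 1: one PR-box use plus a single classical bit lets Bob recover whichever bit he chooses perfectly, so I(x_y : b_y) = 1 for both y, and J = 2 >> m = 1.

```python
# PR-box protocol maximally violating Information Causality for N=2, m=1.
import numpy as np

rng = np.random.default_rng(1)

def pr_box_pair(p, q):
    # One shared PR-box use: a XOR b = p AND q, with a locally random.
    a = rng.integers(2)
    return a, a ^ (p & q)

def run_game():
    x1, x2 = rng.integers(2), rng.integers(2)
    y = rng.integers(1, 3)                 # Bob wants x_y, y in {1, 2}
    a, b = pr_box_pair(x1 ^ x2, y - 1)     # Alice inputs x1 XOR x2; Bob inputs y-1
    message = x1 ^ a                       # the single classical bit (m = 1)
    guess = message ^ b                    # = x1 ^ (x1^x2)(y-1) = x_y
    return guess == (x1 if y == 1 else x2)

print(all(run_game() for _ in range(10000)))  # True: either bit, perfectly
# Perfect guessing of each x_y gives I(x_y : b_y) = 1 for y = 1, 2,
# hence J = 2 = N with only m = 1 bit communicated.
```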

• Hence Information Causality ⇒ Tsirelson's bound
• Furthermore, it can even generate part of the curved surface of quantum correlations [Allcock, Brunner, Pawlowski, Scarani 2009]
• But why are this particular task and figure of merit J so important?
  – What about the probability of success in the game?
  – Given that J is a strange non-linear function of the probabilities, how does it yield nice bounds on quantum correlations?
  – Is mutual information central to quantum theory?

I.C. - A probabilistic perspective
• If we use the probability of success in the Information Causality game, quantum theory can do better than classical
• Same setup as before: Alice receives N random bits x_1 ... x_N and sends Bob m classical bits; Bob receives a random y ∈ {1,...,N} and outputs b_y, his best guess of x_y
• Task: maximize the probability of success P(b_y = x_y)

• When m = 1, N = 2, the maximum success probabilities are the same as for the CHSH game
• The m = 1 case for general N has been studied as 'random access coding' [Ambainis et al. 2008, Pawlowski & Zukowski 2010]; the quantum bound p ≤ 1/2 (1 + 1/√N) is known to be tight for N = 2^k 3^j

• Furthermore, J = Σ_y I(x_y : b_y) and the success probability are not monotonically related. E.g. for N = 2, m = 1:
  – Strategy 1: Alice sends x_1 with a little noise: J = 1 − ε, p = 3/4 − ε′
  – Strategy 2: Alice sends either x_1 or x_2 perfectly, based on a random bit shared with Bob: J ≈ 0.38, p = 3/4
• What is the relation between bounds on J and bounds on the success probability, and how do these relate to Tsirelson's bound?
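The two strategies can be checked with a few lines of arithmetic. The sketch below (assuming numpy; ε = 0.01 is an illustrative noise level, not a value from the slides) computes J and p for each, exhibiting the non-monotonicity: strategy 1 has a much larger J but a slightly smaller p.

```python
# Comparing J = sum_y I(x_y : b_y) against the success probability p
# for the two N = 2, m = 1 strategies.
import numpy as np

def h(p):
    # Binary entropy in bits; h(0) = h(1) = 0.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def binary_symmetric_info(flip):
    # I(x : b) for a uniform bit sent through a binary symmetric channel.
    return 1 - h(flip)

# Strategy 1: Alice sends x_1 through a slightly noisy channel (flip prob eps).
eps = 0.01
J1 = binary_symmetric_info(eps)              # I(x_1:b_1) = 1 - h(eps); I(x_2:b_2) = 0
p1 = 0.5 * (1 - eps) + 0.5 * 0.5             # near-perfect on y=1, coin flip on y=2
print(f"Strategy 1: J = {J1:.3f}, p = {p1:.3f}")   # J ≈ 0.92, p ≈ 0.745

# Strategy 2: a shared random bit picks which of x_1, x_2 Alice sends perfectly.
# For each y, Bob's guess matches x_y w.p. 1/2 * 1 + 1/2 * 1/2 = 3/4, and the
# effective channel x_y -> b_y is binary symmetric with flip probability 1/4.
J2 = 2 * binary_symmetric_info(0.25)
p2 = 0.75
print(f"Strategy 2: J = {J2:.3f}, p = {p2:.3f}")   # J ≈ 0.38, p = 0.75
```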

• Define p_y as the probability of success when Bob is given y, and the corresponding bias E_y = 2p_y − 1
• When proving Tsirelson's bound, the crucial step uses a quadratic bound on the binary entropy: 1 − h(p_y) ≥ E_y²/(2 ln 2). When m = 1, Information Causality therefore implies Σ_y E_y² ≤ 2 ln 2
• Can we derive a 'quadratic bias bound' like this directly?
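In LaTeX, the chain of inequalities behind this reads as follows (a sketch following Pawlowski et al.; the intermediate Fano-type step is reconstructed from the literature rather than copied from the slide):

```latex
% For m = 1: Fano-type lower bound on each mutual information term,
% then the quadratic lower bound on 1 - h(p) around p = 1/2.
\begin{align}
  I(x_y : b_y) &\ge 1 - h(p_y), \qquad p_y = \tfrac{1 + E_y}{2},\\
  1 - h\!\left(\tfrac{1 + E_y}{2}\right) &\ge \frac{E_y^2}{2 \ln 2},\\
  1 = m \;\ge\; J = \sum_y I(x_y : b_y) \;&\ge\; \sum_y \frac{E_y^2}{2 \ln 2}
  \;\Longrightarrow\; \sum_y E_y^2 \le 2 \ln 2 .
\end{align}
```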

Information Causality as a non-local game
• It is helpful to consider a non-local version of the Information Causality game: Alice receives N random bits x_1 ... x_N and outputs a bit a; Bob receives a random y ∈ {1,...,N} and outputs a bit b; they win if a ⊕ b = x_y
• This is at least as hard as the previous version with m = 1 (as Alice can send the message a, and Bob can output b_y = a ⊕ b)

• For any quantum strategy, using similar techniques to those in the non-local computation paper [Linden et al. (2007)], one can define suitable correlation operators for each y from the players' observables, and note that the resulting biases obey a quadratic constraint

• Hence we obtain the quantum bound Σ_y E_y² ≤ 1
• This is easily saturated classically (a = x_1, b = 0, giving E_1 = 1 and E_y = 0 otherwise)
• With this figure of merit, quantum theory is no better than classical. Yet with general non-signalling correlations the sum can equal N
• It is stronger than the bound given by Information Causality (Σ_y E_y² ≤ 2 ln 2)
• Furthermore, any set of biases E_y satisfying Σ_y E_y² ≤ 1 is quantum realizable. This bound therefore characterizes the achievable set of biases more comprehensively than Information Causality
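A numerical spot-check sketch of the bound (assuming numpy; the random strategies, local dimension, and seed are arbitrary illustrative choices) for the N = 2 non-local game: sample random shared states and ±1-valued observables and confirm that Σ_y E_y² ≤ 1 in every trial.

```python
# Spot-check of sum_y E_y^2 <= 1 for random quantum strategies, N = 2.
import numpy as np

rng = np.random.default_rng(2)
d = 4  # local Hilbert-space dimension

def random_pm1_observable():
    # Random Hermitian observable with eigenvalues +/-1.
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    u, _ = np.linalg.qr(g)                     # random unitary
    signs = rng.choice([-1.0, 1.0], size=d)
    return u @ np.diag(signs) @ u.conj().T

for trial in range(5):
    # Alice has one observable per input string x = (x1, x2); Bob one per y.
    A = {x: random_pm1_observable() for x in [(0, 0), (0, 1), (1, 0), (1, 1)]}
    B = {y: random_pm1_observable() for y in (1, 2)}
    psi = rng.normal(size=d * d) + 1j * rng.normal(size=d * d)
    psi /= np.linalg.norm(psi)

    # Bias for each y: E_y = avg over x of (-1)^{x_y} <psi| A_x (x) B_y |psi>.
    E = []
    for y in (1, 2):
        corr = [(-1) ** x[y - 1] * np.real(psi.conj() @ np.kron(A[x], B[y]) @ psi)
                for x in A]
        E.append(np.mean(corr))
    print(f"trial {trial}: sum_y E_y^2 = {E[0]**2 + E[1]**2:.4f}  (<= 1)")
```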

• When we set all E_y equal, then E_y = 1/√N, and we achieve success probability p = 1/2 (1 + 1/√N)
• As this non-local game is at least as hard as the original, we can achieve the previously known upper bound on the success probability of the (m = 1) Information Causality game for all N
• We can easily extend the proof to get quadratic bounds for a more general class of inner product games

Inner product game (with Bob's input having any distribution)
• Alice receives N random bits x_1 ... x_N and outputs a bit a; Bob receives N bits y_1 ... y_N and outputs a bit b; they win if a ⊕ b equals the inner product x·y = Σ_i x_i y_i (mod 2)
• When Bob's bit string is restricted to contain a single 1, this implies the Information Causality result. When N = 1, it yields Tsirelson's bound, and the stronger quadratic version [Uffink 2002]

Summary of probabilistic perspective
• The form of the mutual information does not seem crucial in deriving Tsirelson's bound from Information Causality
• Instead, quadratic bias bounds seem to naturally characterise quantum correlations
• The inner product game with figure of merit Σ_y E_y² is another task for which quantum theory is no better than classical, but which slightly-stronger correlations help with

I.C. - An entropic perspective
• The key role of the mutual information is in deriving Information Causality. The bound J ≤ m follows from the existence of a mutual information I(X:Y) for all systems X, Y, satisfying:
  1. Symmetry: I(X:Y) = I(Y:X)
  2. Consistency: I(X:Y) equals the classical mutual information when X, Y are classical
  3. Data processing: I(X:Y) ≥ I(X:T(Y)) for any transformation T
  4. Chain rule: I(XY:Z) − I(X:Z) = I(Y:XZ) − I(X:Y)
  (Plus the existence of some natural transformations)
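Two of these properties can be checked numerically for the classical (Shannon) mutual information. A small sketch (assuming numpy; the random joint distributions and the stochastic map T are arbitrary illustrative choices) verifying the chain rule identity and data processing:

```python
# Numerical check of the chain rule and data processing for Shannon I.
import numpy as np

rng = np.random.default_rng(3)

def H(p):
    # Shannon entropy (bits) of a distribution of any shape.
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def I(pxy):
    # I(X:Y) = H(X) + H(Y) - H(XY) for a 2-axis joint distribution.
    return H(pxy.sum(axis=1)) + H(pxy.sum(axis=0)) - H(pxy)

# Random joint distribution over (X, Y, Z), each binary; axes (0, 1, 2).
p = rng.random((2, 2, 2)); p /= p.sum()

# Chain rule: I(XY:Z) - I(X:Z) = I(Y:XZ) - I(X:Y).
I_XY_Z = I(p.reshape(4, 2))
I_X_Z = I(p.sum(axis=1))
I_Y_XZ = I(np.moveaxis(p, 1, 0).reshape(2, 4))
I_X_Y = I(p.sum(axis=2))
print(np.isclose(I_XY_Z - I_X_Z, I_Y_XZ - I_X_Y))      # True

# Data processing: I(X:Y) >= I(X:T(Y)) for a random stochastic matrix T.
pxy = p.sum(axis=2)
T = rng.random((2, 2)); T /= T.sum(axis=0, keepdims=True)  # columns: P(y'|y)
print(I(pxy) >= I(pxy @ T.T) - 1e-9)                   # True
```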

• But mutual information is a complicated quantity (two arguments), and this list of properties is quite extensive
• Instead, we can derive Information Causality from the existence of an entropy H(X), defined for all systems X in the theory, satisfying just 2 conditions:
  1. Consistency: H(X) equals the Shannon entropy when X is classical
  2. Local evolution: ΔH(XY) ≥ ΔH(X) + ΔH(Y) for any local transformation on X and Y
• The intuition behind the 2nd condition is that local transformations can destroy but not create correlations, generally leading to more uncertainty than their local effect

• To derive Information Causality, we can use H to construct a measure of mutual information I(X:Y) = H(X) + H(Y) − H(XY), then use the original proof
• The desired properties of I(X:Y) follow simply:
  1. Symmetry: trivial
  2. Consistency: from consistency of H(X)
  3. Data processing: equivalent to local evolution of H(X)
  4. Chain rule: trivial
• Hence, Information Causality holds in any theory which admits a 'good' measure of entropy, i.e. one which obeys Consistency and Local Evolution
• The Shannon and von Neumann entropies are both 'good'

• We can prove that any 'good' entropy shares the following standard properties of the Shannon and von Neumann entropies:
  – Subadditivity: H(X,Y) ≤ H(X) + H(Y)
  – Strong subadditivity: H(X_1 X_2 | Y) ≤ H(X_1 | Y) + H(X_2 | Y)
  – Classical positivity: H(X | Y) ≥ 0 whenever X is classical
  (where we have defined H(X|Y) = H(XY) − H(Y))
• Instead of proceeding via the mutual information, we can use these relations to derive Information Causality directly
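For the classical case these three relations can again be verified numerically. A sketch (assuming numpy; random binary distributions): they hold automatically for the Shannon entropy here, whereas the slide's point is that Consistency and Local Evolution alone force them in any theory.

```python
# Numerical check of subadditivity, strong subadditivity, and classical
# positivity for the Shannon entropy on a random distribution.
import numpy as np

rng = np.random.default_rng(4)

def H(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# Random distribution over (X1, X2, Y), each binary; axes (0, 1, 2).
p = rng.random((2, 2, 2)); p /= p.sum()
tol = 1e-9

H_X1X2Y, H_Y = H(p), H(p.sum(axis=(0, 1)))
H_X1Y, H_X2Y = H(p.sum(axis=1)), H(p.sum(axis=0))

# Subadditivity, grouping X = (X1, X2): H(X,Y) <= H(X) + H(Y).
print(H_X1X2Y <= H(p.sum(axis=2)) + H_Y + tol)                 # True

# Strong subadditivity: H(X1 X2 | Y) <= H(X1 | Y) + H(X2 | Y),
# with the conditional entropy H(X|Y) = H(XY) - H(Y).
print(H_X1X2Y - H_Y <= (H_X1Y - H_Y) + (H_X2Y - H_Y) + tol)    # True

# Classical positivity: H(X | Y) >= 0 when X is classical.
print(H_X1Y - H_Y >= -tol)                                     # True
```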

• This actually allows us to prove a slight generalisation of Information Causality: Σ_y H(x_y | b_y) ≥ H(x_1 ... x_N) − m
• This generalized form of Information Causality makes no assumptions about the distribution of Alice's inputs x_1 ... x_N
• The intuition here is that the uncertainty that Bob has about Alice's bits at the end of the game must be greater than the original uncertainty about her inputs minus the information gained by the message

Entropy in general probabilistic theories
• We can define an entropy operationally in any theory [Short, Wehner / Barrett et al. / Kimura et al. (2010)]:
  – Measurement entropy: H(X) is the minimal Shannon entropy of the outputs for a fine-grained measurement on X
  – Decomposition entropy: H(X) is the minimal Shannon entropy of the coefficients when X is written as a mixture of pure states
• These both obey Consistency, and give the von Neumann entropy for quantum theory. However, for many theories they violate Local Evolution
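For a qubit, the measurement entropy can be computed directly. A sketch (assuming numpy; the state ρ and the scan over real projective bases are illustrative choices): the minimum Shannon entropy over fine-grained measurements is attained in the eigenbasis of ρ, recovering the von Neumann entropy.

```python
# Measurement entropy of a qubit: minimising the outcome Shannon entropy
# over projective measurements recovers the von Neumann entropy.
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

rho = np.array([[0.8, 0.2], [0.2, 0.2]])        # an example qubit state (trace 1)
eigvals, _ = np.linalg.eigh(rho)
print("von Neumann entropy:", shannon(eigvals))

def measured_entropy(theta):
    # Shannon entropy of outcomes for the projective measurement in the
    # rotated basis (cos t, sin t), (-sin t, cos t).
    v0 = np.array([np.cos(theta), np.sin(theta)])
    v1 = np.array([-np.sin(theta), np.cos(theta)])
    probs = np.array([v0 @ rho @ v0, v1 @ rho @ v1])
    return shannon(probs)

thetas = np.linspace(0, np.pi, 1000)
print("min over measurements:", min(measured_entropy(t) for t in thetas))
```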

Entropy and Tsirelson's bound (also in Dahlsten et al. 2011)
• Finally, note that due to Information Causality: existence of a 'good' entropy ⇒ Tsirelson's bound
• The existence of a 'good' measure of entropy seems like a very general property, yet remarkably it leads to a very specific piece of quantum structure
• This also means that no 'good' measure of entropy exists in physical theories more non-local than Tsirelson's bound (such as box-world, which admits all non-signalling correlations)

Summary and open questions
• Quantum theory satisfies and saturates a simple quadratic bias bound Σ_y E_y² ≤ 1 for the Inner Product and Information Causality games, which generalises Tsirelson's bound
  – Can we find other similar quadratic bounds?
• The existence of a 'good' measure of entropy in a theory (satisfying just 2 properties) is sufficient to derive Information Causality and Tsirelson's bound
  – Is quantum theory the most general case with such an entropy?
  – Is there a connection to thermodynamics?