1 Stronger Subadditivity
Fernando G.S.L. Brandão (University College London -> Microsoft Research)
Coogee 2015
Based on arXiv:1411.4921, with Aram Harrow, Jonathan Oppenheim, and Sergii Strelchuk

2 Strong Subadditivity
(Lieb and Ruskai '73) For every state ρ_CRB: S(CB) + S(RB) ≥ S(B) + S(CRB).
Conjectured by (Robinson, Ruelle '66) and (Lanford, Robinson '68).

3 Strong Subadditivity
(Lieb and Ruskai '73) For every state ρ_CRB: S(CB) + S(RB) ≥ S(B) + S(CRB).
Equivalent to data processing: for every ρ, σ and channel Λ, D(Λ(ρ)||Λ(σ)) ≤ D(ρ||σ).
Relative entropy: D(ρ||σ) := tr[ρ(log ρ - log σ)].
Several applications in quantum information theory and beyond, e.g. used to prove converses for almost all protocols (channel capacity, entanglement distillation, key distribution).
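The inequality is easy to sanity-check numerically on random states. A minimal numpy sketch (the helper names `entropy` and `partial_trace` are mine, not from the talk):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho log rho), in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def partial_trace(rho, dims, keep):
    """Trace out every subsystem whose index is not in `keep`."""
    n = len(dims)
    t = rho.reshape(dims + dims)
    cur = n
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=i, axis2=i + cur)
        cur -= 1
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

# Random mixed state on C (index 0), R (index 1), B (index 2), each a qubit.
rng = np.random.default_rng(0)
dims = [2, 2, 2]
D = int(np.prod(dims))
G = rng.normal(size=(D, D)) + 1j * rng.normal(size=(D, D))
rho = G @ G.conj().T
rho /= np.trace(rho).real

S_CB = entropy(partial_trace(rho, dims, [0, 2]))
S_RB = entropy(partial_trace(rho, dims, [1, 2]))
S_B = entropy(partial_trace(rho, dims, [2]))
S_CRB = entropy(rho)

# Strong subadditivity: S(CB) + S(RB) >= S(B) + S(CRB)
assert S_CB + S_RB - S_B - S_CRB >= -1e-10
print("I(C:R|B) =", S_CB + S_RB - S_B - S_CRB)
```

Any random ρ_CRB gives a non-negative value, as SSA demands.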

4 Strong Subadditivity
(Lieb and Ruskai '73) For every state ρ_CRB: S(CB) + S(RB) ≥ S(B) + S(CRB).
Equivalent to data processing: for every ρ, σ and channel Λ, D(Λ(ρ)||Λ(σ)) ≤ D(ρ||σ).
Many different proofs: Petz '84; Carlen, Lieb '99; Nielsen, Petz '04; Devetak, Yard '06; Ruskai '06; Kim '12; Beaudry, Renner '11; Headrick, Takayanagi '07.

6 Strong Subadditivity and CMI
(Lieb and Ruskai '73) For every state ρ_CRB, the conditional mutual information (CMI) is
I(C:R|B) := S(CB) + S(RB) - S(B) - S(CRB).
By SSA: I(C:R|B) ≥ 0.
Can we have a stronger subadditivity? I.e., is there a positive, non-identically-zero function f s.t. I(C:R|B) ≥ f(ρ_CRB)?


10 CMI for Probability Distributions
For a probability distribution p_XYZ:
I(X:Y|Z) = H(XZ) + H(YZ) - H(Z) - H(XYZ) = min_{q ∈ MC} D(p_XYZ || q_XYZ),
where MC := {q : x-z-y form a Markov chain, i.e. q(x,y,z) = q(z) q(x|z) q(y|z)}.
Relative entropy: D(p||q) := Σ_i p_i log(p_i / q_i).
Since ½ ||p - q||_1² ≤ D(p||q) (Pinsker's inequality), I(X:Y|Z) ≤ ε implies p is O(ε^{1/2})-close to a Markov chain.
How about for quantum states? We don't know how to condition on quantum information… The second equality could still work. But what is the set of "quantum Markov chains"?
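These classical identities can be verified directly; a small numpy sketch (the function names `cond_mutual_info`, `markov_projection`, and `rel_entropy` are mine):

```python
import numpy as np

def cond_mutual_info(p):
    """I(X:Y|Z) in nats for a joint distribution p of shape (dx, dy, dz)."""
    pz = p.sum(axis=(0, 1))
    pxz = p.sum(axis=1)           # p(x, z)
    pyz = p.sum(axis=0)           # p(y, z)
    total = 0.0
    for (x, y, z), v in np.ndenumerate(p):
        if v > 0:
            total += v * np.log(v * pz[z] / (pxz[x, z] * pyz[y, z]))
    return total

def markov_projection(p):
    """q(x,y,z) = p(z) p(x|z) p(y|z): the Markov chain minimizing D(p||q)."""
    pz = p.sum(axis=(0, 1))
    pxz = p.sum(axis=1)
    pyz = p.sum(axis=0)
    return np.einsum('xz,yz->xyz', pxz, pyz) / pz[None, None, :]

def rel_entropy(p, q):
    """D(p||q) = sum_i p_i log(p_i / q_i) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(1)
p = rng.random((2, 2, 3))
p /= p.sum()

q = markov_projection(p)
cmi = cond_mutual_info(p)

# I(X:Y|Z) equals the relative entropy to the closest Markov chain...
assert abs(cmi - rel_entropy(p, q)) < 1e-10
# ...so by Pinsker, (1/2) ||p - q||_1^2 <= D(p||q) = I(X:Y|Z).
assert 0.5 * np.abs(p - q).sum() ** 2 <= cmi + 1e-12
```

The projection q is itself a Markov chain, so `cond_mutual_info(q)` vanishes.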

11 Quantum Markov States
Classical: i) x-z-y Markov if x and y are independent conditioned on z; ii) x-z-y Markov iff there is a channel Λ : Z -> YZ s.t. Λ(p_XZ) = p_XYZ.
Quantum: i) ρ_CRB is a quantum Markov state iff C and R are independent conditioned on B; ii) ρ_CRB is Markov iff there is a channel Λ : B -> RB s.t. Λ(ρ_CB) = ρ_CRB.
(Hayden, Jozsa, Petz, Winter '03) I(C:R|B) = 0 iff ρ_CRB is quantum Markov.

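A state of the Markov form above, with a classical B register, can be built explicitly and its CMI checked; a self-contained numpy sketch (helper names mine):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def partial_trace(rho, dims, keep):
    """Trace out every subsystem whose index is not in `keep`."""
    n = len(dims)
    t = rho.reshape(dims + dims)
    cur = n
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=i, axis2=i + cur)
        cur -= 1
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

def random_state(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    s = g @ g.conj().T
    return s / np.trace(s).real

# Markov state rho_CRB = sum_z p_z rho_C^z (x) rho_R^z (x) |z><z|_B:
# C and R are independent conditioned on the (here classical) system B.
rng = np.random.default_rng(2)
dims = [2, 2, 2]   # C, R, B
p = np.array([0.3, 0.7])
rho = np.zeros((8, 8), dtype=complex)
for z in range(2):
    zz = np.zeros((2, 2)); zz[z, z] = 1.0
    rho += p[z] * np.kron(np.kron(random_state(2, rng), random_state(2, rng)), zz)

cmi = (entropy(partial_trace(rho, dims, [0, 2]))
       + entropy(partial_trace(rho, dims, [1, 2]))
       - entropy(partial_trace(rho, dims, [2]))
       - entropy(rho))
assert abs(cmi) < 1e-8   # I(C:R|B) = 0 for a quantum Markov state
```

This is the simplest instance of the Hayden-Jozsa-Petz-Winter structure; general Markov states allow a direct-sum decomposition of B rather than a purely classical one.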

16 Quantum Markov States vs CMI
(Hayden, Jozsa, Petz, Winter '03) I(C:R|B) = 0 iff ρ_CRB is quantum Markov.
obs: equivalently, B decomposes as B = ⊕_j b_j^L ⊗ b_j^R with ρ_CRB = ⊕_j p_j ρ_{C b_j^L} ⊗ ρ_{b_j^R R}.

17 Applications
(Brown and Poulin '12, …) (Quantum Hammersley-Clifford thm) Every many-body state with vanishing CMI is the Gibbs state of a commuting model (on triangle-free graphs). Useful in Quantum Belief Propagation (Hastings, Poulin '10).
(Levin and Wen '06) CMI is equal to twice the topological entanglement entropy: for regions A, B, C arranged so that B shields A from C, I(A:C|B) = 2γ.
(Kitaev, unpublished) If γ = 0, the state (+ ancillas) can be created by a constant-depth circuit.
What if merely CMI ≈ 0?

19 CMI vs Quantum Markov States
Does I(C:R|B) ≥ c · min_{σ ∈ QMC} ||ρ_CRB - σ||² hold, or just a bound by some function of the distance? QMC := {σ_CRB : I(C:R|B)_σ = 0}.
(Ibinson, Linden, Winter '06; Christandl, Schuch, Winter '11) NO! If ρ_CRB is a QMC, then ρ_CR is separable; but there is an extension ρ_CRB of the d x d antisymmetric state ρ_CR with small I(C:R|B) (vanishing as d grows), while ρ_CR stays a constant trace distance from any separable state.

22 One Stronger Subadditivity
(B., Christandl, Yard '10; Li, Winter '12; B., Harrow, Lee, Peres '13)
I(C:R|B) ≥ c · min_{σ_CR separable} ||ρ_CR - σ_CR||²_{1-LOCC}, with the distance measured by one-way LOCC-restricted measurements.
Applications: quasipolynomial-time algorithm for testing separability, faithfulness of squashed entanglement, SoS hierarchy, etc.
How about the distance to QMC? Open question: I(C:R|B) ≥ c · min_{σ ∈ QMC} ||ρ_CRB - σ||²_{1-LOCC}?

24 Approximate Reconstruction
"Small CMI" and "being close to a QMC state" are not equivalent (in a dimension-independent way, for trace norm or fidelity).
Classically: I(X:Y|Z) = min_{q ∈ MC} D(p||q).
Conjecture (I. Kim, A. Kitaev, L. Zhang, …): I(C:R|B) ≥ min_{σ ∈ QMC} D(ρ_CRB||σ), or just ≥ c · min_{σ ∈ QMC} D(ρ_CRB||σ).
Conjecture (A. Kitaev; I. Kim; L. Zhang; Berta, Seshadreesan, Wilde, …): I(C:R|B) ≥ min_{Λ : B -> RB} D(ρ_CRB||Λ(ρ_CB)), or the analogous bound with -2 log F(ρ_CRB, Λ(ρ_CB)).

25 Fawzi-Renner Inequality
thm (Fawzi, Renner '14): there is a channel Λ : B -> RB s.t. I(C:R|B) ≥ -2 log F(ρ_CRB, Λ(ρ_CB)).
Fidelity: F(ρ,σ) := ||ρ^{1/2} σ^{1/2}||_1. ½-Rényi relative entropy: D_{1/2}(ρ||σ) := -2 log F(ρ,σ).
We have D_{1/2}(ρ||σ) ≤ D(ρ||σ), so this is weaker than the classical inequality.
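One concrete candidate recovery map is the Petz "transpose channel" Λ(X) = ρ_RB^{1/2} (I_R ⊗ ρ_B^{-1/2} X ρ_B^{-1/2}) ρ_RB^{1/2}. Whether it achieves the Fawzi-Renner bound in general is open, but for exact quantum Markov states it recovers ρ_CRB from ρ_CB perfectly, which the following numpy sketch checks (all helper names mine):

```python
import numpy as np

def partial_trace(rho, dims, keep):
    n = len(dims)
    t = rho.reshape(dims + dims)
    cur = n
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=i, axis2=i + cur)
        cur -= 1
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

def mpow(a, p):
    """Matrix power on the support of a Hermitian PSD matrix."""
    w, v = np.linalg.eigh(a)
    wp = np.where(w > 1e-12, np.clip(w, 1e-300, None) ** p, 0.0)
    return (v * wp) @ v.conj().T

def fidelity(rho, sigma):
    """F(rho, sigma) = || sqrt(rho) sqrt(sigma) ||_1."""
    s = mpow(rho, 0.5)
    w = np.linalg.eigvalsh(s @ sigma @ s)
    return float(np.sum(np.sqrt(np.clip(w, 0, None))))

def petz_recovery(rho_RB, rho_B, dR):
    """Transpose channel B -> RB: X |-> rho_RB^{1/2} (I_R (x) rho_B^{-1/2} X rho_B^{-1/2}) rho_RB^{1/2}."""
    left = mpow(rho_RB, 0.5)
    binv = mpow(rho_B, -0.5)
    def channel(x):
        return left @ np.kron(np.eye(dR), binv @ x @ binv) @ left
    return channel

def apply_on_B(rho_CB, dC, dB, channel, dout):
    """Apply id_C (x) channel blockwise to a state on C (x) B."""
    blocks = rho_CB.reshape(dC, dB, dC, dB)
    out = np.zeros((dC, dout, dC, dout), dtype=complex)
    for i in range(dC):
        for j in range(dC):
            out[i, :, j, :] = channel(blocks[i, :, j, :])
    return out.reshape(dC * dout, dC * dout)

def random_state(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    s = g @ g.conj().T
    return s / np.trace(s).real

# Exact quantum Markov state rho_CRB = sum_z p_z rho_C^z (x) rho_R^z (x) |z><z|_B.
rng = np.random.default_rng(3)
dims = [2, 2, 2]   # C, R, B
p = np.array([0.4, 0.6])
rho = np.zeros((8, 8), dtype=complex)
for z in range(2):
    zz = np.zeros((2, 2)); zz[z, z] = 1.0
    rho += p[z] * np.kron(np.kron(random_state(2, rng), random_state(2, rng)), zz)

rho_RB = partial_trace(rho, dims, [1, 2])
rho_B = partial_trace(rho, dims, [2])
rho_CB = partial_trace(rho, dims, [0, 2])

rec = apply_on_B(rho_CB, dC=2, dB=2,
                 channel=petz_recovery(rho_RB, rho_B, dR=2), dout=4)
assert fidelity(rho, rec) > 1 - 1e-8   # exact recovery: F = 1
```

For non-Markov states the same code gives F < 1, with -2 log F a candidate lower-bound quantity for I(C:R|B).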

28 Strengthening of Fawzi-Renner
thm (B., Harrow, Oppenheim, Strelchuk '14): I(C:R|B) ≥ min_{Λ : B -> RB} D_M(ρ_CRB||Λ(ρ_CB)), with D_M the measured relative entropy (and D_M ≥ -2 log F).
For classical ρ_BCR, we recover the original bound on CMI: I(X:Y|Z) ≥ min_{q ∈ MC} D(p||q).

29 Strengthening of Fawzi-Renner
thm (B., Harrow, Oppenheim, Strelchuk '14): I(C:R|B) ≥ min_{Λ : B -> RB} D_M(ρ_CRB||Λ(ρ_CB)).
For some states there is a big gap between the measured-relative-entropy bound and the fidelity bound -2 log F.

32 Strengthening of Fawzi-Renner
thm (B., Harrow, Oppenheim, Strelchuk '14): I(C:R|B) ≥ min_{Λ : B -> RB} D_M(ρ_CRB||Λ(ρ_CB)).
Proof ingredients: properties of the relative entropy (Piani '09) and the state redistribution protocol (Devetak, Yard '06).

33 Comparison of Lower Bounds on CMI
Two routes to lower bounds:
1. State Redistribution
2. Hypothesis Testing (via the regularized relative entropy of entanglement)

35 State Redistribution
Alice holds A and C, Bob holds B, R is the reference, and X, Y are pre-shared entanglement registers.
(Devetak, Yard '06) Optimal quantum communication rate: ½ I(C:R|B). I.e., there exist encodings E_n : A^n C^n X^n -> A^n G^n and decodings D_n : B^n G^n Y^n -> B^n C^n s.t. sending G^n (≈ n · ½ I(C:R|B) qubits) transfers C to Bob with error vanishing as n -> ∞.

39 Proof of …
Idea: consider the optimal state redistribution protocol and replace the quantum communication by white noise (the maximally mixed state). Compare the state after encoding with the one in which the message register is maximally mixed; by operator monotonicity of the log, the desired lower bound on I(C:R|B) follows.

40 Multiplicativity of the Fidelity of Recovery
(Berta, Tomamichel '15) Let F_R(ρ_CRB) := max_{Λ : B -> RB} F(ρ_CRB, Λ(ρ_CB)). Then F_R is multiplicative: F_R(ρ ⊗ σ) = F_R(ρ) F_R(σ).
Can get the Fawzi-Renner bound as follows: apply the measured-relative-entropy bound to ρ^{⊗n}, use additivity of CMI and multiplicativity of F_R, and let n -> ∞.

41 Open questions:
- Improve the bound to relative entropy (like the classical one).
- Prove the bound for the transpose channel.
- Prove the Li-Winter conjecture: for every pair of states ρ, σ and channel Λ there is a channel Γ s.t. Γ(Λ(σ)) = σ and D(ρ||σ) - D(Λ(ρ)||Λ(σ)) ≥ -2 log F(ρ, Γ(Λ(ρ))); see (Berta, Lemm, Wilde '14) for partial progress.
- Find (more) applications of the FR inequality!
- Find more improvements of SSA; see (Kim '12) for another.
- (Dis)prove: …

