
Quantum Information Theory Patrick Hayden (McGill) 4 August 2005, Canadian Quantum Information Summer School.


1 Quantum Information Theory Patrick Hayden (McGill) 4 August 2005, Canadian Quantum Information Summer School

2 Overview
- Part I:
  - What is information theory? Entropy, compression, noisy coding and beyond
  - What does it have to do with quantum mechanics? Noise in the quantum mechanical formalism: density operators, the partial trace, quantum operations
  - Some quantum information theory highlights
- Part II:
  - Resource inequalities
  - A skeleton key

3 Information (Shannon) theory
- A practical question: how to best make use of a given communications resource?
- A mathematico-epistemological question: how to quantify uncertainty and information?
- Shannon: solved the first by considering the second. A mathematical theory of communication [1948]

4 Quantifying uncertainty
- Entropy: H(X) = -Σ_x p(x) log_2 p(x)
- Proportional to the entropy of statistical physics
- Term suggested by von Neumann (more on him later)
- Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X and Y, etc.
- Operational point of view…
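The definition and the additivity axiom above are easy to check numerically. A minimal sketch in Python (the function names are ours, not the talk's):

```python
# Minimal numeric check of the entropy definition and its additivity for
# independent variables; function names are illustrative.
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum_x p(x) log2 p(x), with the convention 0 log 0 = 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

h_coin = shannon_entropy([0.5, 0.5])           # fair coin: 1 bit

# H(X,Y) = H(X) + H(Y) when X and Y are independent.
px = [0.7, 0.3]
py = [0.2, 0.5, 0.3]
joint = [p * q for p in px for q in py]        # product distribution
h_joint = shannon_entropy(joint)
h_sum = shannon_entropy(px) + shannon_entropy(py)
```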

5 Compression
Source of independent copies of X: X_1, X_2, …, X_n.
{0,1}^n: 2^n possible strings, of which ~2^{nH(X)} are typical.
If X is binary: 0000100111010100010101100101, with about nP(X=0) 0's and nP(X=1) 1's.
Can compress n copies of X to a binary string of length ~nH(X).

6 Typicality in more detail
- Let x^n = x_1, x_2, …, x_n with each x_j ∈ X
- We say that x^n is ε-typical with respect to p(x) if:
  - for all a ∈ X with p(a) > 0, |N(a|x^n)/n − p(a)| < ε/|X|, and
  - for all a ∈ X with p(a) = 0, N(a|x^n) = 0.
- For ε > 0, the probability that a random string X^n is ε-typical goes to 1.
- If x^n is ε-typical, then 2^{−n[H(X)+ε]} ≤ p(x^n) ≤ 2^{−n[H(X)−ε]}
- The number of ε-typical strings is bounded above by 2^{n[H(X)+ε]}
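For a binary source the typicality conditions can be checked by brute force. A sketch with arbitrary illustrative parameters (n = 20, p = 0.3, ε = 0.1, none of which come from the talk):

```python
# Numeric check of the typicality bounds for a binary source.
from math import comb, log2

n, p, eps = 20, 0.3, 0.1
H = -(p * log2(p) + (1 - p) * log2(1 - p))   # H(X) for Pr[X=1] = p

def is_typical(k):
    """A string with k ones is eps-typical iff |k/n - p| < eps/|X| = eps/2.
    For a binary alphabet the condition on zeros is then automatic."""
    return abs(k / n - p) < eps / 2

# Number of eps-typical strings vs. the upper bound 2^{n[H+eps]}.
typical_count = sum(comb(n, k) for k in range(n + 1) if is_typical(k))
upper_bound = 2 ** (n * (H + eps))

# Probability that a random string X^n is eps-typical (tends to 1 as n grows).
typical_prob = sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
                   for k in range(n + 1) if is_typical(k))
```

For these parameters only strings with exactly six 1's qualify, and their individual probabilities fall inside the stated two-sided bound.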

7 Quantifying information
Information is that which reduces uncertainty.
- H(X|Y): uncertainty in X when the value of Y is known
- H(X|Y) = H(X,Y) − H(Y) = E_Y H(X|Y=y)
- I(X;Y) = H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y)
(The slide draws these as overlapping regions of H(X) and H(Y) inside H(X,Y), with I(X;Y) the overlap and H(Y|X) the part of H(Y) outside it.)
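The identities above can be verified on any small joint distribution. A sketch (the distribution is an arbitrary example, not from the talk):

```python
# Check H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y) and
# I(X;Y) = H(X) + H(Y) - H(X,Y) on a small joint distribution.
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {x: sum(v for (a, _), v in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in pxy.items() if b == y) for y in (0, 1)}

h_x, h_y, h_xy = H(px.values()), H(py.values()), H(pxy.values())
cond = h_xy - h_y                              # H(X|Y)
mi = h_x - cond                                # I(X;Y)

# E_Y H(X|Y=y): average the entropies of the conditional distributions.
cond_avg = sum(py[y] * H([pxy[(x, y)] / py[y] for x in (0, 1)])
               for y in (0, 1))
```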

8 Sending information through noisy channels
Statistical model of a noisy channel: p(y|x). Alice encodes a message m, sends it through the channel, and Bob decodes an estimate m′.
Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through the channel is given by the formula C = max_{p(x)} I(X;Y).

9 Data processing inequality
(Alice, Bob, time.) If Z is obtained from X by any map p(z|x), processing cannot create information: I(X;Y) ≥ I(Z;Y).

10 Optimality in Shannon's theorem
Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through the channel is given by the formula C = max_{p(x)} I(X;Y).
Assume there exists a code with rate R and perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages; encoding produces X^n, the channel outputs Y^n, and decoding yields M′. Then
nR = H(M)                    (M has nR bits of entropy)
   = I(M;M′)                 (perfect decoding: M = M′)
   ≤ I(M;Y^n)                (data processing)
   ≤ I(X^n;Y^n)              (data processing)
   ≤ Σ_{j=1}^n I(X_j;Y_j)    (some fiddling)
   ≤ n · max_{p(x)} I(X;Y)   (term by term)

11 Shannon theory provides
- Practically speaking: a holy grail for error-correcting codes
- Conceptually speaking: an operationally-motivated way of thinking about correlations
- What's missing (for a quantum mechanic)? Features arising from linear structure: entanglement and non-orthogonality

12 Quantum Shannon theory provides
- A general theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits…
- Relies on a major simplifying assumption: computation is free
- And a minor simplifying assumption: noise and data have regular structure

13 Before we get going: some unavoidable formalism
We need quantum generalizations of:
- Probability distributions (density operators)
- Marginal distributions (partial trace)
- Noisy channels (quantum operations)

14 Mixing quantum states: the density operator
Draw |φ_x⟩ with probability p(x), then perform a measurement {|0⟩, |1⟩}. The probability of outcome j is
q_j = Σ_x p(x) |⟨j|φ_x⟩|² = Σ_x p(x) tr[|j⟩⟨j| · |φ_x⟩⟨φ_x|] = tr[|j⟩⟨j| ρ], where ρ = Σ_x p(x) |φ_x⟩⟨φ_x|.
The outcome probability is linear in ρ.

15 Properties of the density operator
- ρ is Hermitian: ρ† = [Σ_x p(x) |φ_x⟩⟨φ_x|]† = Σ_x p(x) [|φ_x⟩⟨φ_x|]† = ρ
- ρ is positive semidefinite: ⟨ψ|ρ|ψ⟩ = Σ_x p(x) ⟨ψ|φ_x⟩⟨φ_x|ψ⟩ = Σ_x p(x) |⟨ψ|φ_x⟩|² ≥ 0
- tr[ρ] = 1: tr[ρ] = Σ_x p(x) tr[|φ_x⟩⟨φ_x|] = Σ_x p(x) = 1
- Ensemble ambiguity: I/2 = ½[|0⟩⟨0| + |1⟩⟨1|] = ½[|+⟩⟨+| + |−⟩⟨−|]
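These properties, and the ensemble ambiguity of I/2, can be confirmed directly for a qubit. A sketch with illustrative helper names:

```python
# Build rho = sum_x p(x)|phi_x><phi_x| for a qubit ensemble and check the
# properties above; helper names are ours, not the talk's.
s = 2 ** -0.5                                   # 1/sqrt(2)

def outer(v):
    """|v><v| as a nested list."""
    return [[a * b.conjugate() for b in v] for a in v]

def mix(ensemble):
    """sum over (p, |phi>) of p |phi><phi|, for qubits."""
    rho = [[0j, 0j], [0j, 0j]]
    for p, v in ensemble:
        op = outer(v)
        for i in range(2):
            for j in range(2):
                rho[i][j] += p * op[i][j]
    return rho

rho = mix([(0.5, [1, 0]), (0.5, [s, s])])       # mix of |0> and |+>
trace = rho[0][0] + rho[1][1]
hermitian = all(abs(rho[i][j] - rho[j][i].conjugate()) < 1e-12
                for i in range(2) for j in range(2))

# Ensemble ambiguity: {|0>,|1>} and {|+>,|->} both give I/2.
rho_comp = mix([(0.5, [1, 0]), (0.5, [0, 1])])
rho_pm = mix([(0.5, [s, s]), (0.5, [s, -s])])
```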

16 The density operator: examples Which of the following are density operators?

17 The partial trace
- Suppose that ρ_AB is a density operator on A⊗B
- Alice measures {M_k} on A; the outcome probability is q_k = tr[(M_k ⊗ I_B) ρ_AB]
- Define ρ_A = tr_B[ρ_AB] = Σ_j ⟨j|_B ρ_AB |j⟩_B
- Then q_k = tr[M_k ρ_A]
- ρ_A describes the outcome statistics for all possible experiments by Alice alone
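The defining property q_k = tr[M_k ρ_A] can be checked on a two-qubit example. A minimal sketch (state and measurement chosen for illustration):

```python
# A minimal partial trace over B for two qubits, checking that
# tr[(M ⊗ I) rho_AB] = tr[M rho_A]; helper names are ours.
s = 2 ** -0.5
psi = [s, 0, 0, s]                    # (|00> + |11>)/sqrt(2), basis 00,01,10,11
rho_ab = [[a * b for b in psi] for a in psi]

def ptrace_b(rho):
    """rho_A[i][j] = sum_k <i k|rho|j k>."""
    return [[sum(rho[2 * i + k][2 * j + k] for k in (0, 1))
             for j in (0, 1)] for i in (0, 1)]

rho_a = ptrace_b(rho_ab)

m = [[1, 0], [0, 0]]                  # Alice measures M = |0><0|
# (M ⊗ I) is diag(1,1,0,0) in this basis, so its trace against rho_AB is:
q_joint = rho_ab[0][0] + rho_ab[1][1]
q_local = sum(m[i][j] * rho_a[j][i] for i in (0, 1) for j in (0, 1))
```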

18 Purification
- Suppose that ρ_A is a density operator on A
- Diagonalize ρ_A = Σ_i λ_i |φ_i⟩⟨φ_i|
- Let |ψ⟩ = Σ_i λ_i^{1/2} |φ_i⟩_A |i⟩_B
- Note that ρ_A = tr_B[|ψ⟩⟨ψ|]
- |ψ⟩ is a purification of ρ_A
- Symmetry: ρ_A and ρ_B have the same non-zero eigenvalues
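Here is the construction carried out for a qubit whose eigenbasis we take, for simplicity, to be the computational basis (the spectrum is an arbitrary example):

```python
# Purify a diagonal qubit density operator and check the claims above.
lam = [0.75, 0.25]                                 # eigenvalues of rho_A
psi = [lam[0] ** 0.5, 0, 0, lam[1] ** 0.5]         # sum_i lam_i^(1/2)|i>_A|i>_B
rho = [[a * b for b in psi] for a in psi]

# Partial traces over B and over A (basis order 00,01,10,11).
rho_a = [[sum(rho[2 * i + k][2 * j + k] for k in (0, 1))
          for j in (0, 1)] for i in (0, 1)]
rho_b = [[sum(rho[2 * k + i][2 * k + j] for k in (0, 1))
          for j in (0, 1)] for i in (0, 1)]
```

Both marginals come out as diag(0.75, 0.25), illustrating the symmetry: ρ_A and ρ_B share their non-zero eigenvalues.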

19 Quantum (noisy) channels: analogs of p(y|x)
What reasonable constraints might such a channel Λ: A → B satisfy?
1) It should take density operators to density operators
2) Convex linearity: a mixture of input states should be mapped to the corresponding mixture of output states (must be interpreted very strictly)
3) Require that (Λ ⊗ I_C)(ρ_AC) always be a density operator too. This doesn't come for free! Let T be the transpose map on A. If |ψ⟩ = |00⟩_AC + |11⟩_AC, then (T ⊗ I_C)(|ψ⟩⟨ψ|) has negative eigenvalues.
All such maps can, in principle, be realized physically. The resulting set of transformations on density operators is known as the trace-preserving, completely positive maps.
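The transpose counterexample on the slide can be verified without diagonalizing anything: it is enough to exhibit one vector with a negative expectation value. A sketch (we use the normalized state and the singlet as witness, choices of ours):

```python
# Partial transpose of |psi><psi| for |psi> proportional to |00> + |11>
# is not positive semidefinite: the singlet has negative expectation.
s = 2 ** -0.5
psi = [s, 0, 0, s]                                 # normalized |00> + |11>
rho = [[a * b for b in psi] for a in psi]

pt = [[0.0] * 4 for _ in range(4)]
for i in (0, 1):
    for k in (0, 1):
        for j in (0, 1):
            for l in (0, 1):
                # (T ⊗ I) transposes only the A indices (i <-> j).
                pt[2 * i + k][2 * j + l] = rho[2 * j + k][2 * i + l]

singlet = [0.0, s, -s, 0.0]                        # (|01> - |10>)/sqrt(2)
exp_val = sum(singlet[r] * pt[r][c] * singlet[c]
              for r in range(4) for c in range(4))  # comes out negative
```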

20 Quantum channels: examples
- Adjoining an ancilla: ρ ↦ ρ ⊗ |0⟩⟨0|
- Unitary transformations: ρ ↦ UρU†
- Partial trace: ρ_AB ↦ tr_B[ρ_AB]
- That's it! All channels can be built out of these three operations.

21 Further examples
- The depolarizing channel: ρ ↦ (1−p)ρ + p I/2
- The dephasing channel: ρ ↦ Σ_j ⟨j|ρ|j⟩ |j⟩⟨j|, equivalent to measuring {|j⟩} and then forgetting the outcome
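Both channels are one-liners on a qubit density matrix. A sketch applying them to |+⟩⟨+| (the noise level p is an arbitrary illustrative choice):

```python
# Apply the two example channels to |+><+|.
p = 0.3
plus = [[0.5, 0.5], [0.5, 0.5]]                    # |+><+|

# Depolarizing: rho -> (1-p) rho + p I/2.
depolarized = [[(1 - p) * plus[i][j] + (p * 0.5 if i == j else 0.0)
                for j in (0, 1)] for i in (0, 1)]

# Dephasing: keep the diagonal in the {|j>} basis, kill the off-diagonals.
dephased = [[plus[i][j] if i == j else 0.0 for j in (0, 1)] for i in (0, 1)]
```

Dephasing sends |+⟩⟨+| all the way to I/2, exactly as "measure then forget" suggests, while depolarizing merely shrinks the off-diagonal terms.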

22 One last thing you should see…
- What happens if a measurement is preceded by a general quantum operation?
- This leads to more general types of measurements: positive operator-valued measures (forevermore POVMs)
- A POVM is a set {M_k} such that M_k ≥ 0 and Σ_k M_k = I
- The probability of outcome k is tr[M_k ρ]

23 POVMs: what are they good for?
Try to distinguish |ψ_0⟩ = |0⟩ and |ψ_1⟩ = |+⟩ = (|0⟩ + |1⟩)/2^{1/2}. The states are non-orthogonal, so projective measurements can't distinguish them perfectly.
Let N = 1/(1 + 1/2^{1/2}). Exercise: M_0 = N|1⟩⟨1|, M_1 = N|−⟩⟨−|, M_2 = I − M_0 − M_1 is a POVM.
Note:
- Outcome 0 implies |ψ_1⟩
- Outcome 1 implies |ψ_0⟩
- Outcome 2 is inconclusive
Instead of imperfect distinguishability all of the time, the POVM provides perfect distinguishability some of the time.
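The exercise can be worked numerically: check that the three operators form a POVM and that outcomes 0 and 1 never fire on the "wrong" state. A sketch (helper names are ours):

```python
# Check that {M0, M1, M2} is a POVM that unambiguously discriminates
# |0> and |+>.
s = 2 ** -0.5
N = 1 / (1 + s)

def outer(v):
    return [[a * b for b in v] for a in v]

m0 = [[N * x for x in row] for row in outer([0, 1])]     # N|1><1|
m1 = [[N * x for x in row] for row in outer([s, -s])]    # N|-><-|
m2 = [[(1 if i == j else 0) - m0[i][j] - m1[i][j]
       for j in (0, 1)] for i in (0, 1)]

rho0 = outer([1, 0])                                     # |0><0|
rho1 = outer([s, s])                                     # |+><+|

def prob(m, rho):
    """tr[M rho]."""
    return sum(m[i][j] * rho[j][i] for i in (0, 1) for j in (0, 1))

# PSD check for the 2x2 Hermitian M2: trace and determinant non-negative.
tr_m2 = m2[0][0] + m2[1][1]
det_m2 = m2[0][0] * m2[1][1] - m2[0][1] * m2[1][0]
```

Outcome 0 has probability zero on |0⟩ and outcome 1 has probability zero on |+⟩, so whenever either fires the identification is certain.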

24 Notions of distinguishability
Basic requirement: quantum channels do not increase "distinguishability".
Fidelity: F(ρ,σ) = max |⟨ψ_ρ|ψ_σ⟩|² (maximized over purifications) = [tr√(ρ^{1/2} σ ρ^{1/2})]²
- F = 0 for perfectly distinguishable states; F = 1 for identical states
- Monotonicity: F(Λ(ρ), Λ(σ)) ≥ F(ρ,σ)
Trace distance: T(ρ,σ) = |ρ − σ|_1 = 2 max |p(k=0|ρ) − p(k=0|σ)|, where the max is over measurements {M_k}
- T = 2 for perfectly distinguishable states; T = 0 for identical states
- Monotonicity: T(ρ,σ) ≥ T(Λ(ρ), Λ(σ))
Statements made today hold for both measures.
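For qubits the trace distance reduces to summing absolute eigenvalues of a 2x2 Hermitian difference, and for pure states the fidelity is just |⟨ψ|φ⟩|². A sketch with our own helper names:

```python
# Trace distance for qubit states via closed-form 2x2 eigenvalues, plus
# pure-state fidelity F = |<psi|phi>|^2.
def eigs2(m):
    """Eigenvalues of the Hermitian matrix [[a, b], [conj(b), d]]."""
    a, b, d = m[0][0], m[0][1], m[1][1]
    mean, rad = (a + d) / 2, (((a - d) / 2) ** 2 + abs(b) ** 2) ** 0.5
    return mean - rad, mean + rad

def trace_distance(rho, sigma):
    """T = |rho - sigma|_1 = sum of absolute eigenvalues of the difference."""
    diff = [[rho[i][j] - sigma[i][j] for j in (0, 1)] for i in (0, 1)]
    return sum(abs(x) for x in eigs2(diff))

s = 2 ** -0.5
rho0 = [[1, 0], [0, 0]]                # |0><0|
rho1 = [[0, 0], [0, 1]]                # |1><1|
plus = [[0.5, 0.5], [0.5, 0.5]]        # |+><+|

t_orth = trace_distance(rho0, rho1)    # perfectly distinguishable
t_same = trace_distance(plus, plus)    # identical
t_mid = trace_distance(rho0, plus)     # strictly in between
f_mid = abs(1 * s + 0 * s) ** 2        # F(|0>, |+>) = |<0|+>|^2
```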

25 Back to information theory!

26 Quantifying uncertainty
- Let ρ = Σ_x p(x) |φ_x⟩⟨φ_x| be a density operator
- von Neumann entropy: H(ρ) = −tr[ρ log ρ]
- Equal to the Shannon entropy of the eigenvalues of ρ
- Analog of a joint random variable: ρ_AB describes a composite system A ⊗ B
- H(A)_ρ = H(ρ_A) = H(tr_B ρ_AB)

27 Quantifying uncertainty: examples
- H(|ψ⟩⟨ψ|) = 0
- H(I/2) = 1
- H(ρ ⊗ σ) = H(ρ) + H(σ)
- H(I/2^n) = n
- H(p ρ ⊕ (1−p) σ) = H(p, 1−p) + p H(ρ) + (1−p) H(σ)
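Since H(ρ) is the Shannon entropy of the eigenvalues, all of these examples can be checked on spectra alone. A sketch with arbitrary illustrative spectra:

```python
# Check the entropy examples via eigenvalue lists.
from math import log2

def H(eigs):
    return -sum(x * log2(x) for x in eigs if x > 0)

h_pure = H([1.0])                        # H(|psi><psi|) = 0
h_mixed = H([0.5, 0.5])                  # H(I/2) = 1
h_max = H([2 ** -5] * 2 ** 5)            # H(I/2^n) = n, here n = 5

# H(rho ⊗ sigma) = H(rho) + H(sigma): eigenvalues of a product multiply.
rho_eigs, sigma_eigs = [0.75, 0.25], [0.6, 0.4]
prod_eigs = [a * b for a in rho_eigs for b in sigma_eigs]
additive = abs(H(prod_eigs) - (H(rho_eigs) + H(sigma_eigs))) < 1e-12

# H(p rho ⊕ (1-p) sigma) = H(p,1-p) + p H(rho) + (1-p) H(sigma).
p = 0.3
lhs = H([p * x for x in rho_eigs] + [(1 - p) * x for x in sigma_eigs])
rhs = H([p, 1 - p]) + p * H(rho_eigs) + (1 - p) * H(sigma_eigs)
```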

28 Compression [Schumacher, Petz]
Source of independent copies of ρ_AB: on B^{⊗n}, dim(effective supp of ρ_B^{⊗n}) ~ 2^{nH(B)} (aka the typical subspace).
Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A.
No statistical assumptions: just quantum mechanics!

29 The typical subspace
- Diagonalize ρ = Σ_x p(x) |e_x⟩⟨e_x|
- Then ρ^{⊗n} = Σ_{x^n} p(x^n) |e_{x^n}⟩⟨e_{x^n}|
- The ε-typical projector Π_t is the projector onto the span of the |e_{x^n}⟩ such that x^n is typical
- tr[ρ^{⊗n} Π_t] → 1 as n → ∞

30 Quantifying information
Uncertainty in A when the value of B is known? H(A|B) = H(AB) − H(B).
For |ψ⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/2^{1/2}: the state is pure, so H(AB) = 0, while ρ_B = I/2 gives H(B) = 1. Hence H(A|B)_ψ = 0 − 1 = −1.
Conditional entropy can be negative!
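The two entropies in this calculation can be computed directly from the maximally entangled state:

```python
# H(A|B) for the maximally entangled state: pure overall (H(AB) = 0) but
# maximally mixed on B (H(B) = 1), so H(A|B) = -1.
from math import log2

s = 2 ** -0.5
psi = [s, 0, 0, s]                       # (|00> + |11>)/sqrt(2)
rho = [[a * b for b in psi] for a in psi]

# tr[rho^2] = 1 confirms the state is pure, hence H(AB) = 0.
purity = sum(rho[i][j] * rho[j][i] for i in range(4) for j in range(4))
h_ab = 0.0

# rho_B = tr_A rho is diagonal here, so its entropy is elementary.
rho_b = [[sum(rho[2 * k + i][2 * k + j] for k in (0, 1))
          for j in (0, 1)] for i in (0, 1)]
h_b = -sum(rho_b[i][i] * log2(rho_b[i][i]) for i in (0, 1))

h_a_given_b = h_ab - h_b                 # = -1
```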

31 Quantifying information
Information is that which reduces uncertainty.
- H(A|B) = H(AB) − H(B): uncertainty in A when the value of B is known
- I(A;B) = H(A) − H(A|B) = H(A) + H(B) − H(AB) ≥ 0

32 Sending classical information through noisy channels
Physical model of a noisy channel: a trace-preserving, completely positive map Λ. Alice encodes message m into a state, sends it through Λ, and Bob decodes m′ with a measurement.
HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through Λ is given by the (regularization of the) formula
C(Λ) = max_{{p(x), ρ_x}} I(X;B)_σ, where σ_XB = Σ_x p(x) |x⟩⟨x|_X ⊗ Λ(ρ_x).

33 Sending classical information through noisy channels
Intuition: over n uses, the outputs live in B^{⊗n}, whose typical subspace has dimension ~2^{nH(B)}; the output states corresponding to a given input sequence X_1, X_2, …, X_n occupy only ~2^{nH(B|A)} of those dimensions.

34 Sending classical information through noisy channels
Within the ~2^{nH(B)}-dimensional typical subspace of B^{⊗n}, each codeword X_1, X_2, …, X_n occupies ~2^{nH(B|A)} dimensions. Distinguish the codeword outputs using a well-chosen POVM.

35 Data processing inequality (strong subadditivity)
(Alice, Bob, time.) If Bob processes his share with any channel (an isometry U followed by discarding a subsystem), mutual information cannot increase: I(A;B)_ρ ≥ I(A;B)_σ, where σ is the state after the processing.

36 Optimality in the HSW theorem
Assume there exists a code with rate R and perfect decoding: m is encoded into a state, sent through the channel, and decoded to m′ by a measurement. Let M be the random variable corresponding to the uniform distribution over messages. Then
nR = H(M)        (M has nR bits of entropy)
   = I(M;M′)     (perfect decoding: M = M′)
   ≤ I(A;B)_σ    (data processing)
where σ is the joint state of the message register and the channel outputs.

37 Sending quantum information through noisy channels
Physical model of a noisy channel: a trace-preserving, completely positive map Λ. Alice encodes |ψ⟩ ∈ C^d with a TPCP map, sends it through Λ, and Bob decodes with a TPCP map.
LSD noisy coding theorem: in the limit of many uses, the optimal rate (1/n) log d at which Alice can reliably send qubits to Bob through Λ is given by the (regularization of the) formula
Q(Λ) = max_ρ [H(B)_σ − H(AB)_σ] = max_ρ [−H(A|B)_σ], where σ_AB = (I ⊗ Λ)(|φ_ρ⟩⟨φ_ρ|) for a purification |φ_ρ⟩ of ρ. A conditional entropy!

38 Entanglement and privacy: more than an analogy
How to send a private message from Alice to Bob? [AC93]
The wiretap channel p(y,z|x): Alice sends x = x_1 x_2 … x_n, Bob receives y = y_1 y_2 … y_n, and the eavesdropper receives z = z_1 z_2 … z_n.
From all x, choose 2^{n(I(X;Y)−ε)} random codewords, grouped into sets of size 2^{n(I(X;Z)+ε)}.
Can send private messages at rate I(X;Y) − I(X;Z).

39 Entanglement and privacy: more than an analogy
How to send a private message from Alice to Bob? [D03]
Quantum version: the channel is an isometry U_{A′→BE}^{⊗n}, taking |φ_x⟩_{A′} to |φ_x⟩_{BE} = U^{⊗n}|φ_x⟩.
From all x, choose 2^{n(I(X:A)−ε)} random signal states, grouped into sets of size 2^{n(I(X:E)+ε)}.
Can send private messages at rate I(X:A) − I(X:E).

40 Entanglement and privacy: more than an analogy
How to send a private message from Alice to Bob? [SW97, D03]
Coherent version: apply U_{A′→BE}^{⊗n} to Σ_x p_x^{1/2} |x⟩_A |φ_x⟩_{A′}, producing Σ_x p_x^{1/2} |x⟩_A |φ_x⟩_{BE}.
With 2^{n(I(X:A)−ε)} random signals in sets of size 2^{n(I(X:E)+ε)}:
Can send private messages at rate I(X:A) − I(X:E) = H(A) − H(E), using H(E) = H(AB).

41 Conclusions: Part I
- Information theory can be generalized to analyze quantum information processing
- The result is a rich theory of surprising conceptual simplicity
- An operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation

42 Some references
Part I, standard textbooks:
- Cover & Thomas, Elements of Information Theory.
- Nielsen & Chuang, Quantum Computation and Quantum Information. (and references therein)
- Devetak, The private classical capacity and quantum capacity of a quantum channel, quant-ph/0304127.
Part II, papers available at arxiv.org:
- Devetak, Harrow & Winter, A family of quantum protocols, quant-ph/0308044.
- Horodecki, Oppenheim & Winter, Quantum information can be negative, quant-ph/0505062.


