Quantum Information Theory and Measure Concentration
Patrick Hayden (McGill)
AMS Short Course, Jan 2009
http://www.cs.mcgill.ca/~patrick/ams2009

Overview
- What is information theory?
  - Entropy, compression, noisy coding and beyond
- What does it have to do with quantum mechanics?
  - Noise in the quantum mechanical formalism
  - Density operators, the partial trace, quantum operations
- Classical information through quantum channels
- Entanglement in random subspaces
- Quantum information through quantum channels

Information (Shannon) theory
- A practical question: how best to make use of a given communications resource?
- A mathematico-epistemological question: how to quantify uncertainty and information?
- Shannon solved the first by considering the second: "A Mathematical Theory of Communication" [1948]

Quantifying uncertainty
- Entropy: H(X) = -Σ_x p(x) log₂ p(x)
- Proportional to the entropy of statistical physics
- Term suggested by von Neumann (more on him later)
- Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc.
- Operational point of view:

Compression
- Source of independent copies of X: X₁, X₂, ..., Xₙ
- Of the 2ⁿ possible strings in {0,1}ⁿ, only ~2^{nH(X)} are typical
- Can compress n copies of X to a binary string of length ~nH(X)
- If X is binary, a typical string such as 0000100111010100010101100101 has about n·p(X=0) zeros and n·p(X=1) ones

Typicality in more detail
- Let xⁿ = x₁,x₂,...,xₙ with each x_j ∈ X
- We say that xⁿ is ε-typical with respect to p(x) if:
  - for all a ∈ X with p(a) > 0, |N(a|xⁿ)/n − p(a)| < ε/|X|
  - for all a ∈ X with p(a) = 0, N(a|xⁿ) = 0
- For any ε > 0, the probability that a random string Xⁿ is ε-typical goes to 1 as n → ∞
- If xⁿ is ε-typical, then 2^{-n[H(X)+ε]} ≤ p(xⁿ) ≤ 2^{-n[H(X)-ε]}
- The number of ε-typical strings is bounded above by 2^{n[H(X)+ε]} (see the sketch below)
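To make the typicality definition concrete, here is a minimal Python sketch (my own illustration, not from the slides; the helper names entropy and is_typical are mine). It checks ε-typicality for strings drawn from a binary source and confirms that most long strings are typical:

import numpy as np
from collections import Counter

def entropy(p):
    """Shannon entropy in bits: H(X) = -sum_x p(x) log2 p(x)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def is_typical(xn, p, eps):
    """True iff the string xn is eps-typical w.r.t. p (alphabet = range(len(p)))."""
    n, counts = len(xn), Counter(xn)
    for a, pa in enumerate(p):
        if pa == 0 and counts[a] > 0:
            return False
        if pa > 0 and abs(counts[a] / n - pa) >= eps / len(p):
            return False
    return True

# Empirically, the fraction of source strings that are typical approaches 1:
rng = np.random.default_rng(0)
p = [0.9, 0.1]
samples = [rng.choice(2, size=1000, p=p) for _ in range(200)]
print(sum(is_typical(s, p, 0.05) for s in samples) / 200)  # close to 1.0
print(entropy(p))  # ~0.469 bits: compress ~1000 symbols into ~469 bits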
Quantifying information
- (Venn-style diagram: H(X,Y) decomposes into H(X|Y), I(X;Y), and H(Y|X))
- Uncertainty in X when the value of Y is known: H(X|Y) = H(X,Y) − H(Y) = E_Y H(X|Y=y)
- Information is that which reduces uncertainty: I(X;Y) = H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y)

Sending information
- Statistical model of a noisy channel N: p(y|x)
- Shannon's noisy coding theorem: in the limit of many uses, the optimal rate (in bits per channel use) at which Alice can send messages reliably to Bob through N is given by the formula C(N) = max_{p(x)} I(X;Y) (see the numerical check below)

Data processing inequality
- Alice holds X, Bob holds a correlated Y
- If Z is produced from X alone via some p(z|x), then I(X;Y) ≥ I(Z;Y)
- Local processing can only destroy correlation

Optimality in Shannon's theorem
- Assume there exists a code of rate R with perfect decoding over n channel uses Xⁿ → Yⁿ
- Let M be the random variable corresponding to the uniform distribution over messages, so M has nR bits of entropy
    nR = H(M) = I(M;M')              [perfect decoding: M = M']
       ≤ I(M;Yⁿ) ≤ I(Xⁿ;Yⁿ)          [data processing]
       ≤ Σ_{j=1}^n I(X_j;Y_j)         [term by term, some fiddling]
       ≤ n max_{p(x)} I(X;Y)

Shannon theory provides
- Practically speaking: a holy grail for error-correcting codes
- Conceptually speaking: an operationally motivated way of thinking about correlations
- What's missing (for a quantum mechanic)? Features arising from linear structure: entanglement and non-orthogonality

Quantum Shannon theory provides
- A quantitative and operational theory of quantum correlation: qubits, cbits, ebits, cobits, sbits
- Relies on a major simplifying assumption: computation is free
- And a minor simplifying assumption: noise and data have regular structure
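As a numerical sanity check on Shannon's formula (my own sketch, assuming numpy; the helper names are mine), the capacity max_{p(x)} I(X;Y) of the binary symmetric channel with flip probability f should come out to 1 − H(f), achieved at the uniform input:

import numpy as np

def H(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(px, W):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with channel matrix W[x][y] = p(y|x)."""
    joint = px[:, None] * W                 # joint distribution p(x,y)
    return H(px) + H(joint.sum(axis=0)) - H(joint.ravel())

f = 0.11
W = np.array([[1 - f, f], [f, 1 - f]])      # binary symmetric channel
best = max(mutual_information(np.array([q, 1 - q]), W)
           for q in np.linspace(0.01, 0.99, 99))
print(best, 1 - H([f, 1 - f]))              # both ~0.5 bits per channel use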
Superdense coding
- To send i ∈ {0,1,2,3}: Alice applies σ_i to her half of a shared ebit, then sends her qubit to Bob, who measures both qubits
- 1 ebit + 1 qubit ≥ 2 cbits

Before we get going: some unavoidable formalism
- We need quantum generalizations of:
  - probability distributions (density operators)
  - marginal distributions (partial trace)
  - noisy channels (quantum operations)

Mixing quantum states: the density operator
- Draw |ψ_x⟩ with probability p(x), then perform a measurement {|0⟩, |1⟩}
- Probability of outcome j:
    q_j = Σ_x p(x) |⟨j|ψ_x⟩|² = Σ_x p(x) tr[|j⟩⟨j| |ψ_x⟩⟨ψ_x|] = tr[|j⟩⟨j| ρ],
  where ρ = Σ_x p(x) |ψ_x⟩⟨ψ_x|
- The outcome probability is linear in ρ

The partial trace
- Suppose that ρ_AB is a density operator on A ⊗ B and Alice measures {M_k} on A
- The outcome probability is q_k = tr[(M_k ⊗ I_B) ρ_AB]
- Define ρ_A = tr_B[ρ_AB] = Σ_j ⟨j|_B ρ_AB |j⟩_B; then q_k = tr[M_k ρ_A]
- ρ_A describes the outcome statistics of all possible experiments by Alice alone (see the sketch below)

Purification
- Suppose that ρ_A is a density operator on A; diagonalize ρ_A = Σ_i λ_i |φ_i⟩⟨φ_i|
- Let |ψ⟩ = Σ_i λ_i^{1/2} |φ_i⟩_A |i⟩_B; note that ρ_A = tr_B[|ψ⟩⟨ψ|]
- |ψ⟩ is a purification of ρ_A
- Symmetry: ρ_A and ρ_B have the same non-zero eigenvalues

Quantum (noisy) channels: analogs of p(y|x)
- What reasonable constraints might such a channel N: A → B satisfy?
  1) Take density operators to density operators
  2) Convex linearity: a mixture of input states should be mapped to the corresponding mixture of output states
- Condition 1) must be interpreted very strictly: require that (N ⊗ I_C)(ρ_AC) always be a density operator too
- This doesn't come for free! Let T be the transpose map on A. If |ψ⟩ = (|00⟩_AC + |11⟩_AC)/√2, then (T ⊗ I_C)(|ψ⟩⟨ψ|) has negative eigenvalues
- The resulting set of transformations on density operators are known as trace-preserving, completely positive (TPCP) maps
- Surprising fact: all such maps can, in principle, be realized physically
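A small numpy sketch (mine, not the speaker's) of two points above: the partial trace as the quantum marginal, and the transpose-map example showing that positivity without complete positivity fails on entangled inputs:

import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """rho_A = tr_B rho_AB = sum_j <j|_B rho_AB |j>_B."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# Maximally entangled |psi> = (|00> + |11>)/sqrt(2) on A (x) C
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(partial_trace_B(rho, 2, 2))        # I/2: the marginal is maximally mixed

# Transpose on A alone: (T (x) I_C)(rho) swaps the two A indices
rho4 = rho.reshape(2, 2, 2, 2)           # indices (a, c, a', c')
T_rho = rho4.transpose(2, 1, 0, 3).reshape(4, 4)
print(np.linalg.eigvalsh(T_rho))         # includes -0.5: T is positive but not completely positive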
Quantum channels: examples
- Adjoining an ancilla: ρ → ρ ⊗ |0⟩⟨0|
- Unitary transformations: ρ → UρU†
- Partial trace: ρ_AB → tr_B[ρ_AB]
- That's it! All channels can be built out of these operations (Stinespring dilation): adjoin |0⟩⟨0|, apply a unitary U, trace out the environment
- Operator-sum representation: N(ρ) = Σ_k A_k ρ A_k† with Σ_k A_k† A_k = I

Further examples
- The depolarizing channel: ρ → (1−p)ρ + p I/2
- The dephasing channel: ρ → Σ_j |j⟩⟨j| ρ |j⟩⟨j|
  - Equivalent to measuring {|j⟩} and then forgetting the outcome

Notions of distinguishability
- Basic requirement: quantum channels do not increase distinguishability (see the check below)
- Fidelity: F(ρ,σ) = [tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]]²
  - F = 0 for perfectly distinguishable states; F = 1 for identical states
  - F(ρ,σ) = max |⟨φ_ρ|φ_σ⟩|², maximizing over purifications
  - F(N(ρ), N(σ)) ≥ F(ρ,σ)
- Trace distance: T(ρ,σ) = ‖ρ−σ‖₁
  - T = 2 for perfectly distinguishable states; T = 0 for identical states
  - T(ρ,σ) = 2 max |p(k=0|ρ) − p(k=0|σ)|, maximizing over measurements {M_k}
  - T(ρ,σ) ≥ T(N(ρ), N(σ))
- The statements made today hold for both measures

Back to information theory: quantifying uncertainty
- Let ρ = Σ_x p(x) |ψ_x⟩⟨ψ_x| be a density operator
- von Neumann entropy: H(ρ) = −tr[ρ log ρ]
- Equal to the Shannon entropy of the eigenvalues of ρ
- Analog of a joint random variable: ρ_AB describes a composite system A ⊗ B, and H(A)_ρ = H(ρ_A) = H(tr_B ρ_AB)

Quantifying uncertainty: examples
- H(|ψ⟩⟨ψ|) = 0
- H(I/2) = 1
- H(ρ ⊗ σ) = H(ρ) + H(σ)
- H(I/2ⁿ) = n
- H(pρ ⊕ (1−p)σ) = H(p,1−p) + pH(ρ) + (1−p)H(σ)
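The examples above can be checked directly. This sketch (my own; assumes numpy) implements the depolarizing channel in operator-sum form, verifies that it contracts the trace distance, and computes a von Neumann entropy from eigenvalues:

import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)

def depolarize(rho, p):
    """(1-p) rho + p I/2 via Kraus operators A_k with sum A_k^dag A_k = I."""
    kraus = [np.sqrt(1 - 3 * p / 4) * I2] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]
    return sum(A @ rho @ A.conj().T for A in kraus)

def trace_distance(rho, sigma):
    """T(rho, sigma) = trace norm of rho - sigma."""
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def vn_entropy(rho):
    """H(rho) = -tr[rho log rho] = Shannon entropy of the eigenvalues."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

rho, sigma = np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)
print(trace_distance(rho, sigma))                                     # 2.0: perfectly distinguishable
print(trace_distance(depolarize(rho, 0.5), depolarize(sigma, 0.5)))   # 1.0: the channel contracts T
print(vn_entropy(depolarize(rho, 0.5)))                               # H(3/4, 1/4) ~ 0.811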
Compression [Schumacher, Petz]
- Source of independent copies of ρ_AB: the effective support of ρ_B^⊗n has dimension ~2^{nH(B)} (aka the typical subspace)
- Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A
- No statistical assumptions: just quantum mechanics!

The typical subspace
- Diagonalize ρ = Σ_x p(x) |e_x⟩⟨e_x|; then ρ^⊗n = Σ_{xⁿ} p(xⁿ) |e_{xⁿ}⟩⟨e_{xⁿ}|
- The ε-typical projector Π_t is the projector onto the span of those |e_{xⁿ}⟩ such that xⁿ is typical with respect to p(·)
- tr[ρ^⊗n Π_t] → 1 as n → ∞

Quantifying information
- (Venn-style diagram: H(AB)_ρ decomposes into H(A|B)_ρ, I(A;B)_ρ, and H(B|A)_ρ)
- Uncertainty in A when the value of B is known? H(A|B) = H(AB) − H(B)
- Example: |Φ⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2 has Φ_B = I/2, so H(A|B)_Φ = 0 − 1 = −1
- Conditional entropy can be negative! (see the numerical check below)
- Information is that which reduces uncertainty: I(A;B) = H(A) − H(A|B) = H(A) + H(B) − H(AB) ≥ 0

Sending classical information
- Physical model of a noisy channel: a trace-preserving, completely positive map
- Encode the message m as a state, send it through N, decode with a measurement
- HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the (regularization of the) formula χ(N) = max_{{p(x), ρ_x}} I(X;B)
- Codewords X₁, X₂, ..., Xₙ live inside the ~2^{nH(B)}-dimensional typical subspace of B^⊗n
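A quick numerical confirmation (my own illustration, not from the slides) that the conditional entropy goes negative on the maximally entangled state, exactly as computed above:

import numpy as np

def vn_entropy(rho):
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)                        # (|00> + |11>)/sqrt(2)
rho_AB = np.outer(phi, phi)
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)   # tr_A
print(vn_entropy(rho_AB) - vn_entropy(rho_B))                    # H(A|B) = 0 - 1 = -1.0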
Sending classical information (continued)
- Distinguish the codewords X₁, X₂, ..., Xₙ inside the ~2^{nH(B)}-dimensional typical subspace of B^⊗n using a well-chosen measurement

Data processing inequality (strong subadditivity)
- Alice and Bob share ρ_AB; Bob applies a local channel (a unitary U followed by discarding part of his system), producing σ_AB
- Then I(A;B)_ρ ≥ I(A;B)_σ
- Exercise: show that data processing implies H(A|B)_ρ ≥ H(A|BC)_ρ for any ρ

Optimality in the HSW theorem
- Assume there exists a code of rate R with perfect decoding; let M be the random variable corresponding to the uniform distribution over messages, so M has nR bits of entropy
    nR = H(M) = I(M;M')   [perfect decoding: M = M']
       ≤ I(A;B)           [data processing]

The additivity conjecture: the limit isn't necessary
- Studied by Holevo, Datta, Fukuda, King, Ruskai, Schumacher, Shirokov, Shor, Werner, ...
- Counterexample found by Hastings in 2008

Why did they care so much?
- Operational interpretation: additivity would mean Alice doesn't need to entangle her inputs across multiple uses of the channel
- Codewords would look like |φ_{x₁}⟩ ⊗ |φ_{x₂}⟩ ⊗ ... ⊗ |φ_{xₙ}⟩
- Hastings' counterexample is based on the existence of highly entangled subspaces

Surprises in high dimension
- Choose a random pure quantum state |ψ⟩ ∈_R C^{d_A} ⊗ C^{d_B} (d_A ≤ d_B)
- What can we expect of |ψ⟩?
- On average, such states are highly entangled [Lubkin; Lloyd; Page; Foong & Kanno; Sánchez-Ruiz; Sen] (see the sketch below)
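To see the high-dimensional surprise empirically, the following sketch (mine; Haar-random states sampled as normalized complex Gaussian vectors) estimates the average entanglement entropy of random pure states and compares it with the maximum log₂ d_A. Page's formula predicts a mean of about log₂ d_A − d_A/(2 d_B ln 2):

import numpy as np

def random_state(dA, dB, rng):
    """Haar-random pure state on C^dA (x) C^dB, as a dA x dB coefficient matrix."""
    v = rng.standard_normal(dA * dB) + 1j * rng.standard_normal(dA * dB)
    return (v / np.linalg.norm(v)).reshape(dA, dB)

def entanglement_entropy(psi):
    s = np.linalg.svd(psi, compute_uv=False)   # Schmidt coefficients
    lam = s**2
    lam = lam[lam > 1e-15]
    return float(-np.sum(lam * np.log2(lam)))

rng = np.random.default_rng(1)
dA = dB = 32
H_vals = [entanglement_entropy(random_state(dA, dB, rng)) for _ in range(100)]
print(np.mean(H_vals), np.log2(dA))   # mean ~ 4.28 vs maximum 5.0, with tiny spread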
Concentration of measure
- (Figure: a subset A of the sphere Sⁿ and its ε-neighborhood; the measure outside the neighborhood decays like exp[−n f(ε)])
- LEVY: Given an η-Lipschitz function f: Sⁿ → R with median M, the probability that a random x ∈_R Sⁿ is further than ε from M is bounded above by exp(−nε²C/η²) for some C > 0

Application to entropy
- Choose a random pure quantum state |ψ⟩ ∈_R C^{d_A} ⊗ C^{d_B} (d_A ≤ d_B)
- The entanglement entropy H(ψ_A) is a Lipschitz function on the unit sphere, so it concentrates sharply around its nearly maximal mean: the probability that H(ψ_A) falls well below log d_A is exponentially small in d_A d_B (see the illustration below)

Random subspaces
1) Choose a fine net F of states on the unit sphere of a subspace S
2) P(not all states in UF are highly entangled) ≤ |F| · P(one state isn't)
3) High entanglement on a sufficiently fine net implies the same for all states in S
- THEOREM: There exist subspaces of dimension ~ C d_A d_B β³/(log d_A)³, all of whose states have entanglement at least log d_A − α − β. The probability that a random subspace has this property goes to 1 as d_A d_B grows.

In qubit language
- In a bipartite system of n by n+o(n) qubits, there exists a subspace of 2n − o(n) qubits in which all states have at least n − o(1) ebits of entanglement
- The subspace of nearly maximally entangled states is almost as big as the whole system!

Compare to pairs of qubits (C² ⊗ C²)
- The subspace spanned by two or more Bell pairs always contains some product states
- (No subspaces of entangled states, let alone maximally entangled states)

Credit where credit is due
- Accidental quantum information theorists?
- Milman and Schechtman, Asymptotic Theory of Finite Dimensional Normed Spaces, Springer-Verlag, 1986
- Others: Gowers, Gromov, Ledoux, Szarek, Talagrand
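And a rough illustration of Lévy-type concentration (my own sketch, not from the slides): the spread of H(ψ_A) over random states shrinks as the dimension grows, which is what makes the net argument above work:

import numpy as np

def ent_entropy(psi):
    lam = np.linalg.svd(psi, compute_uv=False) ** 2   # Schmidt spectrum
    lam = lam[lam > 1e-15]
    return float(-np.sum(lam * np.log2(lam)))

rng = np.random.default_rng(2)
for d in (4, 8, 16, 32):
    v = rng.standard_normal((200, d * d)) + 1j * rng.standard_normal((200, d * d))
    v /= np.linalg.norm(v, axis=1, keepdims=True)     # 200 random states on C^d (x) C^d
    H = [ent_entropy(x.reshape(d, d)) for x in v]
    print(d, round(float(np.std(H)), 4))              # the standard deviation shrinks with d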
Sending quantum information
- Physical model of a noisy channel: a trace-preserving, completely positive map
- A state |ψ⟩ ∈ C^d is encoded (TPCP map), sent through the channel, and decoded (TPCP map)
- LSD noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ((1/n) log d) through N is given by the (regularization of the) coherent information, max_ρ [−H(A|B)]. A conditional entropy!

Take-home message
- Information theory can be generalized to analyze quantum information processing
- Yields a rich theory of surprising conceptual simplicity
- An operational approach to thinking about quantum mechanics: compression, message transmission, subspace transmission
- Powerful techniques for dealing with noise
- Measure concentration for exploring high-dimensional spaces

Further reading
- Nielsen & Chuang, Quantum Computation and Quantum Information
- The additivity conjecture [Holevo, ICM Proceedings 2006]
- Hastings' counterexample: arXiv:0809.3972
- Entangled subspaces: arXiv:quant-ph/0407049
- The quantum capacity problem: Open Systems and Information Dynamics, special issue 15(1)

Some things I haven't shown you
- Merging and splitting: the mother of all protocols [HOW quant-ph/0505062; ADHW quant-ph/0606225]
- Quantifying entanglement: formation, distillation and everything in between [HHHH quant-ph/0702225]
- Beating teleportation: remote state preparation and its cousins [BHLSW quant-ph/0307100; HLSW quant-ph/0307104]