Quantum Information Theory and Measure Concentration

Patrick Hayden (McGill)
AMS Short Course, Jan 2009
http://www.cs.mcgill.ca/~patrick/ams2009

Overview
- What is information theory? Entropy, compression, noisy coding and beyond
- What does it have to do with quantum mechanics?
- Noise in the quantum mechanical formalism: density operators, the partial trace, quantum operations
- Classical information through quantum channels
- Entanglement in random subspaces
- Quantum information through quantum channels

Information (Shannon) theory
- A practical question: how best to make use of a given communications resource?
- A mathematico-epistemological question: how to quantify uncertainty and information?
- Shannon solved the first by considering the second: "A mathematical theory of communication" [1948]

Quantifying uncertainty
- Entropy: H(X) = -Σ_x p(x) log₂ p(x)
- Proportional to the entropy of statistical physics
- Term suggested by von Neumann (more on him later)
- Can arrive at the definition axiomatically, e.g. from H(X,Y) = H(X) + H(Y) for independent X, Y
- Operational point of view:

Compression
- A source emits n independent copies X₁, X₂, ..., Xₙ of X; there are 2ⁿ possible strings in {0,1}ⁿ, but only ~2^{nH(X)} typical ones.
- If X is binary, a typical string like 0000100111010100010101100101 has about np(X=0) 0s and np(X=1) 1s.
- Can compress n copies of X to a binary string of length ~nH(X).

Typicality in more detail
- Let xⁿ = x₁, x₂, ..., xₙ with each x_j ∈ X. We say that xⁿ is ε-typical with respect to p(x) if
  - for all a ∈ X with p(a) > 0, |N(a|xⁿ)/n - p(a)| < ε/|X|, and
  - for all a ∈ X with p(a) = 0, N(a|xⁿ) = 0.
- For any ε > 0, the probability that a random string Xⁿ is ε-typical goes to 1 as n → ∞.
- If xⁿ is ε-typical, then 2^{-n[H(X)+ε]} ≤ p(xⁿ) ≤ 2^{-n[H(X)-ε]}.
- The number of ε-typical strings is bounded above by 2^{n[H(X)+ε]}.
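As a quick numerical companion to the definition above, here is a minimal sketch of my own (not from the slides; it assumes numpy, and `is_typical` is an illustrative helper). It samples strings from a biased binary source and estimates how often they satisfy the ε-typicality condition, which should approach 1 as n grows:

```python
# My own sketch: estimate P(X^n is epsilon-typical) for a biased coin,
# using the condition |N(a|x^n)/n - p(a)| < eps/|X| from the slide.
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.9, 0.1])              # p(0) = 0.9, p(1) = 0.1
H = -np.sum(p * np.log2(p))           # H(X) ~ 0.469 bits

def is_typical(x, p, eps):
    n = len(x)
    for a in range(len(p)):
        count = np.sum(x == a)
        if p[a] > 0:
            if abs(count / n - p[a]) >= eps / len(p):
                return False
        elif count > 0:               # symbols with p(a) = 0 may not appear
            return False
    return True

print(f"H(X) = {H:.3f} bits, so roughly 2^({H:.3f} n) typical strings")
for n in [100, 1000, 5000]:
    samples = rng.choice(len(p), size=(500, n), p=p)
    frac = np.mean([is_typical(x, p, eps=0.05) for x in samples])
    print(f"n = {n:5d}: fraction of 0.05-typical samples = {frac:.3f}")
```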

Quantifying information
- Venn-diagram picture: H(X) and H(Y) overlap in I(X;Y), with H(X|Y) and H(Y|X) outside the overlap, all inside H(X,Y).
- Uncertainty in X when the value of Y is known: H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y)
- Information is that which reduces uncertainty: I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)

Sending information
- Statistical model of a noisy channel: Alice's X goes in, Bob's Y comes out, jointly distributed as (X,Y).
- Shannon's noisy coding theorem: in the limit of many uses, the optimal rate (in bits per channel use) at which Alice can send messages reliably to Bob through the channel is given by the formula max_{p(x)} I(X;Y).

Data processing inequality
- If Z is obtained by processing X through some p(z|x), then I(X;Y) ≥ I(Z;Y): processing cannot create information about Y.

Optimality in Shannon's theorem
- Assume there exists a code with rate R and perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages, and M′ the decoded message.
- nR = H(M)                  (M has nR bits of entropy)
     = I(M;M′)               (perfect decoding: M = M′)
     ≤ I(M;Yⁿ) ≤ I(Xⁿ;Yⁿ)    (data processing)
     ≤ Σ_{j=1}^n I(X_j;Y_j)  (term by term, some fiddling)
     ≤ n max_{p(x)} I(X;Y)

Shannon theory provides
- Practically speaking: a holy grail for error-correcting codes
- Conceptually speaking: an operationally-motivated way of thinking about correlations
- What's missing (for a quantum mechanic)? Features from linear structure: entanglement and non-orthogonality

Quantum Shannon theory provides
- A quantitative and operational theory of quantum correlation: qubits, cbits, ebits, cobits, sbits
- Relies on a major simplifying assumption (computation is free) and a minor simplifying assumption (noise and data have regular structure)
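The formula max_{p(x)} I(X;Y) is easy to evaluate numerically for a concrete channel. A minimal sketch of my own (assumes numpy; the binary symmetric channel and all helper names are my choices, not from the slides), which recovers the well-known BSC capacity 1 - h(f):

```python
# My own illustration: I(X;Y) = H(X) + H(Y) - H(X,Y) for a binary
# symmetric channel with flip probability f, maximized over inputs.
import numpy as np

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, f):
    flip = np.array([[1 - f, f], [f, 1 - f]])    # p(y|x)
    joint = p_x[:, None] * flip                  # p(x,y)
    return H(p_x) + H(joint.sum(axis=0)) - H(joint.ravel())

f = 0.11
rates = [mutual_information(np.array([q, 1 - q]), f)
         for q in np.linspace(0.01, 0.99, 99)]
h_f = H(np.array([f, 1 - f]))
print(f"max_p I(X;Y) ~ {max(rates):.4f} bits/use;  1 - h(f) = {1 - h_f:.4f}")
```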

Superdense coding
- To send i ∈ {0,1,2,3}: Alice applies σ_i to her half of a shared Bell state |Φ₀⟩, turning it into |Φ_i⟩, and sends her qubit to Bob; a Bell measurement reveals i.
- 1 ebit + 1 qubit ≥ 2 cbits

Before we get going: some unavoidable formalism
- We need quantum generalizations of:
  - probability distributions (density operators)
  - marginal distributions (partial trace)
  - noisy channels (quantum operations)

Mixing quantum states: the density operator
- Draw |ψ_x⟩ with probability p(x), then perform a measurement {|0⟩, |1⟩}.
- Probability of outcome j: q_j = Σ_x p(x) |⟨j|ψ_x⟩|² = Σ_x p(x) tr[|j⟩⟨j| |ψ_x⟩⟨ψ_x|] = tr[|j⟩⟨j| ρ], where ρ = Σ_x p(x) |ψ_x⟩⟨ψ_x|.
- The outcome probability is linear in ρ.

The partial trace
- Suppose that ρ_AB is a density operator on A⊗B and Alice measures {M_k} on A.
- The outcome probability is q_k = tr[(M_k ⊗ I_B) ρ_AB].
- Define ρ_A = tr_B[ρ_AB] = Σ_j ⟨j|_B ρ_AB |j⟩_B. Then q_k = tr[M_k ρ_A].
- ρ_A describes the outcome statistics for all possible experiments by Alice alone.

Purification
- Suppose that ρ_A is a density operator on A. Diagonalize ρ_A = Σ_i λ_i |φ_i⟩⟨φ_i| and let |Ψ⟩ = Σ_i λ_i^{1/2} |φ_i⟩_A |i⟩_B.
- Note that ρ_A = tr_B[|Ψ⟩⟨Ψ|]: |Ψ⟩ is a purification of ρ.
- Symmetry: the reduced states Ψ_A and Ψ_B have the same non-zero eigenvalues.

Quantum (noisy) channels: analogs of p(y|x)
- What reasonable constraints might such a channel 𝒩: A → B satisfy?
  1) Take density operators to density operators.
  2) Convex linearity: a mixture of input states should be mapped to the corresponding mixture of output states.
- Surprising fact: all such maps can, in principle, be realized physically.
- "Take density operators to density operators" must be interpreted very strictly: require that (𝒩 ⊗ I_C)(ρ_AC) always be a density operator too.
- This doesn't come for free! Let T be the transpose map on A. If |Φ⟩ = |00⟩_AC + |11⟩_AC, then (T ⊗ I_C)(|Φ⟩⟨Φ|) has negative eigenvalues.
- The resulting set of transformations on density operators is known as the trace-preserving, completely positive maps.
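Both the partial trace and the transpose-map counterexample are easy to check numerically. A minimal sketch of my own (not from the slides; assumes numpy, helper names are illustrative):

```python
# My own sketch: tr_B of a maximally entangled state is I/2, and
# (T ⊗ I)(|Phi><Phi|) has a negative eigenvalue, so the transpose map
# is positive but not completely positive.
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    # rho_A[a, a'] = sum_b rho_AB[(a b), (a' b)]
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)     # |Phi> = (|00> + |11>)/sqrt(2)
rho = np.outer(phi, phi)

print(partial_trace_B(rho, 2, 2))             # -> I/2

rho4 = rho.reshape(2, 2, 2, 2)                # indices (a, c, a', c')
T_rho = rho4.transpose(2, 1, 0, 3).reshape(4, 4)   # transpose on A only
print(np.linalg.eigvalsh(T_rho))              # contains -1/2: not a state!
```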

Quantum channels: examples
- Adjoining an ancilla: ρ ↦ ρ ⊗ |0⟩⟨0|
- Unitary transformations: ρ ↦ UρU†
- Partial trace: ρ_AB ↦ tr_B[ρ_AB]
- That's it! All channels can be built out of these operations (the Stinespring dilation: adjoin |0⟩, apply U, trace out).
- Operator-sum representation: 𝒩(ρ) = Σ_k A_k ρ A_k† with Σ_k A_k† A_k = I

Further examples
- The depolarizing channel: ρ ↦ (1-p)ρ + p I/2
- The dephasing channel: ρ ↦ Σ_j ⟨j|ρ|j⟩ |j⟩⟨j|, equivalent to measuring {|j⟩} and then forgetting the outcome

Notions of distinguishability
- Basic requirement: quantum channels do not increase distinguishability.
- Fidelity: F(ρ,σ) = [tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]]²
  - F = 0 for perfectly distinguishable states, F = 1 for identical states
  - F(ρ,σ) = max |⟨φ^ρ|φ^σ⟩|², maximized over purifications
  - F(𝒩(ρ),𝒩(σ)) ≥ F(ρ,σ)
- Trace distance: T(ρ,σ) = ‖ρ-σ‖₁
  - T = 2 for perfectly distinguishable states, T = 0 for identical states
  - T(ρ,σ) = 2 max |p(k=0|ρ) - p(k=0|σ)|, maximized over measurements {M_k}
  - T(𝒩(ρ),𝒩(σ)) ≤ T(ρ,σ)
- Back to information theory! Statements made today hold for both measures.

Quantifying uncertainty
- Let ρ = Σ_x p(x) |ψ_x⟩⟨ψ_x| be a density operator.
- von Neumann entropy: H(ρ) = -tr[ρ log ρ], equal to the Shannon entropy of the eigenvalues of ρ.
- Analog of a joint random variable: ρ_AB describes a composite system A ⊗ B, and H(A)_ρ = H(ρ_A) = H(tr_B ρ_AB).

Quantifying uncertainty: examples
- H(|ψ⟩⟨ψ|) = 0
- H(I/2) = 1
- H(ρ⊗σ) = H(ρ) + H(σ)
- H(I/2ⁿ) = n
- H(pρ ⊕ (1-p)σ) = H(p,1-p) + pH(ρ) + (1-p)H(σ)
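A small numerical illustration of the two distinguishability measures and their monotonicity under the depolarizing channel (my own sketch, assuming numpy and scipy; the states and helper names are my choices):

```python
# My own sketch: compute F and T for |0><0| vs |+><+| and check that
# depolarizing noise never increases T and never decreases F.
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    s = sqrtm(rho)
    return np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2

def trace_distance(rho, sigma):
    return np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

rho   = np.array([[1, 0], [0, 0]], dtype=complex)           # |0><0|
sigma = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # |+><+|

for p in [0.0, 0.5, 1.0]:
    r, s = depolarize(rho, p), depolarize(sigma, p)
    print(f"p = {p}: T = {trace_distance(r, s):.3f}, F = {fidelity(r, s):.3f}")
```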

Compression: the typical subspace
- Source of n independent copies of ρ_AB on (A⊗B)^⊗n.
- dim(effective support of ρ_B^⊗n) ~ 2^{nH(B)} (aka the typical subspace).
- Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A.
- No statistical assumptions: just quantum mechanics!
- Diagonalize ρ = Σ_x p(x) |e_x⟩⟨e_x|; then ρ^⊗n = Σ_{xⁿ} p(xⁿ) |e_{xⁿ}⟩⟨e_{xⁿ}|.
- The ε-typical projector Π_t is the projector onto the span of those |e_{xⁿ}⟩ for which xⁿ is ε-typical with respect to p(·).
- tr[ρ^⊗n Π_t] → 1 as n → ∞. [Schumacher, Petz]

Quantifying information
- For a composite system: H(A)_ρ, H(B)_ρ, H(AB)_ρ. Uncertainty in A when the value of B is known: H(A|B) = H(AB) - H(B).
- Example: for |Φ⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2 we have Φ_B = I/2, so H(A|B)_Φ = 0 - 1 = -1.
- Conditional entropy can be negative!
- Information is that which reduces uncertainty: I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0.

Sending classical information
- Physical model of a noisy channel: a trace-preserving, completely positive map 𝒩 (message i ↦ input state; decoding by measurement).
- HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through 𝒩 is given by the (regularization of the) Holevo formula max_{{p(x),σ_x}} I(X;B), evaluated on the ensemble {p(x), 𝒩(σ_x)}.
- Picture: the codeword states X₁, X₂, ..., Xₙ land in the ~2^{nH(B)}-dimensional typical subspace of B^⊗n.
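The H(A|B) = -1 computation for the Bell state can be verified directly (my own sketch, assuming numpy; `vn_entropy` and `partial_trace_A` are illustrative helpers):

```python
# My own sketch: for the Bell state, H(AB) = 0 and H(B) = 1, so
# H(A|B) = H(AB) - H(B) = -1.
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

def partial_trace_A(rho_AB, dA, dB):
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)
rho_B = partial_trace_A(rho_AB, 2, 2)         # = I/2

H_AB, H_B = vn_entropy(rho_AB), vn_entropy(rho_B)
print(f"H(AB) = {H_AB:.3f}, H(B) = {H_B:.3f}, H(A|B) = {H_AB - H_B:.3f}")
```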

Sending classical information (continued)
- To decode, Bob distinguishes the codeword states within the ~2^{nH(B)}-dimensional typical subspace of B^⊗n using a well-chosen measurement.

Data processing inequality (strong subadditivity)
- Alice and Bob share ρ_AB; Bob processes his share through a channel (a unitary U on B plus an ancilla, then discarding), producing σ_AB.
- Then I(A;B)_ρ ≥ I(A;B)_σ.
- Exercise: show that data processing implies H(A|B)_ρ ≥ H(A|BC)_ρ for any ρ.

Optimality in the HSW theorem
- Assume there exists a code with rate R and perfect decoding, and let M be the random variable corresponding to the uniform distribution over messages.
- nR = H(M) (M has nR bits of entropy) = I(M;M′) (perfect decoding: M = M′) ≤ I(A;B) (data processing), which is bounded by the regularized Holevo quantity.

The additivity conjecture: the limit isn't necessary
- Studied by Holevo, Datta, Fukuda, King, Ruskai, Schumacher, Shirokov, Shor, Werner, ...; counterexample by Hastings in 2008.
- Why did they care so much? Operational interpretation: additivity would mean Alice doesn't need to entangle her inputs across multiple uses of the channel; codewords look like σ_{x₁} ⊗ σ_{x₂} ⊗ ⋯ ⊗ σ_{xₙ}.
- Hastings' counterexample is based on the existence of highly entangled subspaces.

Surprises in high dimension
- Choose a random pure quantum state |ψ⟩ ∈_R C^{d_A} ⊗ C^{d_B} (d_A ≤ d_B). What can we expect of |ψ⟩?
- On average, such states are highly entangled. [Lubkin, Lloyd, Page, Foong & Kanno, Sanchez-Ruiz, Sen]
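A quick numerical look at "on average, states are highly entangled" (my own sketch, assuming numpy; dimensions and sample counts are arbitrary choices). The mean entanglement entropy of Haar-random states sits near the Page estimate log₂ d_A - d_A/(2 d_B ln 2), within a fraction of a bit of maximal:

```python
# My own sketch: sample Haar-random pure states on C^dA ⊗ C^dB and
# compute H(psi_A) from the Schmidt coefficients (singular values).
import numpy as np

rng = np.random.default_rng(1)

def random_pure_state(d):
    v = rng.normal(size=d) + 1j * rng.normal(size=d)   # Haar via Gaussians
    return v / np.linalg.norm(v)

def entanglement_entropy(psi, dA, dB):
    s = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-15]
    return -np.sum(p * np.log2(p))

dA, dB = 16, 64
H_vals = [entanglement_entropy(random_pure_state(dA * dB), dA, dB)
          for _ in range(200)]
page = np.log2(dA) - dA / (2 * dB * np.log(2))
print(f"mean H(psi_A) = {np.mean(H_vals):.4f} +/- {np.std(H_vals):.4f}")
print(f"Page estimate = {page:.4f}, max = log2(dA) = {np.log2(dA):.4f}")
```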

Concentration of measure
- LEVY: given an η-Lipschitz function f : Sⁿ → R with median M, the probability that a random x ∈_R Sⁿ is further than ε from M is bounded above by exp(-nε²C/η²) for some C > 0.
- Picture: the measure of the set of points on Sⁿ further than ε from an equator Aⁿ decays like exp[-n f(ε)].

Application to entropy
- Choose a random pure quantum state |ψ⟩ ∈_R C^{d_A} ⊗ C^{d_B} (d_A ≤ d_B).
- The entanglement entropy H(ψ_A) is a Lipschitz function on the sphere of states, so it concentrates sharply around its nearly maximal average value.

Random subspaces
1) Choose a fine net F of states on the unit sphere of a subspace S.
2) P(not all states in UF highly entangled) ≤ |F| · P(one state isn't).
3) High entanglement on a sufficiently fine net implies the same for all states in S.
- THEOREM: there exist subspaces of dimension C d_A d_B β³/(log d_A)³, all of whose states have entanglement at least log d_A - α - β. The probability that a random subspace has this property goes to 1 with d_A d_B.

In qubit language
- In a bipartite system of n by n+o(n) qubits, there exists a subspace of 2n - o(n) qubits in which all states have at least n - o(1) ebits of entanglement.
- The subspace of nearly maximally entangled states is almost as big as the whole system!

Compare to pairs of qubits (C² ⊗ C²)
- The subspace spanned by two or more Bell pairs always contains some product states. (No subspaces of entangled states, let alone maximally entangled states.)

Credit where credit is due
- Accidental quantum information theorists? Milman and Schechtman, Asymptotic Theory of Finite Dimensional Normed Spaces, Springer-Verlag, 1986.
- Others: Gowers, Gromov, Ledoux, Szarek, Talagrand.
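Levy-style concentration can also be seen numerically: as the dimension grows, the entanglement entropy of a random state fluctuates less and less around its mean (my own sketch, assuming numpy; parameters are arbitrary):

```python
# My own sketch: the standard deviation of H(psi_A) over Haar-random
# states shrinks rapidly as the dimensions grow (concentration of measure).
import numpy as np

rng = np.random.default_rng(2)

def entropy_of_random_state(dA, dB):
    v = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
    v /= np.linalg.norm(v)
    p = np.linalg.svd(v.reshape(dA, dB), compute_uv=False) ** 2
    p = p[p > 1e-15]
    return -np.sum(p * np.log2(p))

for dA in [2, 4, 8, 16, 32]:
    samples = [entropy_of_random_state(dA, dA) for _ in range(300)]
    print(f"dA = dB = {dA:3d}: mean H = {np.mean(samples):.3f} bits "
          f"(max {np.log2(dA):.0f}), std = {np.std(samples):.4f}")
```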

Sending quantum information
- Physical model of a noisy channel: a trace-preserving, completely positive map. A state |ψ⟩ ∈ C^d passes through a TPCP encoding map, the channel uses, and a TPCP decoding map.
- LSD noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ((1/n) log d) through 𝒩 is given by the (regularization of the) coherent information max_ρ [H(B) - H(AB)] = max_ρ [-H(A|B)]: minus a conditional entropy!

Take-home message
- Information theory can be generalized to analyze quantum information processing.
- It yields a rich theory of surprising conceptual simplicity.
- An operational approach to thinking about quantum mechanics: compression, message transmission, subspace transmission.
- Powerful techniques for dealing with noise.
- Measure concentration to explore high-dimensional spaces.

Further reading
- Nielsen & Chuang, Quantum Computation and Quantum Information.
- The additivity conjecture [Holevo, ICM Proceedings 2006].
- Hastings' counterexamples: arXiv:0809.3972.
- Entangled subspaces: arXiv:quant-ph/0407049.
- The quantum capacity problem: Open Systems and Information Dynamics, special issue 15(1).

Some things I haven't shown you
- Merging and splitting: the mother of all protocols [HOW quant-ph/0505062; ADHW quant-ph/0606225].
- Quantifying entanglement: formation, distillation and everything in between [HHHH quant-ph/0702225].
- Beating teleportation: remote state preparation and its cousins [BHLSW quant-ph/0307100; HLSW quant-ph/0307104].
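Since the coherent information is just minus a conditional entropy, it is straightforward to evaluate for a concrete channel. A minimal sketch of my own (not from the slides; assumes numpy, and uses the qubit depolarizing channel with a maximally entangled input, a standard illustrative choice):

```python
# My own sketch: I_c = H(B) - H(AB) = -H(A|B) for the depolarizing channel
# N(rho) = (1-p) rho + p I/2, applied to half of a Bell state.
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1, -1]).astype(complex)

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

def coherent_information(p):
    # Kraus operators: (1-3p/4) rho + (p/4)(X rho X + Y rho Y + Z rho Z)
    kraus = [np.sqrt(1 - 3 * p / 4) * I2] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]
    phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    rho_in = np.outer(phi, phi.conj())
    rho_AB = sum(np.kron(I2, A) @ rho_in @ np.kron(I2, A).conj().T
                 for A in kraus)                    # (id ⊗ N)(|Phi><Phi|)
    rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)
    return vn_entropy(rho_B) - vn_entropy(rho_AB)

for p in [0.0, 0.1, 0.2, 0.3]:
    print(f"p = {p}: I_c = {coherent_information(p):+.4f}")
```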