Exponential Separation of Quantum Communication and Classical Information

Dave Touchette (IQC and C&O, University of Waterloo, and Perimeter Institute for Theoretical Physics)
Joint work with Anurag Anshu (CQT, NUS), Penghui Yao (QuICS, UMD), and Nengkun Yu (QSI, UTS)
arXiv:1611.08946
QIP 2017, Seattle, 20 January 2017

Interactive Communication
Communication complexity setting: how much communication/information is needed to compute f on (x, y) ~ µ?
- Information content of interactive protocols?
- Information vs. communication: compression?
- Classical vs. quantum?

Main Result
Th.: There is a classical task (f, µ, ε) s.t. QCC(f, µ, ε) ≥ 2^Ω(IC(f, µ, ε))
- f(x, y) a Boolean function, (x, y) ~ µ, ε a constant error, say 1/10
- QCC: quantum communication complexity
- IC: classical information complexity
Cor.: Limit on direct sum theorems:
- Amortized CC: ACC(f, µ, ε) = lim_{n→∞} (1/n) CC((f, µ, ε)^n)
- IC(f, µ, ε) = ACC(f, µ, ε) ≥ AQCC(f, µ, ε)
- Hence QCC((f, µ, ε)^n) is not Ω(n · QCC(f, µ, ε))
Cor.: Limit on interactive compression (QIC: quantum information complexity):
- QIC(f, µ, ε) ≤ IC(f, µ, ε), so QCC(f, µ, ε) ≥ 2^Ω(QIC(f, µ, ε)): no compression to O(QIC(f, µ, ε))
In more detail...

Unidirectional Classical Communication
- Compress messages with low information content
- Today, interested in the noiseless communication channel

Information Theory I
How to quantify classical information? Shannon's entropy!
- A (finite) random variable X with distribution p_X has entropy H(X)
- Operational significance: optimal asymptotic rate of compression for n i.i.d. copies of X: (1/n)|M| → H(X) bits
- Single-copy, optimal variable-length encoding, e.g. Huffman code: H(X) ≤ E(|M|) ≤ H(X) + 1
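
As a concrete illustration of the single-copy bound above, here is a minimal Python sketch (my own example distribution, not one from the talk) that builds Huffman codeword lengths with heapq and compares the expected message length E(|M|) against H(X):

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution {symbol: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Codeword lengths of an optimal (Huffman) prefix code for p."""
    heap = [(q, [s]) for s, q in p.items()]
    lengths = dict.fromkeys(p, 0)
    heapq.heapify(heap)
    while len(heap) > 1:
        q1, s1 = heapq.heappop(heap)
        q2, s2 = heapq.heappop(heap)
        for s in s1 + s2:           # each merge adds one bit to these codewords
            lengths[s] += 1
        heapq.heappush(heap, (q1 + q2, s1 + s2))
    return lengths

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
L = huffman_lengths(p)
print(entropy(p))                       # 1.75
print(sum(p[s] * L[s] for s in p))      # 1.75 here (dyadic p); in general
                                        # H(X) <= E(|M|) <= H(X) + 1
```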

Information Theory II
Many derived quantities:
- Conditional entropy H(X|Y) = E_y H(X|Y = y)
- Chain rule for entropy: H(XY) = H(Y) + H(X|Y)
- Operational interpretation: source X, side information Y, lim_{n→∞} (1/n)|M| = H(X|Y), even though Alice does not know Y!
- Mutual information I(X; C) = H(X) − H(X|C) = I(C; X)
- Data processing: I(X; C) ≥ I(X; N(C)), with N a stochastic map
- Conditional mutual information I(X; Y|Z) = E_z I(X; Y|Z = z)
- Chain rule: I(X_1 X_2 ⋯ X_n; C|B) = Σ_i I(X_i; C | B X_1 X_2 ⋯ X_{<i})
- I(X_1 X_2 ⋯ X_n; C|B) ≤ H(C): at most the bit length of C
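
The chain rule and data processing inequality above are easy to check numerically. A small numpy sketch on randomly chosen joint distributions (my own illustration; the dimensions are arbitrary):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a flat probability array, ignoring zeros."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)

# Chain rule: H(XY) = H(Y) + H(X|Y) on a random joint distribution p(x, y)
pxy = rng.random((4, 3)); pxy /= pxy.sum()
py = pxy.sum(axis=0)
H_X_given_Y = sum(py[y] * H(pxy[:, y] / py[y]) for y in range(3))
print(np.isclose(H(pxy.ravel()), H(py) + H_X_given_Y))        # True

def I(pab):
    """Mutual information I(A;B) of a joint distribution matrix."""
    return H(pab.sum(axis=1)) + H(pab.sum(axis=0)) - H(pab.ravel())

# Data processing: I(X; N(C)) <= I(X; C) for a stochastic map N acting on C
pxc = rng.random((4, 5)); pxc /= pxc.sum()
N = rng.random((5, 6)); N /= N.sum(axis=1, keepdims=True)     # row-stochastic
print(I(pxc @ N) <= I(pxc) + 1e-12)                           # True
```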

Interactive Classical Communication
Communication complexity of bipartite functions:
- c_1 = f_1(x, r_A), c_2 = f_2(y, c_1, r_B), c_3 = f_3(x, c_1, c_2, r_A), ...
- Protocol transcript Π(x, y, r_A, r_B) = c_1 c_2 ⋯ c_M
- Classical protocols: Π memorizes the whole history
- CC(Π) = |c_1| + |c_2| + ⋯ + |c_M|, and CC(f, µ, ε) = min_Π CC(Π)

Information Cost of Interactive Protocols
Can we compress protocols that do not convey much information?
- For many copies run in parallel?
- For a single copy?
What is the amount of information conveyed by a protocol?
- Total amount of information leaked at the end of the protocol?
- Sum of the information content of each transmitted message?
- Optimal asymptotic compression rate?

Classical Information Complexity
Information cost: IC(Π, µ) = I(X; Π|Y) + I(Y; Π|X) [Barak, Braverman, Chen, Rao 2010]
- Amount of information each party learns about the other's input from the final transcript
Information complexity: IC(f, µ, ε) = inf_Π IC(Π, µ)
- Least amount of information Alice and Bob must reveal to compute (f, µ, ε)
Important properties (T = (f, µ, ε): task of computing f with average error ε w.r.t. µ; T_1 ⊗ T_2: product task):
- Additivity: IC(T_1 ⊗ T_2) = IC(T_1) + IC(T_2)
- Lower bounds communication: IC(T) ≤ CC(T)
- Operational interpretation: IC(T) = ACC(T) = lim_{n→∞} (1/n) CC(T^n) [Braverman, Rao 2011]
- Continuity, etc.
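
As a toy computation, the sketch below (my own example, not from the talk) evaluates IC(Π, µ) for the one-message protocol in which Alice simply sends her input bit, under µ uniform on {0,1}²: the transcript reveals exactly one bit about X and nothing about Y, so the information cost is 1.

```python
import numpy as np
from itertools import product

def H(vals):
    p = np.array([v for v in vals if v > 0.0])
    return float(-(p * np.log2(p)).sum())

def cmi(joint):
    """I(A;B|C) = H(AC) + H(BC) - H(ABC) - H(C) for a dict {(a,b,c): prob}."""
    def marg(keep):
        m = {}
        for key, v in joint.items():
            k = tuple(key[i] for i in keep)
            m[k] = m.get(k, 0.0) + v
        return m.values()
    return H(marg((0, 2))) + H(marg((1, 2))) - H(joint.values()) - H(marg((2,)))

# Protocol Pi for AND(x, y): Alice sends her bit, so the transcript is t = x.
# mu is uniform on {0,1}^2 (my own choice, for illustration).
joint = {(x, y, x): 0.25 for x, y in product((0, 1), repeat=2)}   # (x, y, t)

ic = (cmi({(x, t, y): v for (x, y, t), v in joint.items()})       # I(X; Pi | Y)
      + cmi({(y, t, x): v for (x, y, t), v in joint.items()}))    # I(Y; Pi | X)
print(ic)   # 1.0: Alice reveals exactly one bit, Bob reveals nothing
```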

Direct Sum
Direct sum: is CC((f, ε)^n) ∈ Ω(n · CC(f, ε))?
- Remember IC(f, ε) = lim_{n→∞} (1/n) CC((f, ε)^n)
- Direct sum related to one-shot compression down to IC: T^n = T ⊗ T ⊗ ⋯ ⊗ T (n times)
Partial results...
- Classical: [Chakrabarti, Shi, Wirth, Yao 2001; Jain, Radhakrishnan, Sen 2003; Harsha, Jain, McAllester, Radhakrishnan 2007; Jain, Klauck, Nayak 2008; Barak, Braverman, Chen, Rao 2010; Braverman, Rao 2011; Braverman 2012; Kol 2015; Sherstov 2016; ...]
- Quantum: [Jain, Radhakrishnan, Sen 2005; Ambainis, Spalek, de Wolf 2006; Klauck, Jain 2009; Sherstov 2012; Anshu, Jain, Mukhopadhyay, Shayeghi, Yao 2014; T. 2015; ...]
Fails in general [Ganor, Kol, Raz 2014, 2015, 2016; Rao, Sinha 2015]: there is (f, µ, ε) s.t. CC(f, µ, ε) ≥ 2^Ω(IC(f, µ, ε))

Quantum Information Theory I
- von Neumann's quantum entropy: H(A)_ρ
- Characterizes the optimal rate for quantum source compression [Schumacher]

Quantum Information Theory II
Derived quantities are defined in formal analogy to the classical quantities:
- H(A|B): no analogue of E_b H(A|B = b); instead use H(A|B) = H(AB) − H(B)
- Conditional entropy can be negative!
- I(A; B) = H(A) − H(A|B) = I(B; A)
- I(A; B|C): no analogue of E_c I(A; B|C = c); instead use I(A; B|C) = I(A; BC) − I(A; C)
- Get the chain rule; non-negativity holds [Lieb, Ruskai '73]; data processing also holds
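
To see a negative conditional entropy concretely, this numpy sketch (my own illustration) computes H(A|B) = H(AB) − H(B) for the maximally entangled state |Φ+⟩ = (|00⟩ + |11⟩)/√2:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)      # |Phi+> on qubits A, B
rho_AB = np.outer(phi, phi.conj())

# Partial trace over A: reshape to (A, B, A', B') and contract A with A'
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

print(vn_entropy(rho_AB) - vn_entropy(rho_B))      # H(A|B) = 0 - 1 = -1 < 0!
print(2 * vn_entropy(rho_B) - vn_entropy(rho_AB))  # I(A;B) = H(A)+H(B)-H(AB) = 2
```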

Quantum Communication Complexity
Two models for computing a classical f: X × Y → Z:
- Yao model: quantum communication, no prior entanglement; Alice and Bob alternately apply unitaries U_1, U_2, U_3, ... to their registers A, B and the communication register C
- Cleve-Buhrman model: classical messages M_1, M_2, M_3, ..., with pre-shared entanglement |Ψ⟩ on A, B
Exponential separations known in communication complexity:
- Classical vs. quantum
- N rounds vs. N+1 rounds

Quantum Information Complexity
QIC(f, µ, ε) = inf_Π QIC(Π, µ), where QIC(Π, µ) is based on terms of the form I(X; C|Y B)
Properties:
- Additivity: QIC(T_1 ⊗ T_2) = QIC(T_1) + QIC(T_2)
- Lower bounds communication: QIC(T) ≤ QCC(T)
- Operational interpretation [T.]: QIC(T) = AQCC(T) = lim_{n→∞} (1/n) QCC(T^n)
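
QIC(Π, µ) is built from quantum conditional mutual informations like I(X; C|Y B). As a sanity check on that quantity, the sketch below (my own illustration) evaluates I(A; B|C) = H(AC) + H(BC) − H(ABC) − H(C) on a random tripartite state and confirms non-negativity, i.e. strong subadditivity [Lieb, Ruskai '73]:

```python
import numpy as np

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def ptrace(rho, dims, keep):
    """Reduced state on the (sorted) subsystems listed in `keep`."""
    t = rho.reshape(tuple(dims) * 2)
    for ax in sorted(set(range(len(dims))) - set(keep), reverse=True):
        t = np.trace(t, axis1=ax, axis2=ax + t.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

rng = np.random.default_rng(1)
v = rng.normal(size=8) + 1j * rng.normal(size=8)   # random pure state on A,B,C
v /= np.linalg.norm(v)
rho = np.outer(v, v.conj())
S = lambda keep: vn_entropy(ptrace(rho, [2, 2, 2], keep))

# I(A;B|C) = H(AC) + H(BC) - H(ABC) - H(C) >= 0 (strong subadditivity)
cmi = S([0, 2]) + S([1, 2]) - S([0, 1, 2]) - S([2])
print(cmi >= -1e-9, cmi)                           # True, some value >= 0
```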

Implications of Main Result
Recall: there is (f, µ, ε) s.t. QCC(f, µ, ε) ≥ 2^Ω(IC(f, µ, ε)), with f a Boolean-valued function and ε constant, say 1/10. Implications...
- No strong direct sum theorem for quantum communication complexity: QCC((f, µ, ε)^n) is not Ω(n · QCC(f, µ, ε)). Even stronger: ACC(f, µ, ε) ∈ O(lg QCC(f, µ, ε))!
- IC and QCC are incomparable: there is (g, η, δ) s.t. IC(g, η, δ) ≥ 2^Ω(QCC(g, η, δ)) [Kerenidis, Laplante, Lerays, Roland, Xiao 2012]
- The GDM bound is not (polynomially) tight for QCC

Need for a New Lower Bound Method on QCC
- Is RDB (the relative discrepancy bound) ≤ poly(QCC)? No [Klartag, Regev 2011]: QCC(VSP) ∈ O(lg RDB(VSP)) for the Vector in Subspace Problem!
- We want a new method M s.t. M ≤ QCC while IC can be exponentially smaller than M: the quantum fooling distribution!

(f, µ, ε): Rao-Sinha's k-ary Pointer Jumping Function
- Two parameters: arity k and depth n; fix n = 2^k
- Input: (x, h_A) with Alice, (y, h_B) with Bob
- x, y: [k]^{<n} → [k]; at each node, x + y mod k defines the good path
- h_A, h_B: [k]^n → {0, 1}; h_A ⊕ h_B on the good leaf defines the output
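
To make the tree structure concrete, here is a toy Python sketch of the good path; the node indexing, the derivation of x, y, h_A, h_B from seeds, and the tiny parameters are all my own simplifications, not the construction's actual input distributions:

```python
import random

k, n = 3, 4                     # toy parameters (the actual task fixes n = 2^k)

def label(node, tag):
    """Stand-in for an input function [k]^{<n} -> [k] (x or y, chosen by tag)."""
    return random.Random(tag + str(node)).randrange(k)

def leaf_bit(leaf, tag):
    """Stand-in for a leaf labelling h_A or h_B: [k]^n -> {0, 1}."""
    return random.Random(tag + str(leaf)).randrange(2)

# Follow the good path: at node z, descend to child (x(z) + y(z)) mod k.
node = ()
for _ in range(n):
    node += ((label(node, "x") + label(node, "y")) % k,)

print(node)                                         # the unique good leaf
print(leaf_bit(node, "hA") ^ leaf_bit(node, "hB"))  # output h_A xor h_B at leaf
```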

(Quantum) Fooling Distribution
Two distributions: fooling distribution p and hard distribution µ, with µ = (1/2)µ_0 + (1/2)µ_1
- Hidden layer j ∈ [n]; fix j, with x_{<j} = y_{<j}
- Let G be the set of good leaves/paths: determined by x_j + y_j only
- p: (x, h_A), (y, h_B) product distribution
- µ_b: x_G = y_G, h_A^G ⊕ h_B^G = b
- For low QCC: Pr_{µ_0}[Out = 1] ≈ Pr_p[Out = 1] ≈ Pr_{µ_1}[Out = 1]

Low Information?
- On the good path, x = y except at level j...
- If we can hide j ∈ [n], then information ≈ lg k: the values of x(z_j), y(z_j) at the level-j node z_j
- Must hide j: the CC to find j is ≈ H(J) = lg n = k
- Hide j by adding noise [Rao, Sinha 2015]: IC ∈ O(lg k), so k = 2^Ω(IC)
- We show QCC ≥ poly(k): for one round, QCC ∈ Ω(k), then poly(k) from round elimination

Baby Case for QCC Lower Bound: One-Way Protocols
- One message M depends only on (x, h_A); the output depends only on M and (y, h_B)
- p vs. µ_0: massage to M independent of (X_G, H_A^G) vs. M correlated with (X_G, H_A^G)
- Can relate Pr_{µ_0}[Out = 1] − Pr_p[Out = 1] to I(M; X_G H_A^G)
- Shearer: Pr_p[leaf ℓ ∈ G] ≈ 1/k, so I(M; X_G H_A^G) ≤ |M|/k

Structure of Proof for Multi-Round Protocols
[proof-structure diagram]

First Conversion to One-Way
- One round: Alice does not know J, so she needs to send information about all X_j, j ∈ [n]
- Multi-round to one-round: guess the teleportation transcript and parallel-repeat 2^QCC times
- Need 2^QCC ≤ n = 2^k, i.e. QCC ≤ k
- Technical issue: repeat once, abort if the guess is wrong
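
To make "guess the teleportation transcript" concrete, this numpy sketch (my own illustration, not code from the paper) teleports a random qubit and applies the Pauli correction for a guessed 2-bit outcome: the state is recovered exactly when the guess matches Alice's Bell measurement, which for an m-qubit message (a 2m-bit transcript) happens with probability 2^(-2m).

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Bell basis: |beta_ab> = (Z^a X^b tensor I) |beta_00>
bells = {(0, 0): [1, 0, 0, 1], (0, 1): [0, 1, 1, 0],
         (1, 0): [1, 0, 0, -1], (1, 1): [0, 1, -1, 0]}
bells = {ab: np.array(v) / np.sqrt(2) for ab, v in bells.items()}

psi = rng.normal(size=2) + 1j * rng.normal(size=2)    # random qubit to teleport
psi /= np.linalg.norm(psi)
state = np.kron(psi, bells[(0, 0)]).reshape(2, 2, 2)  # qubits: (psi, A, B)

guess = (1, 0)                                        # Bob's guessed transcript
corr = np.linalg.matrix_power(Z, guess[0]) @ np.linalg.matrix_power(X, guess[1])
for (a, b), beta in bells.items():
    # Project Alice's two qubits onto Bell outcome (a, b); Bob keeps qubit 3
    bob = np.tensordot(beta.conj().reshape(2, 2), state, axes=([0, 1], [0, 1]))
    prob = np.linalg.norm(bob) ** 2                   # 1/4 for every outcome
    fid = abs(np.vdot(psi, corr @ bob / np.linalg.norm(bob))) ** 2
    print((a, b), round(prob, 3), round(fid, 3))      # fid = 1 iff guess matches
```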

Structure of Proof for Multi-Round Protocols
[proof-structure diagram]

Second Conversion to One-Way
- Want to perform compression to ≈ QIC_{B→A} + 2^{QIC_{A→B}} [Jain, Radhakrishnan, Sen 2005]
- Use the product structure of the distribution p
- Need to fix the protocol and J, embed input X_J...
- Need QCC ≥ k to send information about all k possible paths

Structure of Proof for Multi-Round Protocols
[proof-structure diagram]

Going from p to µ_0: Distributional Cut-and-Paste
- If the local state is independent of the other party's input under p = µ_X × µ_Y,
- then the local state is independent of the other party's input under µ_{XY}

Outlook
Summary:
- Exponential separation between QCC and IC
- Strong direct sum fails for QCC
- New QCC lower bound tools
Open questions:
- External classical information complexity?
- What is the power of the quantum fooling distribution method? Quantum relative discrepancy?
- 2^QIC compression? BBCR-type interactive compression? Partial direct sum?

Thank you! See you next year!