Exponential Separation of Quantum Communication and Classical Information
Dave Touchette
IQC and C&O, University of Waterloo, and Perimeter Institute for Theoretical Physics
Joint work with Anurag Anshu (CQT, NUS), Penghui Yao (QuICS, UMD) and Nengkun Yu (QSI, UTS)
arXiv: 1611.08946
QIP 2017, Seattle, 20 January 2017
touchette.dave@gmail.com
Interactive Communication
Communication complexity setting: how much communication/information is needed to compute f on (x, y) ~ µ?
- Information content of interactive protocols?
- Information vs. communication: compression?
- Classical vs. quantum?
Main result
Th.: ∃ classical task (f, µ, ε) s.t. QCC(f, µ, ε) ≥ 2^{Ω(IC(f, µ, ε))}
- f(x, y) Boolean function, (x, y) ~ µ, ε constant error, say 1/10
- QCC: quantum communication complexity
- IC: classical information complexity
Cor.: Limit on direct sum theorems:
- Amortized CC: ACC(f, µ, ε) = lim_n (1/n) CC((f, µ, ε)^{⊗n})
- IC(f, µ, ε) = ACC(f, µ, ε) ≥ AQCC(f, µ, ε)
- QCC((f, µ, ε)^{⊗n}) ∉ Ω(n · QCC(f, µ, ε))
Cor.: Limit on interactive compression:
- QIC: quantum information complexity
- IC(f, µ, ε) ≥ QIC(f, µ, ε)
- QCC(f, µ, ε) ∉ O(QIC(f, µ, ε))
In more detail...
Unidirectional Classical Communication
Compress messages with low information content.
Today, interested in the noiseless communication channel.
Information Theory I
How to quantify classical information? Shannon's entropy!
- (Finite) random variable X with distribution p_X has entropy H(X)
- Operational significance: optimal asymptotic rate of compression for n i.i.d. copies of X: (1/n)|M| → H(X) bits
- Single-copy, optimal variable-length encoding, e.g. Huffman code: H(X) ≤ E(|M|) < H(X) + 1
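The two bounds above can be checked numerically. A minimal sketch (my own illustration, not from the talk): compute H(X) and the expected Huffman codeword length for a small distribution. `huffman_lengths` is an illustrative helper, not a reference implementation.

```python
import heapq
import math

def shannon_entropy(p):
    """H(X) in bits, for a distribution given as {symbol: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Codeword length per symbol for a binary Huffman code."""
    # Heap entries: (probability, tiebreaker, symbols merged so far).
    heap = [(q, i, [s]) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in p}
    tie = len(heap)
    while len(heap) > 1:
        q1, _, s1 = heapq.heappop(heap)
        q2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to these symbols
            depth[s] += 1
        heapq.heappush(heap, (q1 + q2, tie, s1 + s2))
        tie += 1
    return depth

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = shannon_entropy(p)
L = sum(p[s] * l for s, l in huffman_lengths(p).items())
assert H <= L < H + 1          # H(X) <= E(|M|) < H(X) + 1
```

For this dyadic distribution the code is exactly optimal, so E(|M|) = H(X) = 1.75 bits.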
Information Theory II
Many derived quantities:
- Conditional entropy H(X|Y) = E_y H(X|Y = y)
- Chain rule for entropy: H(XY) = H(Y) + H(X|Y)
- Operational interpretation: source X, side information Y, lim_n (1/n)|M| = H(X|Y): Alice does not know Y!
- Mutual information I(X; C) = H(X) − H(X|C) = I(C; X)
- Data processing: I(X; C) ≥ I(X; N(C)), with N a stochastic map
- Conditional mutual information I(X; Y|Z) = E_z I(X; Y|Z = z)
- Chain rule: I(X_1 X_2 ... X_n; C|B) = Σ_{i ≤ n} I(X_i; C|B X_1 X_2 ... X_{i−1})
- I(X_1 X_2 ... X_n; C|B) ≤ H(C): at most the bit length of C
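A quick numerical illustration of these quantities (my own sketch, not from the talk): mutual information computed from a joint distribution, and data processing under a binary symmetric channel acting on C.

```python
import math

def H(p):
    """Entropy in bits of a distribution {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def mutual_info(pxc):
    """I(X; C) = H(X) + H(C) - H(XC), from a joint {(x, c): probability}."""
    px, pc = {}, {}
    for (x, c), q in pxc.items():
        px[x] = px.get(x, 0.0) + q
        pc[c] = pc.get(c, 0.0) + q
    return H(px) + H(pc) - H(pxc)

# X and C perfectly correlated: I(X; C) = 1 bit.
pxc = {(0, 0): 0.5, (1, 1): 0.5}

# Stochastic map N acting on C alone: flip C with probability 0.1.
noisy = {}
for (x, c), q in pxc.items():
    noisy[(x, c)] = noisy.get((x, c), 0.0) + 0.9 * q
    noisy[(x, 1 - c)] = noisy.get((x, 1 - c), 0.0) + 0.1 * q

# Data processing: I(X; N(C)) <= I(X; C).
assert mutual_info(noisy) <= mutual_info(pxc)
```

Here I(X; C) = 1 bit drops to 1 − h(0.1) ≈ 0.531 bits after the noise, consistent with data processing.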
Interactive Classical Communication
Communication complexity of bipartite functions:
- c_1 = f_1(x, r_A), c_2 = f_2(y, c_1, r_B), c_3 = f_3(x, c_1, c_2, r_A), ...
- Protocol transcript Π(x, y, r_A, r_B) = c_1 c_2 ... c_M
- Classical protocols: Π memorizes the whole history
- CC(Π) = |c_1| + |c_2| + ... + |c_M|
- CC(f, µ, ε) = min_Π CC(Π)
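As a toy illustration of these definitions (my own sketch, not from the talk): a two-message deterministic protocol for f(x, y) = x ⊕ y on single bits, with its transcript and communication cost.

```python
def run_protocol(x, y):
    """Toy deterministic protocol for f(x, y) = x XOR y on single bits."""
    transcript = []
    c1 = x                # c1 = f1(x, r_A): Alice's message depends only on her side
    transcript.append(c1)
    c2 = c1 ^ y           # c2 = f2(y, c1, r_B): Bob sees the history so far
    transcript.append(c2)
    cc = len(transcript)  # CC(Pi) = |c1| + |c2|: each message here is one bit
    return transcript[-1], transcript, cc

out, transcript, cc = run_protocol(1, 0)
assert out == 1 and cc == 2
```

The last message doubles as the output, so both parties end the protocol knowing f(x, y).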
Information Cost of Interactive Protocols
Can we compress protocols that do not convey much information?
- For many copies run in parallel?
- For a single copy?
What is the amount of information conveyed by a protocol?
- Total amount of information leaked by the end of the protocol?
- Sum of the information content of each transmitted message?
- Optimal asymptotic compression rate?
Classical Information Complexity
Information cost: IC(Π, µ) = I(X; Π|Y) + I(Y; Π|X) [Barak, Braverman, Chen, Rao 2010]
- Amount of information each party learns about the other's input from the final transcript
Information complexity: IC(f, µ, ε) = inf_Π IC(Π, µ)
- Least amount of information Alice and Bob must reveal to compute (f, µ, ε)
Important properties:
- T = (f, µ, ε): task of computing f with average error ε w.r.t. µ; T_1 ⊗ T_2: product task
- Additivity: IC(T_1 ⊗ T_2) = IC(T_1) + IC(T_2)
- Lower bounds communication: IC(T) ≤ CC(T)
- Operational interpretation: IC(T) = ACC(T) = lim_n (1/n) CC(T^{⊗n}) [Braverman, Rao 2011]
- Continuity, etc.
Direct Sum
Direct sum: CC((f, ε)^{⊗n}) ∈ Ω(n · CC(f, ε))?
- Remember IC(f, ε) = lim_n (1/n) CC((f, ε)^{⊗n})
- Direct sum related to one-shot compression down to IC
- T^{⊗n} = T ⊗ T ⊗ ... ⊗ T (n times)
Partial results...
- Classical: [Chakrabarti, Shi, Wirth, Yao 2001; Jain, Radhakrishnan, Sen 2003; Harsha, Jain, McAllister, Radhakrishnan 2007; Jain, Klauck, Nayak 2008; Barak, Braverman, Chen, Rao 2010; Braverman, Rao 2011; Braverman 2012; Kol 2015; Sherstov 2016; ...]
- Quantum: [Jain, Radhakrishnan, Sen 2005; Ambainis, Spalek, de Wolf 2006; Klauck, Jain 2009; Sherstov 2012; Anshu, Jain, Mukhopadhyay, Shayeghi, Yao 2014; T. 2015; ...]
Direct Sum
Direct sum: CC((f, ε)^{⊗n}) ∈ Ω(n · CC(f, ε))?
Fails in general [Ganor, Kol, Raz 2014, 2015, 2016; Rao, Sinha 2015]:
- ∃ (f, µ, ε) s.t. CC(f, µ, ε) ≥ 2^{IC(f, µ, ε)}
Quantum Information Theory I
von Neumann's quantum entropy: H(A)_ρ
- Characterizes optimal rate for quantum source compression [Schumacher]
Quantum Information Theory II
Derived quantities defined in formal analogy to classical quantities:
- H(A|B): cannot use E_b H(A|B = b); use H(A|B) = H(AB) − H(B)
  - Conditional entropy can be negative!
- I(A; B) = H(A) − H(A|B) = I(B; A)
- I(A; B|C): cannot use E_c I(A; B|C = c); use I(A; B|C) = I(A; BC) − I(A; C)
  - Get chain rule
  - Non-negativity holds [Lieb, Ruskai 73]
  - Data processing also holds
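The negativity of quantum conditional entropy can be checked directly with numpy (a sketch under my own conventions, not from the talk): for the Bell state, H(A|B) = H(AB) − H(B) = 0 − 1 = −1.

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy of a density matrix, in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>) / sqrt(2), qubit A first, qubit B second.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(psi, psi)

# Partial trace over A: reshape to indices (a, b, a', b') and trace a against a'.
rho_B = rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

cond_entropy = vn_entropy(rho_AB) - vn_entropy(rho_B)   # H(A|B) = H(AB) - H(B)
assert abs(cond_entropy - (-1.0)) < 1e-9                # negative!
```

H(AB) = 0 since the global state is pure, while the reduced state on B is maximally mixed with H(B) = 1, so H(A|B) = −1.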
Quantum Communication Complexity
Two models for computing a classical f: X × Y → Z:
- Yao model: no entanglement; Alice and Bob apply local unitaries U_1, U_2, U_3, ... and exchange quantum messages C
- Cleve-Buhrman model: shared entanglement |Ψ⟩; local measurements M_1, M_2, M_3, ... followed by classical communication
Exponential separations in communication complexity:
- Classical vs. quantum
- N rounds vs. N+1 rounds
Quantum Information Complexity
QIC(f, µ, ε) = inf_Π QIC(Π, µ)
- QIC(Π, µ): based on I(X; C|YB)
Properties:
- Additivity: QIC(T_1 ⊗ T_2) = QIC(T_1) + QIC(T_2)
- Lower bounds communication: QIC(T) ≤ QCC(T)
- Operational interpretation [T.]: QIC(T) = AQCC(T) = lim_n (1/n) QCC(T^{⊗n})
Implications of Main Result
Recall: ∃ (f, µ, ε) s.t. QCC(f, µ, ε) ≥ 2^{Ω(IC(f, µ, ε))}
- f Boolean-valued function, ε constant, say 1/10
Implications...
- No strong direct sum theorem for quantum communication complexity: QCC((f, µ, ε)^{⊗n}) ∉ Ω(n · QCC(f, µ, ε))
  - Even stronger: ACC(f, µ, ε) ∈ O(lg QCC(f, µ, ε))!
- IC and QCC incomparable: ∃ (g, η, δ) s.t. IC(g, η, δ) ≥ 2^{Ω(QCC(g, η, δ))} [Kerenidis, Laplante, Lerays, Roland, Xiao 2012]
- GDM bound is not (poly) tight for QCC
Need for a New Lower Bound Method on QCC
Is RDB ≤ QCC (poly)? No [Klartag, Regev 2011]: QCC(VSP) ≈ lg(RDB(VSP))!
We want a new method M s.t. M ≤ QCC and IC ≪ M: the quantum fooling distribution!
(f, µ, ε): Rao-Sinha's k-ary pointer jumping function
Two parameters: arity k, depth n. Fix n = 2^k.
- Input: (x, h_A) with Alice, (y, h_B) with Bob
- x, y: [k]^{<n} → [k]; (x + y) mod k defines the good path
- h_A, h_B: [k]^n → {0, 1}; h_A ⊕ h_B on the good leaf defines the output
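A toy sketch of this composition rule (my own simplification; the actual Rao-Sinha function has additional structure, e.g. the noise used to hide the hidden layer): follow the good path from the root using (x(z) + y(z)) mod k, then XOR the two leaf bits.

```python
def pointer_jumping(x, y, hA, hB, k, n):
    """Follow the good path for n steps, then output hA XOR hB at the leaf.

    x, y map each internal node (a tuple over [k] of length < n) to [k];
    hA, hB map each leaf (a length-n tuple) to {0, 1}.
    """
    z = ()                               # start at the root
    for _ in range(n):
        z = z + ((x[z] + y[z]) % k,)     # next child on the good path
    return hA[z] ^ hB[z]

# Tiny example with k = 2, n = 2 (only nodes on the good path are needed).
x = {(): 1, (0,): 0}
y = {(): 1, (0,): 1}
hA = {(0, 1): 1}
hB = {(0, 1): 0}
assert pointer_jumping(x, y, hA, hB, k=2, n=2) == 1
```

Note that neither party can follow the path alone: each step mixes one value from x with one from y.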
(Quantum) Fooling Distribution
Two distributions: fooling dist. p, hard dist. µ, with µ = ½ µ_0 + ½ µ_1
- Hidden layer j ∈ [n]: fix j, fix x_{<j} = y_{<j}
- Let G be the set of good leaves/paths: determined by x_j + y_j only
- p: (x, h_A) ⊗ (y, h_B) product distribution
- µ_b: x_G = y_G, h_A^G ⊕ h_B^G = b
- For low QCC: Pr_{µ_0}[Out = 1] ≈ Pr_p[Out = 1] ≈ Pr_{µ_1}[Out = 1]
Low Information?
- On the good path, x = y except at level j...
- If we can hide j ∈ [n], then information ≈ lg k: values of x(z_j), y(z_j), |z_j| = j
- Must hide j: CC to find j ≈ lg n = H(J) = k = 2^{Ω(IC)}
- Hide j by adding noise [Rao, Sinha 2015]: IC ∈ O(lg k)
- We show QCC is at least poly(k):
  - For one round, QCC is Ω(k)...
  - ... then poly(k) from round elimination.
Baby Case for QCC Lower Bound: One-Way Protocols
- One message M depends only on (x, h_A); output depends only on M and (y, h_B)
- p vs. µ_0: massage to product M ⊗ (X_G, H_A^G) vs. non-product M(X_G H_A^G)
- Can relate Pr_{µ_0}[Out = 1] − Pr_p[Out = 1] to I(M; X_G H_A^G)
- Shearer: Pr_p[Leaf ℓ ∈ G] ≈ 1/k ⇒ I(M; X_G H_A^G) ≤ |M|/k
Structure of Proof for Multi-Round Protocols
First Conversion to One-Way
- One-round: Alice does not know J; needs to send information about all X_J, J ∈ [n]
- Multi-round to one-round: guess the teleportation transcript, parallel-repeat 2^{QCC} times
- Need 2^{QCC} ≥ n = 2^k ⇒ QCC ≥ k
- Technical issue: repeat once, abort if the guess is wrong
Second Conversion to One-Way
- Want to perform compression to QIC: QIC_{B→A} → 2^{QIC} A→B [Jain, Radhakrishnan, Sen 2005]
- Use product structure of distribution p
- Need to fix protocol, J, embed input X_J...
- Need QCC ≥ k to send information about all k possible paths
Going from p to µ_0
Distributional cut-and-paste:
- If the local state is independent of the other party's input under p = µ_X ⊗ µ_Y,
- then the local state is independent of the other party's input under µ_{XY}.
Outlook
Summary:
- Exponential separation between QCC and IC
- Strong direct sum fails for QCC
- New QCC lower bound tools
Open questions:
- External classical information complexity?
- What is the power of the quantum fooling distribution method? Quantum relative discrepancy?
- 2^{QIC} compression? BBCR-type interactive compression? Partial direct sum?
Thank you! See you next year!