Communication vs information complexity, relative discrepancy and other lower bounds
- Jesse Wheeler
- 5 years ago
1 Communication vs information complexity, relative discrepancy and other lower bounds. Iordanis Kerenidis, CNRS, LIAFA, Univ Paris Diderot. Joint work with: L. Fontes, R. Jain, S. Laplante, M. Lauriere, J. Roland
2 Communication complexity. Alice holds x ∈ {0,1}ⁿ and Bob holds y ∈ {0,1}ⁿ; they want to compute f(x, y). What is the minimal communication needed? The trivial protocol uses n bits (Alice sends all of x). Initiated by [Yao 79]; wide applications to circuit complexity, VLSI, streaming, distributed computing, data structures, etc.
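The trivial protocol is easy to make concrete. A minimal sketch (the function name and inputs are illustrative, not from the talk): Alice sends her entire input, so the cost is always n bits.

```python
def trivial_equality_protocol(x: str, y: str):
    """Trivial protocol for EQUALITY: Alice sends all n bits of x; Bob
    compares with his own input y. Returns (f(x, y), bits communicated)."""
    message = x                  # Alice -> Bob: the whole input
    bits_sent = len(message)
    return (message == y), bits_sent

result, cost = trivial_equality_protocol("10110", "10110")
assert result is True and cost == 5      # correct, at a cost of n = 5 bits
result, cost = trivial_equality_protocol("10110", "10111")
assert result is False and cost == 5     # cost is n bits regardless of inputs
```

The point of the next slides is that this worst-case bit count can wildly overestimate how much *information* actually flows.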
3-4 Measuring the complexity of communication. The basic measure is the number of bits transmitted, but those bits may be nearly useless. Example: EQUALITY. Let M be a random invertible n × n matrix over GF(2). Alice sends (Mx)₁, (Mx)₂, ... and Bob compares them with (My)₁, (My)₂, ...; if (Mx)ᵢ ≠ (My)ᵢ for some i, output 0. Many bits are communicated, Ω(n) on average, BUT Bob learns only O(1) bits of information about x. Question: how do we measure the amount of information transmitted?
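A hedged sketch of this example: instead of streaming the rows of a random invertible matrix M, the toy code below uses independent public random parities, which has the same flavor (each round costs one bit, and any fixed pair x ≠ y is caught with probability 1/2 per round). All names and parameters here are illustrative.

```python
import random

def randomized_equality(x, y, rounds=10, rng=None):
    """Public-coin EQUALITY test: per round, pick a public random vector r;
    Alice sends <r, x> mod 2 (one bit), Bob compares with <r, y> mod 2.
    For x != y each round detects the difference with probability 1/2,
    so the error is at most 2^-rounds while only `rounds` bits are sent."""
    rng = rng or random.Random(0)
    for _ in range(rounds):
        r = [rng.randrange(2) for _ in range(len(x))]          # public coins
        if sum(a * b for a, b in zip(r, x)) % 2 != sum(a * b for a, b in zip(r, y)) % 2:
            return False                                        # certainly x != y
    return True                                                 # equal w.h.p.

x = [1, 0, 1, 1, 0, 0, 1, 0]
y = [1, 0, 1, 1, 0, 1, 1, 0]                                    # differs from x
assert randomized_equality(x, x, rng=random.Random(3)) is True  # never errs on x == y
detected = sum(not randomized_equality(x, y, rng=random.Random(s)) for s in range(100))
# each seed misses with probability 2^-10, so essentially all 100 runs detect
assert detected >= 95
```

Even when many rounds are run, each transmitted parity reveals at most one bit about x, which is the gap between communication and information that the next slides formalize.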
5 Outline. 1. Introducing information cost and information complexity. 2. Relating information to communication complexity. 3. Zero-communication protocols and information complexity. 4. Information complexity is larger than relative discrepancy.
6 Entropy. Entropy: H(X) = Σ_{x ∈ supp(X)} Pr[X = x] · log(1/Pr[X = x]), the uncertainty about X. Conditional entropy: H(X | Y) = Σ_{y ∈ supp(Y)} Pr[Y = y] · H(X | Y = y), the uncertainty about X knowing Y. Mutual information: I(X ; Y) = H(X) − H(X | Y), "how much information does Y reveal about X". Conditional mutual information: I(X ; Y | Z) = H(X | Z) − H(X | YZ), "already knowing Z, how much additional information does Y reveal about X".
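These definitions can be checked numerically. A small self-contained sketch (the joint distribution is an illustrative choice, not from the talk): X is a uniform bit and Y is a copy of X flipped with probability 1/4.

```python
from math import log2

def H(dist):
    """Shannon entropy of a distribution given as {outcome: probability}."""
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

# Illustrative joint distribution of (X, Y): X uniform, Y a 1/4-noisy copy of X.
joint = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}
HX = H({x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)})
HY = H({y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)})
H_X_given_Y = H(joint) - HY      # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = HX - H_X_given_Y          # I(X;Y) = H(X) - H(X|Y)
# Here H(X) = 1 and H(X|Y) = h(1/4) ~ 0.811, so Y reveals ~0.189 bits about X.
```

The chain-rule identity H(X|Y) = H(X,Y) − H(Y) used above is equivalent to the per-outcome average in the slide's definition.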
7-9 Information complexity. Alice holds x ∈ {0,1}ⁿ with private coins r_a, Bob holds y ∈ {0,1}ⁿ with private coins r_b, and they share public coins r_pub. Fix a protocol π and a distribution μ over inputs, and let (X, Y, Π) be: sample (X, Y) ∼ μ and let Π be the transcript of π(X, Y). CC(π) = max_{x, y, random coins} |π(x, y)|. IC_μ(π) = I(X ; Π | Y) + I(Y ; Π | X): the information Alice learns about Bob's input from the transcript, plus vice versa. π is (μ, ε)-good for f if Pr_{(X,Y)∼μ}[π(X, Y) = f(X, Y)] ≥ 1 − ε. R_{μ,ε}(f) = inf_{π (μ,ε)-good for f} CC(π) and R_ε(f) = sup_μ R_{μ,ε}(f); IC_{μ,ε}(f) = inf_{π (μ,ε)-good for f} IC_μ(π) and IC_ε(f) = sup_μ IC_{μ,ε}(f).
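For a concrete toy instance of these definitions, take the one-bit protocol in which Alice simply sends x, under an illustrative correlated distribution μ. Its worst-case communication is 1 bit, but its information cost under μ is only H(X|Y) < 1, since Bob already knows something about x. A minimal sketch (the distribution and all names are mine, not from the talk):

```python
from math import log2

def H(dist):
    """Shannon entropy of {outcome: probability}."""
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

def marg(joint, idx):
    """Marginal of a joint distribution {tuple: prob} on coordinates idx."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def I_cond(joint, a, b, c):
    """Conditional mutual information I(A; B | C), via the identity
    I(A; B | C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)."""
    return (H(marg(joint, a + c)) + H(marg(joint, b + c))
            - H(marg(joint, a + b + c)) - H(marg(joint, c)))

# mu: X uniform, Y a 1/4-noisy copy of X. Protocol: Alice sends x, so Pi = X.
mu = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/8, (1, 1): 3/8}
joint = {(x, y, x): p for (x, y), p in mu.items()}   # coordinates (X, Y, Pi)

ic = I_cond(joint, [0], [2], [1]) + I_cond(joint, [1], [2], [0])
# CC(pi) = 1 bit, while IC_mu(pi) = I(X;Pi|Y) + I(Y;Pi|X) = H(X|Y) + 0 ~ 0.811
```

The second term I(Y ; Π | X) vanishes because the transcript is a function of Alice's input alone; interactive protocols make both terms nontrivial.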
10-11 Information vs communication as a compression question. Source coding theorem [Shannon 48]: given a source X with entropy H(X), one can encode X using H(X) + 1 bits on average (Huffman coding); asymptotically, H(X) bits suffice. So for non-interactive, one-way communication, information cost = communication cost: Alice simply sends Encoding(x). What about interaction?
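Shannon's bound is easy to see on a toy source. The sketch below builds Huffman codeword lengths (a standard textbook construction; the details and the chosen source are mine) and compares average code length with entropy; for the dyadic source chosen here the two coincide exactly.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword lengths of a Huffman code for the given probabilities."""
    # heap entries: (weight, tiebreak, symbols); the tiebreak keeps tuple
    # comparison away from the (non-comparable) symbol lists.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1              # every merge adds one bit of depth
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
avg = sum(p, l) if False else sum(p * l for p, l in zip(probs, lengths))
entropy = sum(p * log2(1 / p) for p in probs)
# Source coding theorem: H(X) <= avg < H(X) + 1. The source above is dyadic,
# so Huffman is exactly optimal: avg == entropy == 1.75 bits.
```

For non-dyadic sources the average length exceeds H(X) by less than one bit, which is exactly the "H(X)+1 on average" statement in the slide.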
12-14 Information vs. communication complexity. How close are IC and CC? Can we compress an interaction? Conjecture 1: for any ε, μ, f, it holds that R_{μ,ε}(f) = O(IC_{μ,ε}(f)). Conjecture 2: for any ε, f, it holds that R_ε(f) = O(IC_ε(f)). Main application: Conjecture 1 (or 2) implies a direct sum theorem for CC: R_{μ,ε}(f^k) ≥ IC_{μ,ε}(f^k) = k · IC_{μ,ε}(f) = Ω(k · R_{μ,ε}(f)) [Braverman 11]. Known relations: [Braverman-Rao 11]: IC(f) = lim_k R(f^k)/k. [Braverman 12]: for any π there exists τ with CC(τ) ≤ 2^{O(IC(π))}. [Gavinsky-Lovett 14]: the log-rank conjecture for IC implies it for CC.
15-19 [Diagram: the hierarchy of lower bound methods for communication complexity, with the partition bound the strongest and discrepancy the weakest; it also contains the relaxed partition bound, the smooth rectangle bound, the factorization norm, and the rectangle/corruption bound [Linial-Shraibman 09, Jain-Klauck 10, Laplante-Lerays-Roland 12]. Example problems at each level: GapHamming via smooth rectangle [Chakrabarti-Regev 11, Sherstov 11]; (quantum) Disjointness via the factorization norm [Sherstov 08]; VectorInSubspace [Klartag-Regev 11]; Disjointness via rectangle/corruption [Kalyanasundaram-Schnitger 87, Razborov 92]; Inner Product via discrepancy. The following slides add Information Complexity to the diagram: [Braverman-Weinstein 11] show IC subsumes discrepancy, and [KLLRX 12] show IC subsumes the relaxed partition bound.]
20 Main theorem of KLLRX 12. Theorem: for all ε, μ, f, it holds that IC_{μ,ε}(f) = Ω(log rprt_{μ,ε}(f)), i.e. IC subsumes the relaxed partition bound. Corollaries: 1. All known CC lower bounds imply the same lower bound on IC. 2. All known problems with tight lower bounds satisfy direct sum theorems: R_{μ^k,ε}(f^k) ≥ IC_{μ^k,ε}(f^k) ≥ k · IC_{μ,ε}(f) = Ω(k · R_{μ,ε}(f)). 3. Quantum one-way communication can be exponentially smaller than IC: for the VectorInSubspace problem [Klartag-Regev 11], O(log n) vs. Ω(n^{1/3}). 4. IC_{μ,ε}(GapHamming) = Ω(n), and direct sum [CR 11, Sherstov 11].
21-24 Zero-communication protocols. Let P compute f with error ε using C bits of communication. Then there exists a zero-communication protocol π which can abort, such that for all x, y: π(x, y) does not abort with probability η = 2^{−C}, and Pr[π(x, y) ≠ f(x, y) | no abort] ≤ ε.
25-27 Partition bound and ZC protocols. A zero-communication protocol π satisfies: for all x, y, π(x, y) does not abort with probability η (its efficiency), and Pr[π(x, y) ≠ f(x, y) | no abort] ≤ ε. Efficiency of f: η_{μ,ε}(f) = sup_π η(π). Theorem: R_{μ,ε}(f) = Ω(log 1/η_{μ,ε}(f)). Theorem [LLR 12]: prt_{μ,ε}(f) = 1/η_{μ,ε}(f).
28-29 Relaxed partition bound and ZC protocols. Now measure the relative efficiency of a zero-communication protocol (again: π(x, y) does not abort with probability η, and Pr[π(x, y) ≠ f(x, y) | no abort] ≤ ε). Thm 1 [KLLRX 12]: rprt_{μ,ε}(f) = 1/η_{μ,ε}(f), for η the relative efficiency. Thm 2 [KLLRX 12]: for any protocol, there is a ZC protocol with relative efficiency 2^{−O(IC)}.
30 Compressing IC to zero communication. Public coins r_pub = (t₁, t₂, t₃, ..., t_K) and a hash h; Alice holds x ∈ {0,1}ⁿ, Bob holds y ∈ {0,1}ⁿ. Step 1: create a set of individually accepted transcripts. Using r_pub, generate candidate transcripts t₁, ..., t_K (K ≈ 2^{CC(π) + O(IC(π))}). Alice and Bob each decide whether to accept each tᵢ. Conditioned on tᵢ being accepted, the distribution of tᵢ is the same as that of π(x, y); roughly 2^{O(IC(π))} transcripts are accepted by each player. Step 2: find a common accepted transcript. Using r_pub, pick a hash function that gives 0 with probability 2^{−O(IC(π))}. Alice and Bob: if there exists an accepted transcript that hashes to 0, output according to it; else Abort. Pr[no abort] ≥ 2^{−O(IC(π))}.
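Step 2 can be illustrated in isolation. A toy sketch with illustrative sizes (this is not the actual compression protocol, only the "some accepted transcript hashes to 0" event): if a player accepts m transcripts and the public hash sends each transcript to 0 independently with probability 1/m, the non-abort probability is about 1 − (1 − 1/m)^m ≈ 1 − 1/e.

```python
import random

def hash_step(accepted, universe_size, q, rng):
    """Public hash: each transcript independently 'hashes to 0' with
    probability q; the players avoid aborting iff some accepted transcript
    lands in the hash's zero set."""
    zeros = {t for t in range(universe_size) if rng.random() < q}
    return bool(accepted & zeros)

rng = random.Random(1)
m, universe = 32, 256
accepted = set(range(m))         # stand-in for the ~2^O(IC) accepted transcripts
trials = 2000
hits = sum(hash_step(accepted, universe, 1 / m, rng) for _ in range(trials))
p_hat = hits / trials            # about 1 - (1 - 1/m)^m, i.e. roughly 1 - 1/e
```

Setting the hash's zero probability near the inverse of the accepted-set size (i.e. 2^{−O(IC)}) keeps the non-abort probability at the 2^{−O(IC)} scale claimed on the slide while keeping collisions between differently accepted transcripts rare.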
32 CC =? O(IC)
33-36 [GKR 14] show an exponential gap between IC and CC for a fixed distribution μ and a partial Boolean function f. The function: the bursting noise function. The new lower bound technique: relative discrepancy. That is, there exist f and μ with IC_μ(f, ε) ≪ CC_μ(f, ε); concretely, IC_μ = log log n vs. CC_μ = log n. [Diagram: rdisc joins the hierarchy below rprt and srect.] This refutes Conjecture 1 (for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f))), which is replaced by the New Conjecture 1: for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f)) · log n.
37 Understanding Relative discrepancy Can relative discrepancy separate IC from CC irrespective of the distribution on the inputs? How does relative discrepancy compare to other known bounds? What methods do we know that can still separate IC from CC?
38 Part I: the non-distributional case. [Diagram: CC is bounded below by prt and rprt; IC is bounded below by rprt, and this talk places rdisc below rprt.]
39 Partition bound. Def [JK 10] (dual LP formulation): for any Boolean function f, prt_ε(f) = max_{{α_xy}, {β_xy}} β(X × Y) − ε · α(X × Y), subject to: β(R) − α(R ∩ f⁻¹(z)) ≤ 1 for all rectangles R and outputs z, and α_xy ≥ 0 for all x, y (where α(S) = Σ_{(x,y)∈S} α_xy, and similarly for β). Thm: CC_ε(f) ≥ log(prt_ε(f)).
40 Relaxed partition bound. Def (dual LP formulation): rprt_ε(f) is the same program with the additional constraint α_xy ≥ β_xy ≥ 0 for all x, y. Thm: CC_ε(f) ≥ log(prt_ε(f)) ≥ log(rprt_ε(f)).
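To make the dual program concrete, here is a small feasibility checker for the relaxed-partition program stated above (objective β(X × Y) − ε·α(X × Y), one constraint per rectangle and output), run on 1-bit EQUALITY. The candidate point (α, β) is purely illustrative, and the formulation follows the slide's reconstruction, so treat it as a sketch.

```python
from itertools import chain, combinations

def subsets(s):
    """All nonempty subsets of s (rows or columns of a rectangle)."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(1, len(s) + 1))

def rprt_objective(f, X, Y, alpha, beta, eps):
    """Check (alpha, beta) against the relaxed-partition dual program and
    return beta(XxY) - eps * alpha(XxY), or None if some rectangle
    constraint beta(R) - alpha(R ∩ f^-1(z)) <= 1 is violated."""
    assert all(alpha[(x, y)] >= beta[(x, y)] >= 0 for x in X for y in Y)
    for A in subsets(X):
        for B in subsets(Y):
            R = [(x, y) for x in A for y in B]
            for z in (0, 1):
                if sum(beta[p] for p in R) - sum(alpha[p] for p in R if f(*p) == z) > 1:
                    return None           # constraint violated: infeasible
    return sum(beta.values()) - eps * sum(alpha.values())

X = Y = (0, 1)
eq = lambda x, y: int(x == y)                 # 1-bit EQUALITY
alpha = {(x, y): 0.5 for x in X for y in Y}   # illustrative candidate point
beta = dict(alpha)
val = rprt_objective(eq, X, Y, alpha, beta, eps=0.1)   # feasible: 2 - 0.1 * 2
bad = rprt_objective(eq, X, Y, {p: 2.0 for p in alpha},
                     {p: 2.0 for p in beta}, 0.1)      # scaled up: infeasible
```

Any feasible point certifies a lower bound, since rprt_ε(f) is a maximum over feasible (α, β); an LP solver would search this polytope for the optimum.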
41-42 Relative discrepancy. Def [GKR 14] (NB: not an LP): rdisc_ε(f, μ) = sup_{κ, δ, {ρ_xy}} (1/δ)(½ − κ − ε), subject to: μ(R ∩ f⁻¹(z)) ≥ (½ − κ) · ρ(R) for all R, z such that ρ(R) ≥ δ; ρ(X × Y) = 1; ρ_xy ≥ 0 for all x, y; 0 ≤ κ < ½ and 0 < δ < 1. rdisc_ε(f) = max_μ rdisc_ε(f, μ). The distribution ρ rescales the rectangle size: only rectangles that are large under ρ are constrained.
43 Relative discrepancy ≤ relaxed partition. Proof idea: take any feasible solution for rdisc and apply a change of variables to obtain a feasible solution to rprt with a higher objective value.
44-45 Relative discrepancy ≤ relaxed partition. [Side-by-side comparison: the rdisc program, sup_{κ, δ, {ρ_xy}} (1/δ)(½ − κ − ε) subject to its balance constraints, and the rprt program, max_{{α_xy}, {β_xy}} β(X × Y) − ε · α(X × Y) subject to β(R) − α(R ∩ f⁻¹(z)) ≤ 1 and α_xy ≥ β_xy ≥ 0. The change of variables maps the pair (μ, ρ) of an rdisc solution to the pair (α, β) of an rprt solution.]
46 Part I: the non-distributional case. [Diagram: CC above prt and rprt; IC above rprt and rdisc.] Can relative discrepancy separate IC from CC irrespective of the distribution on the inputs? NO.
47-50 Part II: fixed μ. The distributional partition bound: prt_ε(f; μ) = max_{α, {β_xy}} β(X × Y) − ε · α · μ(X × Y), subject to: β(R) − α · μ(R ∩ f⁻¹(z)) ≤ 1 for all R, z, and α ≥ 0. (Compare the non-distributional program: there α is a full set of variables α_xy, while here it is a single scalar multiplying μ; the same (μ, ρ) to (α, β) change of variables applies.) Conclusion: distributional: exponential separation; non-distributional: collapse.
51 Lower bounds with partitions. pprt (public-coin partition bound): [JLV 14] show it is a quadratically tight bound for CC. ardisc (adaptive relative discrepancy): the strong new bound of [GKR 14]. [Diagram: pprt and ardisc added to the hierarchy alongside prt, rprt, and rdisc.] Remark: both use partitions and not rectangles.
52 ardisc vs. pprt. [Diagram: the full hierarchy under CC and IC: prt, rprt, rank (rk), smooth rectangle (srect), rectangle (rec), smooth discrepancy (sdisc), γ₂, and discrepancy (disc), with ardisc and pprt placed among them.]
53-54 Information vs. communication complexity, revisited. How close are IC and CC? Can we compress an interaction? New Conjecture 1: for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f)) · log n. Conjecture 2: for any ε, f, R_ε(f) = O(IC_ε(f)). [GKR 15]: there exists f with IC(f, ε) ≪ CC(f, ε); here IC = log log n vs. CC = log n. New Conjecture 2: for any ε, f, R_ε(f) = O(IC_ε(f)) · log n.
55 Summary. An LP formulation of distributional relative discrepancy: the partition bound plus a nonnegativity constraint. Non-distributional relative discrepancy is at most IC. The partition bound is not a lower bound on IC for fixed distributions. ardisc is quadratically tight for communication complexity.
56 Open problems. An n vs. log(n) separation of distributional IC and CC. Separations for wprt/pprt and for prt/rprt. Is the partition bound a lower bound on IC? (This requires new compression techniques that achieve constant-efficiency ZCPs.) Is it tight for CC? Recover a communication protocol from a ZCP.
57 Thank you
58 Compressing CC to zero communication. Public coins r_pub (= a guessed transcript t); Alice holds x ∈ {0,1}ⁿ, Bob holds y ∈ {0,1}ⁿ. Theorem: if π is (μ, ε)-good for f using C bits of communication, then one can build a ZCP τ that is (μ, ε)-good for f with η(τ) ≥ 2^{−CC(π)}. Proof: τ uses r_pub to guess a random transcript t. Alice (resp. Bob) checks whether t is consistent with x (resp. y). If so, output what π outputs on t; otherwise output Abort. Pr[t consistent, no abort] ≥ 2^{−CC(π)}, hence CC_{μ,ε}(f) = Ω(log 1/η_{μ,ε}(f)).
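The proof sketch above can be simulated directly on a toy protocol. A hedged sketch (the function, inputs, and seed are illustrative): take π to be "Alice sends her 2-bit input", so CC(π) = 2; the zero-communication version should avoid aborting with probability about 2^{−2} = 1/4, and it never errs conditioned on not aborting.

```python
import random

def zc_simulation(x, y, f, rng):
    """Zero-communication version of the protocol 'Alice sends her 2-bit
    input': public coins guess a transcript t; Alice aborts unless t is
    consistent with her input (t == x); on non-abort Bob outputs f(t, y)."""
    t = (rng.randrange(2), rng.randrange(2))   # public random transcript guess
    if t != x:
        return None                            # Abort
    return f(t, y)

inner = lambda u, v: (u[0] & v[0]) ^ (u[1] & v[1])   # illustrative f: inner product
rng = random.Random(7)
runs = [zc_simulation((1, 0), (1, 1), inner, rng) for _ in range(8000)]
non_abort = [r for r in runs if r is not None]
rate = len(non_abort) / len(runs)
# rate is close to 2^-CC = 1/4, and every non-aborting run outputs correctly
```

This is exactly the transcript-guessing argument: the guessed transcript survives both players' consistency checks with probability 2^{−CC(π)}, and a surviving transcript reproduces π's output.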
59 Compressing IC to zero communication. Public coins r_pub = (t₁, t₂, t₃, ..., t_K) and a hash h; Alice holds x ∈ {0,1}ⁿ, Bob holds y ∈ {0,1}ⁿ. Theorem: for all ε, μ, f, it holds that IC_{μ,ε}(f) = Ω(log 1/η_{μ,ε}(f)).
60 Public-coin partition bound. The maximum efficiency of a zero-communication protocol of the following form: the players pick a partition at random, play following a uniformly distributed rectangle in the partition, and abort if either input is not in the rectangle.
61 Adaptive relative discrepancy. Def [GKR 14] (NB: not an LP): ardisc_ε(f, μ) = sup_{κ, δ, {ρ^P_xy}} (1/δ)(½ − κ − ε), subject to: μ(R ∩ f⁻¹(z)) ≥ (½ − κ) · ρ^P(R) for all partitions P and all (R, z) ∈ P such that ρ^P(R) ≥ δ; ρ^P(X × Y) = 1; ρ^P_xy ≥ 0 for all x, y; 0 ≤ κ < ½ and 0 < δ < 1. ardisc_ε(f) = max_μ ardisc_ε(f, μ).
More informationMatrix Rank in Communication Complexity
University of California, Los Angeles CS 289A Communication Complexity Instructor: Alexander Sherstov Scribe: Mohan Yang Date: January 18, 2012 LECTURE 3 Matrix Rank in Communication Complexity This lecture
More informationarxiv: v1 [cs.cc] 6 Apr 2010
arxiv:1004.0817v1 [cs.cc] 6 Apr 2010 A Separation of NP and conp in Multiparty Communication Complexity Dmitry Gavinsky Alexander A. Sherstov Abstract We prove that NP conp and conp MA in the number-onforehead
More informationMultiparty Computation
Multiparty Computation Principle There is a (randomized) function f : ({0, 1} l ) n ({0, 1} l ) n. There are n parties, P 1,...,P n. Some of them may be adversarial. Two forms of adversarial behaviour:
More informationLower bounds for Edit Distance and Product Metrics via Poincaré-Type Inequalities
Lower bounds for Edit Distance and Product Metrics via Poincaré-Type Inequalities Alexandr Andoni Princeton U./CCI andoni@mit.edu T.S. Jayram IBM Almaden jayram@almaden.ibm.com Mihai Pǎtraşcu AT&T Labs
More informationCompute the Fourier transform on the first register to get x {0,1} n x 0.
CS 94 Recursive Fourier Sampling, Simon s Algorithm /5/009 Spring 009 Lecture 3 1 Review Recall that we can write any classical circuit x f(x) as a reversible circuit R f. We can view R f as a unitary
More informationSpectral gaps and geometric representations. Amir Yehudayoff (Technion)
Spectral gaps and geometric representations Amir Yehudayoff (Technion) Noga Alon (Tel Aviv) Shay Moran (Simons) pseudorandomness random objects have properties difficult to describe hard to compute expanders
More informationTight Bounds for Single-Pass Streaming Complexity of the Set Cover Problem
Tight Bounds for Single-Pass Streaming Complexity of the Set Cover Problem Sepehr Assadi Department of Computer and Information Science University of Pennsylvania Philadelphia, PA, USA sassadi@cis.upenn.edu
More informationAn Approximation Algorithm for Approximation Rank
An Approximation Algorithm for Approximation Rank Troy Lee Columbia University Adi Shraibman Weizmann Institute Conventions Identify a communication function f : X Y { 1, +1} with the associated X-by-Y
More informationNonlinear Systems and Control Lecture # 19 Perturbed Systems & Input-to-State Stability
p. 1/1 Nonlinear Systems and Control Lecture # 19 Perturbed Systems & Input-to-State Stability p. 2/1 Perturbed Systems: Nonvanishing Perturbation Nominal System: Perturbed System: ẋ = f(x), f(0) = 0 ẋ
More informationMaximal Noise in Interactive Communication over Erasure Channels and Channels with Feedback
Maximal Noise in Interactive Communication over Erasure Channels and Channels with Feedback Klim Efremenko UC Berkeley klimefrem@gmail.com Ran Gelles Princeton University rgelles@cs.princeton.edu Bernhard
More informationCommunication Lower Bounds via Critical Block Sensitivity
Communication Lower Bounds via Critical Block Sensitivity Mika Göös & Toniann Pitassi University of Toronto Göös & Pitassi (Univ. of Toronto) Communication Lower Bounds 13th January 2014 1 / 18 Communication
More informationHardness of MST Construction
Lecture 7 Hardness of MST Construction In the previous lecture, we saw that an MST can be computed in O( n log n+ D) rounds using messages of size O(log n). Trivially, Ω(D) rounds are required, but what
More informationECE 4400:693 - Information Theory
ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential
More informationMonotone Circuits for Matching Require. Linear Depth
Monotone Circuits for Matching Require Linear Depth Ran Raz Avi Wigderson The Hebrew University February 11, 2003 Abstract We prove that monotone circuits computing the perfect matching function on n-vertex
More informationHow to Compress Interactive Communication
How to Compress Interactive Communication Boaz Barak Mark Braverman Xi Chen Anup Rao October 15, 2010 Abstract We describe new ways to simulate 2-party communication protocols to get protocols with potentially
More informationAn approximation algorithm for approximation rank
An approximation algorithm for approximation rank Troy Lee Department of Computer Science Rutgers University Adi Shraibman Department of Mathematics Weizmann Institute of Science arxiv:0809.2093v1 [cs.cc]
More informationTight Bounds for Distributed Streaming
Tight Bounds for Distributed Streaming (a.k.a., Distributed Functional Monitoring) David Woodruff IBM Research Almaden Qin Zhang MADALGO, Aarhus Univ. STOC 12 May 22, 2012 1-1 The distributed streaming
More informationComputer Science A Cryptography and Data Security. Claude Crépeau
Computer Science 308-547A Cryptography and Data Security Claude Crépeau These notes are, largely, transcriptions by Anton Stiglic of class notes from the former course Cryptography and Data Security (308-647A)
More informationCS Introduction to Complexity Theory. Lecture #11: Dec 8th, 2015
CS 2401 - Introduction to Complexity Theory Lecture #11: Dec 8th, 2015 Lecturer: Toniann Pitassi Scribe Notes by: Xu Zhao 1 Communication Complexity Applications Communication Complexity (CC) has many
More informationLower Bounds for Number-in-Hand Multiparty Communication Complexity, Made Easy
Lower Bounds for Number-in-Hand Multiparty Communication Complexity, Made Easy Jeff M. Phillips School of Computing University of Utah jeffp@cs.utah.edu Elad Verbin Dept. of Computer Science Aarhus University,
More informationInternal Compression of Protocols to Entropy
Internal Compression of Protocols to Entropy Balthazar Bauer 1, Shay Moran 2, and Amir Yehudayoff 3 1 Département d Infomatique, ENS Lyon Lyon, France balthazarbauer@aol.com 2 Departments of Computer Science,
More informationLinear sketching for Functions over Boolean Hypercube
Linear sketching for Functions over Boolean Hypercube Grigory Yaroslavtsev (Indiana University, Bloomington) http://grigory.us with Sampath Kannan (U. Pennsylvania), Elchanan Mossel (MIT) and Swagato Sanyal
More informationA Polynomial Time Algorithm for Lossy Population Recovery
A Polynomial Time Algorithm for Lossy Population Recovery Ankur Moitra Massachusetts Institute of Technology joint work with Mike Saks A Story A Story A Story A Story A Story Can you reconstruct a description
More informationQuantum Communication Complexity
Quantum Communication Complexity Hartmut Klauck FB Informatik, Johann-Wolfgang-Goethe-Universität 60054 Frankfurt am Main, Germany Abstract This paper surveys the field of quantum communication complexity.
More informationStream Computation and Arthur- Merlin Communication
Stream Computation and Arthur- Merlin Communication Justin Thaler, Yahoo! Labs Joint Work with: Amit Chakrabarti, Dartmouth Graham Cormode, University of Warwick Andrew McGregor, Umass Amherst Suresh Venkatasubramanian,
More informationA direct product theorem for the two-party bounded-round public-coin communication complexity
A direct product theorem for the two-party bounded-round public-coin communication complexity Rahul Jain Centre for Quantum Technologies and Department of Computer Science National University of Singapore
More informationTopology Matters in Communication
Electronic Colloquium on Computational Complexity, Report No. 74 204 Topology Matters in Communication Arkadev Chattopadhyay Jaikumar Radhakrishnan Atri Rudra May 4, 204 Abstract We provide the first communication
More informationComputing and Communications 2. Information Theory -Entropy
1896 1920 1987 2006 Computing and Communications 2. Information Theory -Entropy Ying Cui Department of Electronic Engineering Shanghai Jiao Tong University, China 2017, Autumn 1 Outline Entropy Joint entropy
More informationInformation Theoretic Limits of Randomness Generation
Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication
More informationLecture 6: Quantum error correction and quantum capacity
Lecture 6: Quantum error correction and quantum capacity Mark M. Wilde The quantum capacity theorem is one of the most important theorems in quantum hannon theory. It is a fundamentally quantum theorem
More informationIntroduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information.
L65 Dept. of Linguistics, Indiana University Fall 205 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission rate
More informationDept. of Linguistics, Indiana University Fall 2015
L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 28 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission
More information