Communication vs information complexity, relative discrepancy and other lower bounds

1 Communication vs information complexity, relative discrepancy and other lower bounds. Iordanis Kerenidis, CNRS, LIAFA, Univ Paris Diderot 7. Joint work with: L. Fontes, R. Jain, S. Laplante, M. Lauriere, J. Roland.

2 Communication complexity. x ∈ {0,1}^n, y ∈ {0,1}^n. Alice and Bob want to compute f(x, y). Minimal communication needed? Trivial protocol with communication n. Initiated by [Yao 79]. Wide applications to circuit complexity, VLSI, streaming, distributed computing, data structures, etc.

3-4 Measuring the complexity of communication. Basic measure: the number of bits transmitted. But bits may be useless. Example: EQUALITY. Let M be a random invertible n x n matrix; Alice has x ∈ {0,1}^n, Bob has y ∈ {0,1}^n. Alice sends (Mx)_1, (Mx)_2, ...; if (Mx)_i ≠ (My)_i then output 0. Many bits are communicated, Ω(n) on average, BUT Bob learns only O(1) bits about x. Question: how to measure the amount of information transmitted?
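A minimal sketch of this protocol (illustrative, not from the slides; for simplicity each shared row m is drawn uniformly at random rather than taken from a fixed invertible M, which already gives per-round error 1/2 when x ≠ y):

```python
import numpy as np

def equality_test(x, y, k, rng):
    """Shared-randomness EQUALITY test: compare k random parities.

    Each round, Alice sends the parity <m, x> mod 2 for a shared random
    row m; Bob compares it with <m, y>. If x != y, the parities differ
    with probability 1/2 per round, so k rounds err with prob 2^-k.
    """
    n = len(x)
    for _ in range(k):
        m = rng.integers(0, 2, size=n)      # shared random row of M
        if (m @ x) % 2 != (m @ y) % 2:      # Alice's bit vs Bob's bit
            return 0                         # definitely unequal
    return 1                                 # equal (with high probability)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=64)
y = x.copy(); y[0] ^= 1                      # differ in one coordinate
print(equality_test(x, x, 20, rng))          # 1
print(equality_test(x, y, 20, rng))          # 0 with prob 1 - 2^-20
```

Each transmitted bit is a fresh random parity of x, which is why Bob ends up learning only O(1) bits of information about x despite the many bits communicated.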

5 Outline. 1. Introducing information cost and information complexity. 2. Relating information to communication complexity. 3. Zero-communication protocols and information complexity. 4. Information complexity is bigger than relative discrepancy.

6 Entropy. Entropy: H(X) = Σ_{x ∈ supp(X)} Pr[X = x] log(1/Pr[X = x]), the uncertainty about X. Conditional entropy: H(X | Y) = Σ_{y ∈ supp(Y)} Pr[Y = y] H(X | Y = y), the uncertainty about X knowing Y. Mutual information: I(X ; Y) = H(X) - H(X | Y): "how much information does Y reveal about X". Conditional mutual information: I(X ; Y | Z) = H(X | Z) - H(X | YZ): "already knowing Z, how much additional information does Y reveal about X".
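A small worked example (added for illustration) computing these quantities for a toy joint distribution, using the chain rule H(X | Y) = H(X, Y) - H(Y):

```python
import math

def H(p):
    """Shannon entropy of a distribution given as a dict value -> prob."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Toy joint distribution of (X, Y): X a uniform bit, Y = X with prob 0.9.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
pX = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (0, 1)}
pY = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1)}

HXY = H(joint)                       # joint entropy H(X, Y)
HX_given_Y = HXY - H(pY)             # chain rule: H(X|Y) = H(X,Y) - H(Y)
I = H(pX) - HX_given_Y               # I(X;Y) = H(X) - H(X|Y)
print(f"H(X)={H(pX):.3f}  H(X|Y)={HX_given_Y:.3f}  I(X;Y)={I:.3f}")
```

Here a noisy copy of X reveals about half a bit: the script prints H(X)=1.000, H(X|Y)=0.469, I(X;Y)=0.531.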

7-9 Information complexity. Alice has x ∈ {0,1}^n and private coins r_a; Bob has y ∈ {0,1}^n and private coins r_b; they also share public coins r_pub. Fix a protocol π and a distribution μ over inputs. (X, Y, Π): sample (X, Y) ∼ μ and let Π be the transcript of π(X, Y). CC(π) = max_{x, y, random coins} |π(x, y)|. IC_μ(π) = I(X ; Π | Y) + I(Y ; Π | X): the information Alice learns about Bob's input from the transcript, plus vice versa. π is (μ, ε)-good for f if Pr_{(X,Y) ∼ μ}[π(X, Y) = f(X, Y)] ≥ 1 - ε. R_{μ,ε}(f) = inf_{π (μ,ε)-good for f} CC(π) and R_ε(f) = sup_μ R_{μ,ε}(f). IC_{μ,ε}(f) = inf_{π (μ,ε)-good for f} IC_μ(π) and IC_ε(f) = sup_μ IC_{μ,ε}(f).
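A standard sanity check (my addition, not on the slide): for the trivial protocol in which Alice sends her whole input, the transcript is Π = X, giving

```latex
IC_\mu(\pi) \;=\; I(X;\Pi \mid Y) + I(Y;\Pi \mid X)
            \;=\; I(X;X \mid Y) + I(Y;X \mid X)
            \;=\; H(X \mid Y) + 0 \;\le\; n
```

so IC_{μ,ε}(f) ≤ n for every f and μ; the interesting question is when the information cost can be much smaller than the communication.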

10-11 Information vs communication as a compression question. Source Coding Theorem [Shannon 49]: given a source X with entropy H(X), one can encode X using at most H(X) + 1 bits on average (Huffman coding); asymptotically, H(X) bits suffice. Non-interactive, one-way communication, where Alice just sends Encoding(x): information cost = communication cost. What about interaction?
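A self-contained Huffman sketch (illustrative, not from the talk); for the dyadic distribution below, the average codeword length exactly equals H(X):

```python
import heapq, itertools, math

def huffman_code(probs):
    """Build a Huffman code for {symbol: prob}; returns {symbol: codeword}."""
    tie = itertools.count()                      # tie-breaker for equal probs
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)          # two least likely subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg = sum(p * len(code[s]) for s, p in probs.items())
H = -sum(p * math.log2(p) for p in probs.values())
print(code, f"avg={avg}", f"H={H}")              # avg length = H(X) = 1.75
```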

12-14 Information vs. communication complexity. How close are IC and CC? Can I compress an interaction? Conjecture 1: for any ε, μ, f, it holds that R_{μ,ε}(f) = O(IC_{μ,ε}(f)). Conjecture 2: for any ε, f, it holds that R_ε(f) = O(IC_ε(f)). Main application: Conjecture 1 (or 2) implies Direct Sum for CC: R_{μ^k,ε}(f^k) ≥ IC_{μ^k,ε}(f^k) = k · IC_{μ,ε}(f) ≥ Ω(k · R_{μ,ε}(f)) [Braverman 11]. Known results: [Braverman-Rao 11]: IC(f) = lim_k R(f^k)/k. [Braverman 12]: for any π, there exists τ with CC(τ) ≤ 2^{O(IC(π))}. [Gavinsky-Lovett 14]: the log-rank conjecture for IC implies the log-rank conjecture for CC.
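Spelling out the direct-sum chain with the inequality signs restored (standard reasoning; the last step is where the conjecture is used):

```latex
R_{\mu^k,\varepsilon}(f^k)
  \;\ge\; IC_{\mu^k,\varepsilon}(f^k)           % communication upper-bounds information
  \;=\;   k \cdot IC_{\mu,\varepsilon}(f)        % additivity of IC over independent copies
  \;\ge\; \Omega\bigl(k \cdot R_{\mu,\varepsilon}(f)\bigr)  % assuming Conjecture 1
```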

15-16 [Diagram: the hierarchy of rectangle-based lower bound methods on communication complexity (partition, relaxed partition, smooth rectangle, factorization norm, rectangle/corruption, discrepancy), annotated with flagship applications: Disjointness via rectangle/corruption [Kalyanasundaram-Schnitger 87, Razborov 92], quantum Disjointness [Sherstov 08], GapHamming via smooth rectangle [Chakrabarti-Regev 11, Sherstov 11], VectorInSubspace [Klartag-Regev 11], Inner Product via discrepancy; relations among the bounds from [Linial-Shraibman 09, Jain-Klauck 10, Laplante-Lerays-Roland 12].]

17-19 [Diagram: the same hierarchy with Information Complexity placed next to Communication Complexity: IC was known to subsume discrepancy [Braverman-Weinstein 11]; this work [KLLRX 12] shows it also subsumes the relaxed partition bound.]

20 Main theorem of KLLRX12. Theorem: for all ε, μ, f, it holds that IC_{μ,ε}(f) ≥ Ω(log rprt_{μ,ε}(f)), i.e. IC subsumes the relaxed partition bound. Corollaries: 1. All known CC lower bounds imply the same lower bound on IC. 2. All known problems with tight lower bounds satisfy Direct Sum theorems: R_{μ^k,ε}(f^k) ≥ IC_{μ^k,ε}(f^k) ≥ k · IC_{μ,ε}(f) = Θ(k · R_{μ,ε}(f)). 3. Quantum one-way communication can be exponentially smaller than IC: the VectorInSubspace problem [Klartag-Regev 11]: O(log n) vs. Ω(n^{1/3}). 4. IC_{μ,ε}(GapHamming) = Ω(n), and Direct Sum for GapHamming ([CR 11, Sherstov 11]).

21-24 Zero-communication protocols. Let P compute f with error ε using C bits of communication. Then there exists a zero-communication protocol π which can abort, such that: ∀x, y, π(x, y) does not abort with probability η = 2^{-C}, and Pr[π(x, y) ≠ f(x, y) | non-abort] ≤ ε.

25-27 Partition bound and ZC protocols. Zero-communication protocol: ∀x, y, π(x, y) does not abort with probability ≥ η (the efficiency), and Pr[π(x, y) ≠ f(x, y) | non-abort] ≤ ε. Efficiency of f: η_{μ,ε}(f) = sup_π η(π) over such protocols. Theorem: R_{μ,ε}(f) ≥ Ω(log 1/η_{μ,ε}(f)). Theorem [LLR12]: prt_{μ,ε}(f) = 1/η_{μ,ε}(f).

28-29 Relaxed partition bound and ZC protocols. Zero-communication protocol: ∀x, y, π(x, y) does not abort with probability ≥ η (the relaxed efficiency), and Pr[π(x, y) ≠ f(x, y) | non-abort] ≤ ε. Efficiency of f: η_{μ,ε}(f) = sup_π η(π). Thm 1 [KLLRX12]: rprt_{μ,ε}(f) = 1/η_{μ,ε}(f). Thm 2 [KLLRX12]: for any protocol π, there is a ZC protocol with relaxed efficiency 2^{-O(IC(π))}.

30 Compressing IC to zero communication. x ∈ {0,1}^n, y ∈ {0,1}^n; r_pub = (t_1, t_2, t_3, ..., t_K), hash h. 1. Create a set of individually accepted transcripts: using r_pub, generate candidate transcripts t_1, ..., t_K (K ≈ 2^{CC(π) + O(IC(π))}); Alice and Bob each decide whether to accept each t_i; conditioned on t_i being accepted, the distribution of t_i is the same as that of π(x, y); roughly 2^{O(IC(π))} transcripts are accepted by each player. 2. Find a common accepted transcript: using r_pub, pick a hash function h that gives 0 w.p. 2^{-O(IC(π))}; if there exists an accepted transcript that hashes to 0, each player outputs according to it, else Abort. Pr[not abort] ≥ 2^{-O(IC(π))}.
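Step 2 can be caricatured as follows (a toy abstraction under simplifying assumptions, not the actual analysis): each player holds a set of accepted candidate indices, the shared hash marks each candidate independently with probability p ≈ 2^{-O(IC(π))}, and a round succeeds when both players see exactly one marked accepted transcript and it is the same one.

```python
import random

def hash_round(A, B, p, rng):
    """One zero-communication attempt to agree on a common transcript.

    A and B are the index sets of transcripts accepted by Alice and Bob;
    the shared hash 'marks' each candidate independently with prob p.
    Each player can act only on its own accepted marked transcripts, so
    we declare success when both see exactly one and they coincide.
    """
    marked = {t for t in A | B if rng.random() < p}
    mA, mB = A & marked, B & marked
    if len(mA) == len(mB) == 1 and mA == mB:
        return next(iter(mA))        # both output according to this t
    return None                      # abort

rng = random.Random(1)
A = set(range(0, 150))               # ~2^O(IC) transcripts accepted by Alice
B = set(range(100, 250))             # ~2^O(IC) accepted by Bob, overlap 50
hits = sum(hash_round(A, B, 1.0 / 200, rng) is not None
           for _ in range(100000))
print(hits / 100000)                 # non-abort probability on the p scale
```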

31 [Diagram repeated: Communication Complexity and Information Complexity over the hierarchy of bounds, with IC subsuming the relaxed partition bound [KLLRX 12].]

32 CC =? O(IC)

33-36 [GKR14] show an exponential gap between IC and CC for a fixed distribution μ and a partial Boolean function f. Function: the bursting noise function. New lower bound technique: relative discrepancy. ∃ f, μ: IC_μ(f, ε) ≪ CC_μ(f, ε); concretely, IC_μ = log log n vs. CC_μ = log n. [Diagram: lattice of bounds (prt, rk + rec, rprt, srect, rdisc_μ, IC_μ, sdisc, γ2, disc) around the question CC =? O(IC).] This refutes Conjecture 1 (for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f))), which is replaced by New Conjecture 1: for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f)) · log n.

37 Understanding relative discrepancy. Can relative discrepancy separate IC from CC irrespective of the distribution on the inputs? How does relative discrepancy compare to other known bounds? What methods do we know that can still separate IC from CC?

38 Part I: Non-distributional case. [Diagram: CC is lower-bounded by prt, IC by rprt, and rprt by rdisc (this talk).]

39 Partition bound. Def [JK10] (dual LP formulation): for any Boolean function f, prt_ε(f) = max_{{α_xy}, {β_xy}} β(X×Y) - ε·α(X×Y) subject to β(R) - α(R ∩ f^{-1}(z)) ≤ 1 for all R, z, and α_xy ≥ 0 for all x, y. Thm: CC_ε(f) ≥ log(prt_ε(f)).

40 Relaxed partition bound. Def (dual LP formulation): for any Boolean function f, rprt_ε(f) = max_{{α_xy}, {β_xy}} β(X×Y) - ε·α(X×Y) subject to β(R) - α(R ∩ f^{-1}(z)) ≤ 1 for all R, z, and α_xy ≥ β_xy ≥ 0 for all x, y. Thm: CC_ε(f) ≥ log(prt_ε(f)) ≥ log(rprt_ε(f)).
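To see what this dual LP looks like in practice, here is an illustrative sketch (my construction, not from the talk) that computes rprt_ε for EQUALITY on 1-bit inputs with scipy, enumerating all rectangles S×T and both outputs z; with ε = 0 it returns 4, consistent with log(rprt) = 2 = CC.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Relaxed partition bound (dual LP) for 1-bit EQUALITY, as a sketch.
# Variables: alpha_xy and beta_xy for the four inputs (x, y).
cells = [(x, y) for x in (0, 1) for y in (0, 1)]
idx = {c: i for i, c in enumerate(cells)}          # alpha at i, beta at i + 4
f = {c: int(c[0] == c[1]) for c in cells}          # EQUALITY
eps = 0.0

# Objective: maximize beta(X x Y) - eps * alpha(X x Y); linprog minimizes.
c = np.array([eps] * 4 + [-1.0] * 4)

A_ub, b_ub = [], []
# For every rectangle R = S x T and every output z:
#   beta(R) - alpha(R intersect f^{-1}(z)) <= 1
subsets = [(0,), (1,), (0, 1)]
for S, T in itertools.product(subsets, subsets):
    R = [(x, y) for x in S for y in T]
    for z in (0, 1):
        row = np.zeros(8)
        for cell in R:
            row[4 + idx[cell]] += 1.0              # + beta_xy
            if f[cell] == z:
                row[idx[cell]] -= 1.0              # - alpha_xy
        A_ub.append(row); b_ub.append(1.0)
# Relaxation constraint alpha_xy >= beta_xy >= 0, i.e. beta - alpha <= 0.
for cell in cells:
    row = np.zeros(8)
    row[4 + idx[cell]] = 1.0
    row[idx[cell]] = -1.0
    A_ub.append(row); b_ub.append(0.0)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * 8)
print("rprt =", -res.fun)                          # 4.0, so log(rprt) = 2
```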

41-42 Relative discrepancy. Def [GKR14] (NB: not an LP): rdisc_ε(f, μ) = sup_{κ, δ, {ρ_xy}} (1/δ)(½ - κ - ε) subject to μ(R ∩ f^{-1}(z)) ≤ (½ - κ)·ρ(R) for all R, z such that ρ(R) ≥ δ; ρ(X×Y) = 1; ρ_xy ≥ 0 for all x, y; 0 ≤ κ < ½, 0 < δ < 1. rdisc_ε(f) = max_μ rdisc_ε(f, μ). (The 1/δ factor rescales, and the condition ρ(R) ≥ δ restricts attention to rectangles of non-negligible ρ-size.)

43 Relative discrepancy ≤ relaxed partition. Consider any feasible solution for rdisc; apply a change of variables to obtain a feasible solution to rprt with a higher objective value.

44-45 Relative discrepancy ≤ relaxed partition: the two programs side by side. rdisc_ε(f) = max_μ sup_{κ, δ, {ρ_xy}} (1/δ)(½ - κ - ε) s.t. μ(R ∩ f^{-1}(z)) ≤ (½ - κ)·ρ(R) for all R, z with ρ(R) ≥ δ; ρ(X×Y) = 1; ρ_xy ≥ 0; 0 ≤ κ < ½, 0 < δ < 1. rprt_ε(f) = max_{{α_xy}, {β_xy}} β(X×Y) - ε·α(X×Y) s.t. β(R) - α(R ∩ f^{-1}(z)) ≤ 1 for all R, z; α_xy ≥ β_xy ≥ 0. [Diagram: arrows indicating the change of variables, relating (α, β) on the rprt side to (μ, ρ) on the rdisc side.]

46 Part I: Non-distributional case. [Diagram: CC is lower-bounded by prt; IC by rprt; rprt by rdisc.] Can relative discrepancy separate IC from CC irrespective of the distribution on the inputs? NO.

47-50 Part II: Fixed μ. prt_ε(f; μ) = max_{α, {β_xy}} β(X×Y) - ε·α·μ(X×Y) s.t. β(R) - α·μ(R ∩ f^{-1}(z)) ≤ 1 for all R, z; α ≥ 0. [Diagram: change of variables relating this fixed-μ program to the non-distributional one.] Distributional case: exponential separation. Non-distributional case: collapse.

51 Lower bounds with partitions. [Diagram: lattice of partition-based bounds (pprt, prt, rprt, ardisc, rdisc) under CC and IC.] pprt [JLV14]: a quadratically tight bound for CC. ardisc [GKR14]: a strong new bound. Remark: both use partitions and not rectangles.

52 ardisc vs pprt. [Diagram: full lattice of bounds: CC, IC, prt, rprt, rk + srect, rec, sdisc, γ2, disc.]

53-54 Information vs. communication complexity. How close are IC and CC? Can I compress an interaction? NEW Conjecture 1: for any ε, μ, f, R_{μ,ε}(f) = O(IC_{μ,ε}(f)) · log n. Conjecture 2: for any ε, f, R_ε(f) = O(IC_ε(f)). [GKR15]: ∃ f with IC(f, ε) ≪ CC(f, ε); again IC = log log n vs. CC = log n, now without fixing a distribution. New Conjecture 2: for any ε, f, R_ε(f) = O(IC_ε(f)) · log n.

55 Summary. LP formulation of distributional relative discrepancy: the partition bound plus a nonnegativity constraint. Non-distributional relative discrepancy is at most IC. The partition bound is not a lower bound on IC for fixed distributions. ardisc is quadratically tight for communication complexity.

56 Open problems. n vs. log(n) separation of distributional IC and CC. Give separations for wprt/pprt and prt/rprt. Is the partition bound a lower bound on IC? (This requires new compression techniques to obtain constant-efficiency ZC protocols.) Is it tight for CC? Recover a communication protocol from a ZCP.

57 Thank you

58 Compressing CC to zero communication. x ∈ {0,1}^n, y ∈ {0,1}^n; r_pub (= a guessed transcript t). Theorem: if π is (μ,ε)-good for f using communication CC(π), then one can build a ZCP τ that is (μ,ε)-good for f with η(τ) ≥ 2^{-CC(π)}. Proof: τ uses r_pub to guess a random transcript t. Alice (resp. Bob) checks whether t is consistent with x (resp. y). If so, output what π outputs on t; otherwise output Abort. Pr[not abort] ≥ 2^{-CC(π)}, hence CC_{μ,ε}(f) ≥ Ω(log 1/η_{μ,ε}(f)).
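A minimal concrete instance of this proof (an illustrative toy, not from the slides): the 2-bit protocol for AND of single-bit inputs, in which Alice sends x and Bob sends y. Guessing the transcript from public coins gives non-abort probability exactly 2^{-2} = 2^{-CC(π)}.

```python
import random

def zcp_for_and(x, y, rng):
    """Zero-communication version of the 2-bit AND protocol.

    The real protocol: Alice sends x, Bob sends y, both output x AND y.
    ZCP: shared randomness guesses the transcript t = (t0, t1); Alice
    aborts unless t0 == x, Bob aborts unless t1 == y. The guess matches
    the true transcript with probability 2^-2 = 2^-CC.
    """
    t0, t1 = rng.randint(0, 1), rng.randint(0, 1)   # public coins
    if t0 != x:
        return None                                  # Alice aborts
    if t1 != y:
        return None                                  # Bob aborts
    return t0 & t1                                   # both output AND

rng = random.Random(0)
runs = [zcp_for_and(1, 1, rng) for _ in range(10000)]
ok = [r for r in runs if r is not None]
print(len(ok) / len(runs), all(r == 1 for r in ok))  # ~0.25, True
```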

59 Compressing IC to zero communication. x ∈ {0,1}^n, y ∈ {0,1}^n; r_pub = (t_1, t_2, t_3, ..., t_K), hash h. Theorem: for all ε, μ, f, it holds that IC_{μ,ε}(f) ≥ Ω(log 1/η_{μ,ε}(f)).

60 Public coin partition bound. The maximum efficiency of a zero-communication protocol of the following form: players pick a partition at random, play following a uniformly distributed rectangle in the partition, and abort if either input is not in the rectangle.

61 Adaptive relative discrepancy. Def [GKR14] (NB: not an LP): ardisc_ε(f, μ) = sup_{κ, δ, {ρ^P_xy}} (1/δ)(½ - κ - ε) subject to μ(R ∩ f^{-1}(z)) ≤ (½ - κ)·ρ^P(R) for all partitions P and all (R, z) ∈ P such that ρ^P(R) ≥ δ; ρ^P(X×Y) = 1; ρ^P_xy ≥ 0 for all x, y; 0 ≤ κ < ½, 0 < δ < 1. ardisc_ε(f) = max_μ ardisc_ε(f, μ).
