Interactive Communication for Data Exchange
Transcription
1 Interactive Communication for Data Exchange Himanshu Tyagi Indian Institute of Science, Bangalore Joint work with Pramod Viswanath and Shun Watanabe
2 The Data Exchange Problem [ElGamal-Orlitsky 84], [Csiszár-Narayan 04]. Two parties observe correlated data $X$ and $Y$, respectively, and communicate interactively to form estimates $\hat{Y}$ (at the $X$-side) and $\hat{X}$ (at the $Y$-side). A protocol $\pi$ constitutes an $\epsilon$-data exchange ($\epsilon$-DE) protocol if
$\Pr\bigl(\hat{X} = X,\ \hat{Y} = Y\bigr) \geq 1 - \epsilon$.
What is the minimum length $L_\epsilon(X, Y)$ of an $\epsilon$-DE protocol?
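As a concrete reading of the definition, here is a minimal sketch (mine, not from the talk) that encodes a toy joint pmf and checks the $\epsilon$-DE success criterion; the source, the parameter p, and the trivial protocol in which each party sends its observation in full are illustrative choices only.

```python
# Minimal sketch of the epsilon-data-exchange success criterion on a toy joint pmf.
# Estimates are written as functions of both observations only for bookkeeping; in an
# actual protocol each estimate may depend only on the local observation and transcript.
import itertools

p = 0.1  # toy "binary symmetric" correlation: X uniform, Y differs from X w.p. p
P_XY = {(x, y): 0.5 * (1 - p) if x == y else 0.5 * p
        for x, y in itertools.product([0, 1], repeat=2)}

def success_prob(xhat, yhat):
    """Pr(Xhat = X, Yhat = Y) for estimate maps xhat(x, y) and yhat(x, y)."""
    return sum(q for (x, y), q in P_XY.items()
               if xhat(x, y) == x and yhat(x, y) == y)

# Trivial protocol: both observations are sent in full, so both estimates are exact
# and the criterion holds even with eps = 0, at a cost of 2 bits of communication.
print(success_prob(lambda x, y: x, lambda x, y: y) >= 1 - 0.0)   # True
```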
3-4 The Slepian-Wolf Problem. Only $X$ needs to be sent to an observer of $Y$. [Slepian-Wolf 73] Optimal rate for the case of IID observations: $R_\epsilon = H(X|Y)$ for every $0 < \epsilon < 1$. [Miyake-Kanaya 95] Single-shot bounds:
$L_\epsilon(X|Y) \geq \lambda + \log\bigl[1 - \epsilon - \Pr\bigl(h(X|Y) \geq \lambda\bigr)\bigr]$ (lower bound),
$L_\epsilon(X|Y) \leq \lambda - \log\bigl[\epsilon - \Pr\bigl(h(X|Y) \geq \lambda\bigr)\bigr]$ (upper bound).
[Figure: spectrum of $h(X|Y) = -\log P_{X|Y}(X|Y)$.]
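To make the single-shot bounds concrete, the following numerical sketch (my own; the binary source and all parameters are illustrative) evaluates the tail $\Pr(h(X^n|Y^n) \geq \lambda)$ for $n$ IID copies of the toy source above and plugs it into the two bounds.

```python
# Evaluate the single-shot Slepian-Wolf bounds for n IID copies of a binary source
# pair with P(X != Y) = p.  For this source P_{X|Y}(x|y) is 1-p or p, so
# h(X^n|Y^n) = K*(-log2 p) + (n-K)*(-log2(1-p)) with K ~ Binomial(n, p).
from math import comb, log2

p, n, eps = 0.1, 20, 0.05

def tail(lam):
    """Pr( h(X^n|Y^n) >= lam )."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1)
               if k * (-log2(p)) + (n - k) * (-log2(1 - p)) >= lam)

for lam in (0.3 * n, 0.6 * n, 0.9 * n):
    t = tail(lam)
    lower = lam + log2(1 - eps - t) if 1 - eps - t > 0 else None   # lower bound on L_eps(X|Y)
    upper = lam - log2(eps - t) if eps - t > 0 else None           # upper bound on L_eps(X|Y)
    print(f"lambda = {lam:5.1f}   Pr(h >= lambda) = {t:.3e}   lower = {lower}   upper = {upper}")
```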
5-7 Can Interaction Help? Asymptotically, interaction does not help in the SW problem. Instances where interaction is known to help:
[Orlitsky 90] Single-shot, worst-case length: one round of interaction is almost optimal without error.
[Feder-Shulman 02] Universal version, adaptive rate: an interactive protocol accomplishes this task.
[Yang-He 10] Single-shot, average length: an interactive protocol attains roughly $H(X|Y)$.
Can interaction help in the data exchange problem?
8-10 Using the Slepian-Wolf Scheme for Data Exchange. The following rate is achievable for the IID case:
$H(X \triangle Y) \stackrel{\mathrm{def}}{=} H(X|Y) + H(Y|X)$.
[Csiszár-Narayan 04] This rate is the least possible. The proof relies on a property of interactive communication:
$H(\Pi) \geq H(\Pi \mid X, U) + H(\Pi \mid Y, V)$.
Implication: noninteractive communication can attain the optimal rate.
Is interaction of any use for data exchange?
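For the same toy binary source used in the snippets above, the benchmark rate $H(X|Y) + H(Y|X)$ is easy to evaluate (a quick check of mine, not from the slides); by symmetry both conditional entropies equal the binary entropy $h_b(p)$.

```python
# The Csiszar-Narayan benchmark rate H(X|Y) + H(Y|X) for the binary toy source
# with P(X != Y) = p: by symmetry it equals 2 * h_b(p).
from math import log2

def h_b(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

p = 0.1
print(2 * h_b(p))   # H(X|Y) + H(Y|X) ~ 0.94 bits per symbol pair
```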
11-12 Main Result: Bounds on $L_\epsilon(X, Y)$. We show that interaction is indeed helpful. We characterize the minimum length of interactive communication needed, thereby characterizing the gain due to interaction. Define the sum conditional entropy
$h(X \triangle Y) \stackrel{\mathrm{def}}{=} h(X|Y) + h(Y|X)$.
Theorem (Single-shot). For every $0 < \epsilon < 1$ and every $\lambda > 0$,
$L_\epsilon(X, Y) \leq \lambda - \log\bigl[\epsilon - \Pr\bigl(h(X \triangle Y) \geq \lambda\bigr)\bigr]$,
$L_\epsilon(X, Y) \geq \lambda + \log\bigl[1 - \epsilon - \Pr\bigl(h(X \triangle Y) \geq \lambda\bigr)\bigr]$.
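The theorem can be exercised numerically. The sketch below (mine; all parameters are illustrative) evaluates the tail of $h(X^n \triangle Y^n)$ for IID copies of the binary toy source and the resulting bounds on $L_\epsilon(X^n, Y^n)$.

```python
# Single-shot bounds on L_eps(X, Y) for n IID copies of the binary toy source.
# For that source h(X|Y) = h(Y|X) symbol by symbol, so
#   h(X^n triangle Y^n) = 2 * [ K*(-log2 p) + (n-K)*(-log2(1-p)) ],  K ~ Binomial(n, p).
from math import comb, log2

p, n, eps = 0.1, 20, 0.05

def tail(lam):
    """Pr( h(X^n triangle Y^n) >= lam )."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1)
               if 2 * (k * (-log2(p)) + (n - k) * (-log2(1 - p))) >= lam)

for lam in (1.0 * n, 1.4 * n, 1.8 * n):
    t = tail(lam)
    lower = lam + log2(1 - eps - t) if 1 - eps - t > 0 else None   # L_eps(X, Y) >= lower
    upper = lam - log2(eps - t) if eps - t > 0 else None           # L_eps(X, Y) <= upper
    print(f"lambda = {lam:5.1f}   tail = {t:.3e}   lower = {lower}   upper = {upper}")
```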
13-14 Corollary 1: Second-Order Asymptotics for IID Sources. Let $(X^n, Y^n) = (X_i, Y_i)_{i=1}^{n}$ be IID realizations of $(X, Y)$.
Theorem. For every $0 < \epsilon < 1$,
$L_\epsilon(X^n, Y^n) = n H(X \triangle Y) + \sqrt{n \operatorname{Var}[h(X \triangle Y)]}\; Q^{-1}(\epsilon) + o(\sqrt{n})$,
where $Q^{-1}$ is the inverse of the Gaussian tail function. This length is strictly smaller than that attained by noninteractive protocols.
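A quick numerical reading of the corollary (my sketch; the source and parameters are illustrative, and scipy's `norm.isf` is used for $Q^{-1}$):

```python
# Second-order approximation L_eps ~ n*H(X triangle Y) + sqrt(n*Var)*Q^{-1}(eps)
# for the binary toy source with P(X != Y) = p.
from math import log2
from scipy.stats import norm

p, n, eps = 0.1, 1000, 0.05

# Per-symbol values of h(X triangle Y) = h(X|Y) + h(Y|X) and their probabilities.
vals = [(-2 * log2(1 - p), 1 - p), (-2 * log2(p), p)]
mean = sum(v * q for v, q in vals)                # H(X triangle Y)
var = sum(q * (v - mean) ** 2 for v, q in vals)   # Var[h(X triangle Y)]

approx = n * mean + (n * var) ** 0.5 * norm.isf(eps)   # Q^{-1}(eps) = norm.isf(eps)
print(f"H(X triangle Y) = {mean:.3f} bits/symbol; first-order n*H = {n * mean:.0f} bits; "
      f"second-order approximation = {approx:.0f} bits")
```

The positive second-order term (for $\epsilon < 1/2$) is the price of a fixed error probability at finite blocklength.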
15-16 Corollary 2: Minimum Rate for General Sources. Let $(\mathbf{X}, \mathbf{Y}) = \{(X^n, Y^n)\}_{n=1}^{\infty}$ be a general source sequence. Define the minimum rate of communication for data exchange as
$R(\mathbf{X}, \mathbf{Y}) \stackrel{\mathrm{def}}{=} \inf_{\{\epsilon_n\}} \limsup_{n \to \infty} \frac{1}{n} L_{\epsilon_n}(X^n, Y^n)$,
where the infimum is over all sequences $\epsilon_n \to 0$.
Theorem. For a general source sequence $(\mathbf{X}, \mathbf{Y})$,
$R(\mathbf{X}, \mathbf{Y}) = \overline{H}(\mathbf{X} \triangle \mathbf{Y})$,
where $\overline{H}(\mathbf{X} \triangle \mathbf{Y})$ denotes the limsup in probability of $\frac{1}{n} h(X^n \triangle Y^n)$. This rate is strictly smaller than that attained by noninteractive protocols.
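For completeness, the limsup in probability used here is the standard information-spectrum quantity; written out (my addition, following the usual definition):

```latex
% Spectral sup-entropy rate for the sum conditional entropy density.
\[
  \overline{H}(\mathbf{X} \triangle \mathbf{Y})
  \;=\; \inf\Bigl\{\alpha :
        \lim_{n \to \infty} \Pr\Bigl(\tfrac{1}{n}\, h(X^n \triangle Y^n) > \alpha\Bigr) = 0
        \Bigr\}.
\]
```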
17 Proof Sketch for the Converse
18 Digression: Secret Key Agreement. Two parties observing $X$ and $Y$ communicate over a public channel, producing a transcript $F$, and output keys $K_x$ and $K_y$ with values in a set $\mathcal{K}$. The pair constitutes an $\epsilon$-secret key of length $\log|\mathcal{K}|$ if
$\frac{1}{2}\,\bigl\| P_{K_x K_y F} - P^{(2)}_{\mathrm{unif}} \times P_F \bigr\|_{1} \leq \epsilon$,
where $P^{(2)}_{\mathrm{unif}}(k_x, k_y) = \frac{1}{|\mathcal{K}|}\, \mathbb{1}(k_x = k_y)$. The maximum length of an $\epsilon$-SK is denoted by $S_\epsilon(X, Y)$.
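The secrecy criterion is half the variational distance to an ideal key: uniform over $\mathcal{K}$, identical at both parties, and independent of the transcript. A small sketch of mine that evaluates it on a toy example (alphabets and distributions are illustrative):

```python
# The epsilon-secret-key criterion as half the L1 distance between P_{Kx Ky F}
# and P_unif^(2) x P_F, evaluated for a perfect one-bit key independent of F.
import itertools

K = [0, 1]                     # key alphabet, |K| = 2
F = ["f0", "f1"]               # toy public-transcript alphabet
P_F = {"f0": 0.5, "f1": 0.5}

# Toy joint distribution: Kx = Ky = a uniform bit, independent of F.
P = {(kx, ky, f): (0.5 if kx == ky else 0.0) * P_F[f]
     for kx, ky, f in itertools.product(K, K, F)}

def sk_distance(P):
    """(1/2) * sum |P(kx,ky,f) - P_unif^(2)(kx,ky) * P_F(f)| with P_F the F-marginal of P."""
    P_F_marg = {f: sum(P[(kx, ky, f)] for kx in K for ky in K) for f in F}
    ideal = {(kx, ky, f): (1 / len(K)) * (1.0 if kx == ky else 0.0) * P_F_marg[f]
             for kx, ky, f in itertools.product(K, K, F)}
    return 0.5 * sum(abs(P[z] - ideal[z]) for z in P)

print(sk_distance(P))   # 0.0: a perfect (eps = 0) secret key of length log|K| = 1 bit
```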
19 Main Heuristic. Parties with correlated observations share more bits than what they communicate. The extra bits shared can be extracted as a secret key. Thus, if the parties share $R_{\mathrm{shared}}$ bits and communicate $R$ bits,
$R_{\mathrm{shared}} - R \leq S(X, Y)$, equivalently $R_{\mathrm{shared}} - S(X, Y) \leq R$.
[Csiszár-Narayan 04] first formalized this duality to obtain the SK capacity.
[T-Narayan-Gupta 10, T 12] used it to characterize secure computability.
20-21 Warm-up: Optimal Rate for Data Exchange. The Csiszár-Narayan approach flipped around: consider a rate $R$ protocol for data exchange. Both parties share roughly $nH(XY)$ bits at the end. Using an extractor lemma we can generate a SK of rate $H(XY) - R$, which must be less than the SK capacity $I(X \wedge Y)$. Thus,
$R \geq H(XY) - I(X \wedge Y) = H(X|Y) + H(Y|X)$.
We seek to extend this argument to a single-shot setup.
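The final identity is a chain-rule computation; writing it out (my addition):

```latex
% Chain-rule check of the identity H(XY) - I(X ^ Y) = H(X|Y) + H(Y|X).
\begin{align*}
  H(XY) - I(X \wedge Y)
    &= H(XY) - \bigl[ H(X) + H(Y) - H(XY) \bigr] \\
    &= \bigl[ H(XY) - H(Y) \bigr] + \bigl[ H(XY) - H(X) \bigr] \\
    &= H(X \mid Y) + H(Y \mid X).
\end{align*}
```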
22 Upper Bound for Secret Key Length [T-Watanabe 14].
Theorem. For every $0 < \epsilon, \eta < 1$ with $\eta < 1 - \epsilon$,
$S_\epsilon(X, Y) \leq \lambda - \log\bigl(P_\lambda - \epsilon - \eta\bigr) + 2 \log(1/\eta)$, $\quad \lambda > 0$,
where
$P_\lambda = P_{XY}\Bigl(\Bigl\{(x, y) : \log \tfrac{P_{XY}(x, y)}{Q_X(x)\, Q_Y(y)} < \lambda\Bigr\}\Bigr)$.
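The bound is straightforward to evaluate once $Q_X$ and $Q_Y$ are fixed. A sketch of mine for the binary toy source, taking $Q_X$ and $Q_Y$ to be the true (uniform) marginals, an illustrative choice:

```python
# Evaluate the secret-key converse bound S_eps <= lambda - log(P_lambda - eps - eta)
# + 2*log(1/eta) for the binary toy source, with Q_X, Q_Y set to the true marginals.
from math import log2
import itertools

p, eps, eta = 0.1, 0.05, 0.01
P_XY = {(x, y): 0.5 * (1 - p) if x == y else 0.5 * p
        for x, y in itertools.product([0, 1], repeat=2)}
Q_X = {0: 0.5, 1: 0.5}
Q_Y = {0: 0.5, 1: 0.5}

def P_lam(lam):
    """P_XY of the set where log2 P_XY(x,y) / (Q_X(x) Q_Y(y)) < lam."""
    return sum(q for (x, y), q in P_XY.items()
               if log2(q / (Q_X[x] * Q_Y[y])) < lam)

for lam in (0.5, 0.8, 1.0):
    pl = P_lam(lam)
    bound = (lam - log2(pl - eps - eta) + 2 * log2(1 / eta)) if pl > eps + eta else None
    print(f"lambda = {lam}: P_lambda = {pl:.3f}, S_eps <= {bound}")
```

For a single observation pair the additive $2\log(1/\eta)$ slack dominates; the bound becomes informative over blocks, where it yields the spectrum-based converse used next.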
23-25 Converse for Almost Uniform Sources. Consider a data exchange protocol of length $l$. [Figure: spectrum of $h(XY)$, supported between $H_{\min}(XY)$ and $H_{\max}(XY)$.]
Using the Leftover Hash Lemma, we can extract a SK of length $H_{\min}(XY) - l$. Using the upper bound for $S_\epsilon(X, Y)$,
$H_{\min}(XY) - l \leq \lambda - \log\bigl(P_{XY}\bigl(i(X \wedge Y) < \lambda\bigr) - \epsilon\bigr)$
$= \lambda - \log\bigl(P_{XY}\bigl(h(XY) - h(X \triangle Y) < \lambda\bigr) - \epsilon\bigr)$
$\leq H_{\max}(XY) - \gamma - \log\bigl(P_{XY}\bigl(h(X \triangle Y) > \gamma\bigr) - \epsilon\bigr)$.
Thus,
$l \geq \gamma - \bigl[H_{\max}(XY) - H_{\min}(XY)\bigr] + \log\bigl(P_{XY}\bigl(h(X \triangle Y) > \gamma\bigr) - \epsilon\bigr)$,
which gives the converse bound if $H_{\max}(XY) \approx H_{\min}(XY)$.
26 General Converse via Spectrum Slicing. Slice the spectrum into $N$ slices of width $\Delta$ each. [Figure: spectrum of $h(XY)$ between $H_{\min}(XY)$ and $H_{\max}(XY)$, with one slice $E_j$ marked.] There exists a slice $E_j$ with $P_{XY}(E_j) \geq N^{-2}$, and so $P_{XY} \leq P_{XY|E_j} \leq N^{2}\, P_{XY}$ on $E_j$. The proof is completed by applying the previous bound to $P_{XY|E_j}$.
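The pigeonhole step behind slicing can be seen directly: divide the range of $h(XY)$ into $N$ equal-width slices and pick the heaviest one; its probability is at least $1/N$, and conditioned on it the spectrum has width one slice. The sketch below (mine) illustrates only this idea on a toy pmf; the constants in the actual proof differ.

```python
# Pigeonhole behind spectrum slicing: slice the range of h(XY) = -log2 P_XY into N
# equal-width slices and pick the heaviest slice; its probability is >= 1/N, and the
# conditional distribution P_{XY|E_j} is "almost uniform" (spectrum width = one slice).
from math import log2
import itertools

p, N = 0.1, 4
P_XY = {(x, y): 0.5 * (1 - p) if x == y else 0.5 * p
        for x, y in itertools.product([0, 1], repeat=2)}

h = {xy: -log2(q) for xy, q in P_XY.items()}
h_min, h_max = min(h.values()), max(h.values())
width = (h_max - h_min) / N or 1.0          # guard against a degenerate (flat) spectrum

def slice_index(xy):
    return min(int((h[xy] - h_min) / width), N - 1)

slices = {j: sum(q for xy, q in P_XY.items() if slice_index(xy) == j) for j in range(N)}
j_star = max(slices, key=slices.get)
print(slices)
print("heaviest slice:", j_star, "probability:", slices[j_star], ">= 1/N:", slices[j_star] >= 1 / N)
```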
27 Our Achievability Scheme
28 Rough Sketch of Our Scheme. [Figure: spectrum of $h(X|Y)$ starting at $H_{\min}(X|Y)$, sliced into sets $T_j$, with hash functions $h_i$ associated with the slices.]
$h_1$: random binning of $X$ into roughly $2^{H^{\xi}_{\min}(X|Y)}$ values; $h_i$, $2 \leq i \leq N$: further random binnings of $X$.
First party sends bin indices $\Pi_i = h_i(x)$ successively until it receives an ACK or $i = N$.
Second party sends an ACK when it finds an $\hat{x}$ s.t. $(\hat{x}, y) \in T_i$ and $h_j(\hat{x}) = \Pi_j$ for all $1 \leq j \leq i$.
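A toy simulation (my own, heavily simplified) of the ACK-based successive binning just described. It covers only the $X$-to-$Y$ direction of the exchange; the source, the thresholds defining the sets $T_i$, the per-round bin size, and the uniqueness-based ACK rule are illustrative choices, not the parameters from the talk or the paper.

```python
# Successive random binning with ACK feedback: party 1 reveals hash bins of x in
# increments; party 2 accepts as soon as a unique candidate is consistent with y
# (lies in T_i = {x' : h(x'|y) < lambda_i}) and with every bin index sent so far.
import random
from math import log2

rng_src = random.Random(0)
M, p = 64, 0.1                           # |X| = 64; Y = X w.p. 1 - p, else uniform over X
BITS = 3                                 # hash bits revealed by party 1 per round
THRESHOLDS = [1.0, 4.0, 7.0, 10.0]       # lambda_1 < ... < lambda_N (illustrative values)

def sample_pair():
    x = rng_src.randrange(M)
    y = x if rng_src.random() < 1 - p else rng_src.randrange(M)
    return x, y

def h_cond(x, y):
    """h(x|y) = -log2 P_{X|Y}(x|y) for this toy source (it takes only two values)."""
    return -log2((1 - p) + p / M) if x == y else -log2(p / M)

def make_hashes(seed, rounds):
    """Public randomness: both parties derive the same random bin assignments."""
    rng = random.Random(seed)
    return [{x: rng.randrange(2 ** BITS) for x in range(M)} for _ in range(rounds)]

def run_protocol(x, y, hashes):
    sent = []
    for i, lam in enumerate(THRESHOLDS):
        sent.append(hashes[i][x])                        # party 1 -> party 2
        cands = [xp for xp in range(M)
                 if h_cond(xp, y) < lam
                 and all(hashes[j][xp] == sent[j] for j in range(i + 1))]
        if len(cands) == 1:
            return cands[0], (i + 1) * BITS              # ACK (1-bit ACKs not counted)
    return None, len(THRESHOLDS) * BITS                  # ran out of rounds: error

errors, total_bits, trials = 0, 0, 2000
for t in range(trials):
    x, y = sample_pair()
    xhat, bits = run_protocol(x, y, make_hashes(seed=t, rounds=len(THRESHOLDS)))
    errors += (xhat != x)
    total_bits += bits
print(f"error rate ~ {errors / trials:.3f}; avg bits from party 1 ~ {total_bits / trials:.2f} "
      f"(sending x outright would take {log2(M):.0f} bits)")
```

The point of the adaptivity is visible even in this toy: when $x$ is easy to guess from $y$ the exchange stops after the first increment, so the average number of bits tracks the realized value of $h(x|y)$ rather than its worst case.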
29 In Closing... [Figure: spectrum of $h(X \triangle Y)$ with its $\epsilon$-tail marked.] The minimum length of communication for $\epsilon$-data exchange is roughly the $\epsilon$-tail $\lambda_\epsilon$ of $h(X \triangle Y)$. Interaction is necessary to attain this rate.