
SHARED INFORMATION

Prakash Narayan
with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe

Acknowledgement: Praneeth Boda, Himanshu Tyagi, Shun Watanabe

Outline

- Two-terminal model: mutual information, and its operational meaning in channel coding (channel capacity), lossy source coding (rate distortion function), and binary hypothesis testing (Stein's lemma)
- Interactive communication and common randomness
- Multiterminal model: shared information
- Applications

First: the two-terminal model, and the operational meaning of mutual information in channel coding, lossy source coding, and binary hypothesis testing.

Mutual Information

Mutual information is a measure of the mutual dependence between two rvs. Let $X_1$ and $X_2$ be $\mathbb{R}$-valued rvs with joint probability distribution $P_{X_1 X_2}$. The mutual information between $X_1$ and $X_2$ is
$$
I(X_1 \wedge X_2) =
\begin{cases}
\mathbb{E}_{P_{X_1 X_2}}\left[ \log \dfrac{dP_{X_1 X_2}}{d(P_{X_1} \times P_{X_2})}(X_1, X_2) \right], & \text{if } P_{X_1 X_2} \ll P_{X_1} \times P_{X_2}, \\
\infty, & \text{if } P_{X_1 X_2} \not\ll P_{X_1} \times P_{X_2},
\end{cases}
$$
which is the Kullback-Leibler divergence $D(P_{X_1 X_2} \,\|\, P_{X_1} P_{X_2})$. When $X_1$ and $X_2$ are finite-valued,
$$
I(X_1 \wedge X_2) = H(X_1) + H(X_2) - H(X_1, X_2) = H(X_1) - H(X_1 \mid X_2) = H(X_2) - H(X_2 \mid X_1) = H(X_1, X_2) - \big[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \big].
$$
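As a quick numerical companion (a sketch added here, not part of the talk; the joint pmf below is an arbitrary illustrative choice), the finite-alphabet identity above can be checked in a few lines of Python:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Illustrative joint pmf of (X1, X2); rows index X1, columns index X2.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

H1  = entropy(P.sum(axis=1))          # H(X1)
H2  = entropy(P.sum(axis=0))          # H(X2)
H12 = entropy(P.ravel())              # H(X1, X2)

I = H1 + H2 - H12                     # I(X1 ^ X2) = D(P_{X1X2} || P_{X1} P_{X2})
print(f"I(X1 ^ X2) = {I:.4f} bits")   # about 0.2781 bits
```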

Channel Coding

Let $\mathcal{X}_1$ and $\mathcal{X}_2$ be finite alphabets, and let $W : \mathcal{X}_1 \to \mathcal{X}_2$ be a stochastic matrix. An encoder $f$ maps a message $m \in \{1, \ldots, M\}$ to a codeword $f(m) = (x_{11}, \ldots, x_{1n})$, which is sent over the DMC; a decoder $\phi$ maps the channel output $(x_{21}, \ldots, x_{2n})$ to an estimate $\hat{m}$. Discrete memoryless channel (DMC):
$$
W^n(x_{21}, \ldots, x_{2n} \mid x_{11}, \ldots, x_{1n}) = \prod_{i=1}^{n} W(x_{2i} \mid x_{1i}).
$$

Channel Capacity

Goal: make the code rate $\frac{1}{n} \log M$ as large as possible while keeping the maximal error probability $\max_{m} P\big( \phi(X_{21}, \ldots, X_{2n}) \ne m \mid f(m) \text{ sent} \big)$ small, in the asymptotic sense as $n \to \infty$. [C. E. Shannon, 1948] Channel capacity:
$$
C = \max_{P_{X_1}:\ P_{X_2 \mid X_1} = W} I(X_1 \wedge X_2).
$$
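For a concrete instance (an added illustration, not from the slides): for the binary symmetric channel with crossover probability $p$, the maximum is attained by the uniform input, giving $C = 1 - h(p)$ with $h$ the binary entropy:

```python
import numpy as np

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Capacity of BSC(p): the capacity-achieving input is uniform, so C = 1 - h(p).
for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:4.2f}  ->  C = {1 - h(p):.4f} bits/channel use")
```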

Lossy Source Coding

Let $\{X_{1t}\}_{t=1}^{\infty}$ be an $\mathcal{X}_1$-valued i.i.d. source. An encoder $f$ maps $(x_{11}, \ldots, x_{1n})$ to an index $j = f(x_{11}, \ldots, x_{1n}) \in \{1, \ldots, J\}$; a decoder $\phi$ outputs the reproduction $(x_{21}, \ldots, x_{2n}) = \phi(j)$. Distortion measure:
$$
d(x_{11}, \ldots, x_{1n};\ x_{21}, \ldots, x_{2n}) = \frac{1}{n} \sum_{i=1}^{n} d(x_{1i}, x_{2i}).
$$

Rate Distortion Function

Goal: make the compression code rate $\frac{1}{n} \log J$ as small as possible while keeping the probability of excess distortion $P\big( \frac{1}{n} \sum_{i=1}^{n} d(X_{1i}, X_{2i}) > \Delta \big)$ small, in the asymptotic sense as $n \to \infty$. [Shannon, 1948, 1959] Rate distortion function:
$$
R(\Delta) = \min_{P_{X_2 \mid X_1}:\ \mathbb{E}[d(X_1, X_2)] \le \Delta} I(X_1 \wedge X_2).
$$
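A standard worked instance (again an added illustration): for a Bernoulli($q$) source under Hamming distortion, the minimization yields $R(\Delta) = h(q) - h(\Delta)$ for $0 \le \Delta < \min(q, 1-q)$, and $0$ beyond:

```python
import numpy as np

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Bernoulli(q) source, Hamming distortion: R(D) = h(q) - h(D), clipped at 0.
q = 0.5
for D in (0.0, 0.05, 0.10, 0.25):
    print(f"D = {D:4.2f}  ->  R(D) = {max(h(q) - h(D), 0.0):.4f} bits/symbol")
```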

Simple Binary Hypothesis Testing

Let $\{(X_{1t}, X_{2t})\}_{t=1}^{\infty}$ be an $\mathcal{X}_1 \times \mathcal{X}_2$-valued i.i.d. process generated according to $H_0: P_{X_1 X_2}$ or $H_1: P_{X_1} P_{X_2}$. A (randomized) test decides $H_0$ w.p. $T_0(x_{11}, \ldots, x_{1n}, x_{21}, \ldots, x_{2n})$ and $H_1$ w.p. $T_1(x_{11}, \ldots, x_{1n}, x_{21}, \ldots, x_{2n}) = 1 - T_0$. Stein's lemma [H. Chernoff, 1956]: for every $0 < \epsilon < 1$,
$$
\lim_{n \to \infty} -\frac{1}{n} \log \inf_{T:\ P_{H_0}(T \text{ says } H_0) \ge 1 - \epsilon} P_{H_1}(T \text{ says } H_0) = D(P_{X_1 X_2} \,\|\, P_{X_1} P_{X_2}) = I(X_1 \wedge X_2).
$$
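A rough Monte Carlo illustration (my own sketch, with an arbitrarily chosen joint pmf and a crude acceptance region): accepting $H_0$ when the normalized log-likelihood ratio is near its $H_0$-mean keeps the first-kind error small by the law of large numbers, while the second-kind error probability decays with exponent close to $D(P_{X_1 X_2} \,\|\, P_{X_1} P_{X_2})$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint pmf under H0; under H1 the pair is drawn from the product of marginals.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
Q = np.outer(P.sum(axis=1), P.sum(axis=0))    # P_{X1} x P_{X2}
llr = np.log(P / Q).ravel()                   # per-sample log-likelihood ratio (nats)
D = float(np.sum(P.ravel() * llr))            # D(P || Q) in nats

trials, delta = 100_000, 0.05
for n in (10, 20, 30):
    # Draw n i.i.d. pairs per trial under H1; accept H0 on the H0-typical set.
    idx = rng.choice(4, size=(trials, n), p=Q.ravel())
    S = llr[idx].sum(axis=1)
    beta = np.mean(S >= n * (D - delta))
    print(f"n = {n:2d}: -(1/n) log beta = {-np.log(beta) / n:.3f}   (D = {D:.3f} nats)")
```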

Next: interactive communication and common randomness, leading from the two-terminal model to the multiterminal model and shared information.

Multiterminal Model

[Figure: terminals observing $X_1, X_2, \ldots, X_m$, all attached to a communication network carrying the communication $F$.]

Set of terminals $\mathcal{M} = \{1, \ldots, m\}$. $X_1, \ldots, X_m$ are finite-valued rvs with known joint distribution $P_{X_1 \cdots X_m}$ on $\mathcal{X}_1 \times \cdots \times \mathcal{X}_m$. Terminal $i \in \mathcal{M}$ observes data $X_i$. Multiple rounds of interactive communication take place over a noiseless channel of unlimited capacity; all terminals hear all communication.

Interactive Communication

Assume: communication occurs in consecutive time slots, in $r$ rounds, and is described by the mappings $f_{11}, \ldots, f_{1m},\ f_{21}, \ldots, f_{2m},\ \ldots,\ f_{r1}, \ldots, f_{rm}$, where $f_{ji}$ is the message in round $j$ from terminal $i$, $1 \le j \le r$, $1 \le i \le m$; $f_{ji}$ is any function of $X_i$ and of all previous communication. The corresponding rvs representing the communication are
$$
F = F(X_1, \ldots, X_m) = (F_{11}, \ldots, F_{1m}, F_{21}, \ldots, F_{2m}, \ldots, F_{r1}, \ldots, F_{rm}),
$$
with $F_{11} = f_{11}(X_1)$, $F_{12} = f_{12}(X_2, F_{11})$, ..., $F_{ji} = f_{ji}(X_i;\ \text{all previous communication})$. Simple communication: $F = (F_1, \ldots, F_m)$ with $F_i = f_i(X_i)$, $1 \le i \le m$.

Applications

- Data exchange: omniscience.
- Signal recovery: data compression.
- Function computation.
- Cryptography: secret key generation.
- Applications in control?

Example: Function Computation [S. Watanabe]

$X_1 = (X_{11}, X_{12})$ and $X_2 = (X_{21}, X_{22})$, where $X_{11}, X_{12}, X_{21}, X_{22}$ are mutually independent Bernoulli(0.5) bits. Terminals 1 and 2 wish to compute
$$
G = g(X_1, X_2) = \mathbb{1}\big( (X_{11}, X_{12}) = (X_{21}, X_{22}) \big).
$$
Simple communication: $F = (F_1, F_2)$ with $F_1 = (X_{11}, X_{12})$, $F_2 = (X_{21}, X_{22})$. Communication complexity: $H(F) = 4$ bits. No privacy: terminal 1 or 2, or an observer of $F$, learns all the data $(X_1, X_2)$.

Interactive communication 1: $F_{11} = X_{11} \oplus X_{12}$, $F_{12} = X_{21} \oplus X_{22}$. If $F_{11} \ne F_{12}$, the protocol is over ($G = 0$). If $F_{11} = F_{12}$, then $F_{21} = X_{11}$, $F_{22} = X_{21}$. Complexity: $H(F) = 3$ bits. Some privacy: w.p. 0.5 both terminals, or an observer of $F$, learn only that $X_1 \ne X_2$; and w.p. 0.5 everyone learns $(X_1, X_2)$.

Interactive communication 2: $F = (F_{11}, F_{12})$ with $F_{11} = (X_{11}, X_{12})$, $F_{12} = G$. Complexity: $H(F) = 2 + h(0.25) \approx 2.81$ bits. Some privacy: terminal 2, or an observer of $F$, learns $X_1$; terminal 1, or an observer of $F$, either learns $X_2$ (w.p. 0.25, when $G = 1$) or learns only that $X_2$ differs from $X_1$ (w.p. 0.75).
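The entropy bookkeeping in these three schemes can be verified by enumerating all 16 equally likely inputs (a sketch I added; the transcript encodings are my own, chosen to match the protocols above):

```python
import numpy as np
from collections import Counter
from itertools import product

def entropy_bits(counts, total):
    p = np.array(list(counts.values())) / total
    return float(-np.sum(p * np.log2(p)))

simple, inter1, inter2 = Counter(), Counter(), Counter()
for x11, x12, x21, x22 in product((0, 1), repeat=4):
    g = int((x11, x12) == (x21, x22))
    simple[(x11, x12, x21, x22)] += 1            # both terminals send everything
    p1, p2 = x11 ^ x12, x21 ^ x22                # round-1 parities
    # Round 2 (sending x11 and x21) happens only when the parities agree.
    inter1[(p1, p2) if p1 != p2 else (p1, p2, x11, x21)] += 1
    inter2[(x11, x12, g)] += 1                   # terminal 1 reveals X1; terminal 2 replies G

for name, c in (("simple", simple), ("interactive 1", inter1), ("interactive 2", inter2)):
    print(f"H(F), {name:13s}: {entropy_bits(c, 16):.2f} bits")
# -> 4.00, 3.00 and 2.81 bits, matching the slides.
```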

Related Work

Exact function computation:
- Yao 79: communication complexity.
- Gallager 88: algorithm for parity computation in a network.
- Giridhar-Kumar 05: algorithms for computing functions over sensor networks.
- Freris-Kowshik-Kumar 10: survey: connectivity, capacity, clocks, computation in large sensor networks.
- Orlitsky-El Gamal 84: communication complexity with secrecy.

Information-theoretic function computation:
- Körner-Marton 79: minimum rate for computing parity.
- Orlitsky-Roche 01: two-terminal function computation.
- Nazer-Gastpar 07: computation over noisy channels.
- Ma-Ishwar 08: distributed source coding for interactive computing.
- Ma-Ishwar-Gupta 09: multiround function computation in colocated networks.
- Tyagi-Gupta-Narayan 11: secure function computation.
- Tyagi-Watanabe 13, 14: secrecy generation, secure computing.

Compressing interactive communication:
- Schulman 92: coding for interactive communication.
- Braverman-Rao 10: information complexity of communication.
- Kol-Raz 13, Haeupler 14: interactive communication over noisy channels.

Common Randomness

For $0 \le \epsilon < 1$, given interactive communication $F$, an rv $L = L(X_1, \ldots, X_m)$ is $\epsilon$-common randomness ($\epsilon$-CR) for the terminals in $\mathcal{M}$ using $F$ if there exist local estimates $L_i = L_i(X_i, F)$, $i \in \mathcal{M}$, of $L$ satisfying
$$
P(L_i = L,\ i \in \mathcal{M}) \ge 1 - \epsilon.
$$

Examples of CR:

- Omniscience: $L = (X_1, \ldots, X_m)$.
- Single signal: $L = X_i$, for some fixed $i \in \mathcal{M}$.
- Function computation: $L = g(X_1, \ldots, X_m)$ for a given $g$.
- Secret CR, i.e., a secret key: $L$ with $I(L \wedge F) = 0$.

A Basic Question

What is the maximal CR, as measured by $H(L \mid F)$, that can be generated by a given interactive communication $F$ for a distributed processing task? The answer comes in two steps: a fundamental property of interactive communication, and an upper bound on the amount of CR achievable with interactive communication. We start with the case of $m = 2$ terminals.

Fundamental Property of Interactive Communication

Lemma [U. Maurer], [R. Ahlswede - I. Csiszár]: For interactive communication $F$ of the terminals $i \in \mathcal{M} = \{1, 2\}$, with terminal $i$ possessing initial data $X_i$,
$$
I(X_1 \wedge X_2 \mid F) \le I(X_1 \wedge X_2).
$$
In particular, independent rvs $X_1, X_2$ remain independent upon conditioning on an interactive communication.

Proof: For interactive communication $F = (F_{11}, F_{12}, \ldots, F_{r1}, F_{r2})$,
$$
I(X_1 \wedge X_2) = I(X_1, F_{11} \wedge X_2) \ge I(X_1 \wedge X_2 \mid F_{11}) = I(X_1 \wedge X_2, F_{12} \mid F_{11}) \ge I(X_1 \wedge X_2 \mid F_{11}, F_{12}),
$$
followed by iteration over the remaining rounds.

An Equivalent Form

For interactive communication $F$ of terminals 1 and 2 (using that $F$ is a function of $(X_1, X_2)$, so $H(F \mid X_1, X_2) = 0$):
$$
I(X_1 \wedge X_2 \mid F) \le I(X_1 \wedge X_2) \iff H(F) \ge H(F \mid X_1) + H(F \mid X_2).
$$
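Continuing the numeric sketch (mine, not the speaker's): for "interactive communication 1" from the function-computation example, one can tabulate the joint law of $(X_1, X_2, F)$ and check $H(F) \ge H(F \mid X_1) + H(F \mid X_2)$; here it holds with equality, consistent with the equality condition for independent $X_1, X_2$ stated in Theorem 1 below:

```python
import numpy as np
from collections import defaultdict
from itertools import product

def H(dist):
    p = np.array([v for v in dist.values() if v > 0])
    return float(-np.sum(p * np.log2(p)))

def transcript(x1, x2):
    (a, b), (c, d) = x1, x2
    p1, p2 = a ^ b, c ^ d                       # round-1 parities
    return (p1, p2) if p1 != p2 else (p1, p2, a, c)

joint = defaultdict(float)                      # law of (X1, X2, F)
for x1 in product((0, 1), repeat=2):
    for x2 in product((0, 1), repeat=2):
        joint[(x1, x2, transcript(x1, x2))] += 1 / 16

def marginal(keep):
    d = defaultdict(float)
    for k, v in joint.items():
        d[tuple(k[i] for i in keep)] += v
    return d

HF   = H(marginal([2]))
HFX1 = H(marginal([0, 2])) - H(marginal([0]))   # H(F | X1) = H(F, X1) - H(X1)
HFX2 = H(marginal([1, 2])) - H(marginal([1]))
print(f"H(F) = {HF:.2f},  H(F|X1) + H(F|X2) = {HFX1 + HFX2:.2f}")
# -> 3.00 and 3.00: equality, since X1 and X2 are independent here.
```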

Upper Bound on CR for Two Terminals

Using (i) that $L$ is $\epsilon$-CR for $\mathcal{M} = \{1, 2\}$ with interactive $F$, and (ii) that $H(F) \ge H(F \mid X_1) + H(F \mid X_2)$, we get
$$
H(L \mid F) \le H(X_1, X_2) - \big[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \big] + 2\nu(\epsilon),
$$
where $\lim_{\epsilon \to 0} \nu(\epsilon) = 0$.

Lemma [I. Csiszár - P. Narayan]: Let $L$ be any $\epsilon$-CR for the terminals $i \in \mathcal{M} = \{1, 2\}$, with terminal $i$ possessing initial data $X_i$, achievable with interactive communication $F$. Then
$$
H(L \mid F) \le I(X_1 \wedge X_2) + 2\nu(\epsilon), \qquad \lim_{\epsilon \to 0} \nu(\epsilon) = 0.
$$
Remark: when $\{(X_{1t}, X_{2t})\}_{t=1}^{\infty}$ is an $\mathcal{X}_1 \times \mathcal{X}_2$-valued i.i.d. process, the upper bound is attained.

Interactive Communication for $m \ge 2$ Terminals

Theorem 1 [I. Csiszár - P. Narayan]: For interactive communication $F$ of the terminals $i \in \mathcal{M} = \{1, \ldots, m\}$, with terminal $i$ possessing initial data $X_i$,
$$
H(F) \ge \sum_{B \in \mathcal{B}} \lambda_B\, H(F \mid X_{B^c})
$$
for every family $\mathcal{B} = \{ B \subsetneq \mathcal{M},\ B \ne \emptyset \}$ and every fractional partition, i.e., set of weights $\lambda = \{ \lambda_B : 0 \le \lambda_B \le 1,\ B \in \mathcal{B} \}$ satisfying $\sum_{B \in \mathcal{B}:\, B \ni i} \lambda_B = 1$ for every $i \in \mathcal{M}$. Equality holds if $X_1, \ldots, X_m$ are mutually independent.

This is a special case of: M. Madiman and P. Tetali, "Information inequalities for joint distributions, with interpretations and applications," IEEE Trans. Inform. Theory, June 2010.
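For concreteness (a worked instance added here, not one of the slides): with $m = 3$, take $\mathcal{B}$ to be the three two-element subsets of $\mathcal{M}$, each with weight $\lambda_B = \tfrac{1}{2}$. Every terminal $i$ belongs to exactly two of these subsets, so the weights sum to 1 at each $i$, and Theorem 1 reads
$$
H(F) \ge \tfrac{1}{2} \big[ H(F \mid X_1) + H(F \mid X_2) + H(F \mid X_3) \big].
$$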

CR for $m \ge 2$ Terminals: A Suggestive Analogy [S. Nitinawarat - P. Narayan]

For interactive communication $F$ of the terminals $i \in \mathcal{M} = \{1, \ldots, m\}$, with terminal $i$ possessing initial data $X_i$:

$m = 2$: $\quad H(F) \ge H(F \mid X_1) + H(F \mid X_2) \iff I(X_1 \wedge X_2 \mid F) \le I(X_1 \wedge X_2)$.

$m \ge 2$:
$$
H(F) \ge \sum_{B \in \mathcal{B}} \lambda_B H(F \mid X_{B^c}) \iff H(X_1, \ldots, X_m \mid F) - \sum_{B \in \mathcal{B}} \lambda_B H(X_B \mid X_{B^c}, F) \le H(X_1, \ldots, X_m) - \sum_{B \in \mathcal{B}} \lambda_B H(X_B \mid X_{B^c}).
$$

Does the right-hand side suggest a measure of mutual dependence among the rvs $X_1, \ldots, X_m$?

CR for $m \ge 2$ Terminals

Theorem 2 [I. Csiszár - P. Narayan]: Given $0 \le \epsilon < 1$, for an $\epsilon$-CR $L$ for $\mathcal{M}$ achieved with interactive communication $F$,
$$
H(L \mid F) \le H(X_1, \ldots, X_m) - \sum_{B \in \mathcal{B}} \lambda_B H(X_B \mid X_{B^c}) + m\nu
$$
for every fractional partition $\lambda$ of $\mathcal{M}$, with $\nu = \nu(\epsilon) = \epsilon \log |\mathcal{L}| + h(\epsilon)$.

Remarks: the proof of Theorem 2 relies on Theorem 1. When $\{(X_{1t}, \ldots, X_{mt})\}_{t=1}^{\infty}$ is an i.i.d. process, the upper bound is attained.

Shared Information

Optimizing Theorem 2 over fractional partitions:
$$
H(L \mid F) \le H(X_1, \ldots, X_m) - \max_{\lambda} \sum_{B \in \mathcal{B}} \lambda_B H(X_B \mid X_{B^c}) + m\nu = SI(X_1, \ldots, X_m) + m\nu.
$$

Extensions

Theorems 1 and 2 extend to: random variables with densities [S. Nitinawarat - P. Narayan]; a larger class of probability measures [H. Tyagi - P. Narayan].

Shared Information and Kullback-Leibler Divergence [I. Csiszár - P. Narayan], [C. Chan - L. Zheng]

$$
SI(X_1, \ldots, X_m) = H(X_1, \ldots, X_m) - \max_{\lambda} \sum_{B \in \mathcal{B}} \lambda_B H(X_B \mid X_{B^c}).
$$

For $m = 2$:
$$
SI(X_1, X_2) = H(X_1, X_2) - \big[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \big] = I(X_1 \wedge X_2) = D(P_{X_1 X_2} \,\|\, P_{X_1} P_{X_2}).
$$

For $m \ge 2$:
$$
SI(X_1, \ldots, X_m) = \min_{2 \le k \le m}\ \min_{\mathcal{A}^k = (A_1, \ldots, A_k)} \frac{1}{k-1}\, D\Big( P_{X_1 \cdots X_m} \,\Big\|\, \prod_{i=1}^{k} P_{X_{A_i}} \Big),
$$
where the inner minimum is over partitions $\mathcal{A}^k = (A_1, \ldots, A_k)$ of $\mathcal{M}$ into $k$ nonempty parts; $SI$ equals 0 iff $P_{X_1 \cdots X_m} = P_{X_A} P_{X_{A^c}}$ for some $A \subsetneq \mathcal{M}$. Does shared information have an operational significance as a measure of the mutual dependence among the rvs $X_1, \ldots, X_m$?
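To make the formula concrete (an added sketch; the source here, $X_3 = X_1 \oplus X_2$ with $X_1, X_2$ independent unbiased bits, is my own standard example choice, and I use the identity $D(P \,\|\, \prod_i P_{X_{A_i}}) = \sum_i H(X_{A_i}) - H(X_1, \ldots, X_m)$):

```python
import numpy as np
from itertools import product

def H(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def partitions(items):
    """All partitions of a list into nonempty blocks."""
    if len(items) == 1:
        yield [items]
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

# Example source: X1, X2 independent unbiased bits, X3 = X1 XOR X2.
P = np.zeros((2, 2, 2))
for x1, x2 in product((0, 1), repeat=2):
    P[x1, x2, x1 ^ x2] = 0.25

m = P.ndim
Hjoint = H(P.ravel())

def H_block(block):
    other = tuple(i for i in range(m) if i not in block)
    return H(P.sum(axis=other).ravel())          # entropy of the marginal on `block`

# D(P || prod_i P_{A_i}) = sum_i H(X_{A_i}) - H(X_1,...,X_m), so:
SI = min(
    (sum(H_block(b) for b in part) - Hjoint) / (len(part) - 1)
    for part in partitions(list(range(m))) if len(part) >= 2
)
print(f"SI(X1, X2, X3) = {SI:.3f} bits")         # -> 0.500
```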

Finally: applications.

Omniscience

[I. Csiszár - P. Narayan] For $L = (X_1, \ldots, X_m)$, Theorem 2 gives
$$
H(F) \ge H(X_1, \ldots, X_m) - SI(X_1, \ldots, X_m) - m\nu,
$$
which, for $m = 2$, is
$$
H(F) \ge H(X_1 \mid X_2) + H(X_2 \mid X_1) - 2\nu. \qquad \text{[Slepian-Wolf]}
$$
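A quick numeric instance (my own, continuing the XOR source from the shared-information sketch above): for $X_3 = X_1 \oplus X_2$ with $X_1, X_2$ independent unbiased bits, $H(X_1, X_2, X_3) = 2$ bits and $SI(X_1, X_2, X_3) = 0.5$ bits, so any communication achieving omniscience must satisfy $H(F) \ge 2 - 0.5 = 1.5$ bits per observation, asymptotically.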

Recovery of a Single Signal

[S. Nitinawarat - P. Narayan] With $L = X_1$, Theorem 2 gives
$$
H(F) \ge H(X_1) - SI(X_1, \ldots, X_m) - m\nu,
$$
which, for $m = 2$, gives
$$
H(F) \ge I(X_1 \wedge F) \ge H(X_1 \mid X_2) - 2\nu. \qquad \text{[Slepian-Wolf]}
$$

Secret Common Randomness

Terminals $1, \ldots, m$ generate CR $L$ satisfying the secrecy condition $I(L \wedge F) = 0$. By Theorem 2,
$$
H(L) = H(L \mid F) \le SI(X_1, \ldots, X_m) + m\nu.
$$
Secret key generation [I. Csiszár - P. Narayan]; secure function computation [H. Tyagi - P. Narayan].

Shared Information and a Hypothesis Testing Problem

$$
SI(X_1, \ldots, X_m) = \min_{2 \le k \le m}\ \min_{\mathcal{A}^k = (A_1, \ldots, A_k)} \frac{1}{k-1}\, D\Big( P_{X_1 \cdots X_m} \,\Big\|\, \prod_{i=1}^{k} P_{X_{A_i}} \Big)
$$
is related to the exponent of the probability of error of the second kind for an appropriate binary composite hypothesis testing problem, involving restricted CR $L$ and communication $F$. See H. Tyagi and S. Watanabe, "Converses for secret key agreement and secure computing," IEEE Trans. Information Theory, September 2015.

In Closing...

How useful is the concept of shared information? A: it has operational meaning in specific cases of distributed processing. For instance, consider $n$ i.i.d. repetitions (say, in time) of the rvs $X_1, \ldots, X_m$: the data at time instant $t$ is $(X_{1t}, \ldots, X_{mt})$, $t = 1, \ldots, n$, and terminal $i$ observes the i.i.d. data $X_{i1}, \ldots, X_{in}$, $i \in \mathcal{M}$. Shared information-based results are asymptotically tight in $n$ for:

- the minimum rate of communication for omniscience;
- the maximum rate of a secret key;
- a necessary condition for secure function computation;
- several problems in information-theoretic cryptography.

Shared Information: Many, Many Open Questions...

- Significance in network source and channel coding?
- Interactive communication over noisy channels?
- Communication links described by an undirected graph?
- Continuous-time models?
- ...
