
SHARED INFORMATION

Prakash Narayan
with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe

Outline

- Two-terminal model: Mutual information
  - Operational meaning in:
    - Channel coding: channel capacity
    - Lossy source coding: rate distortion function
    - Binary hypothesis testing: Stein's lemma
- Interactive communication and common randomness
  - Two-terminal model: Mutual information
  - Multiterminal model: Shared information
- Applications



Mutual Information

Mutual information is a measure of the mutual dependence between two rvs.

Let $X_1$ and $X_2$ be $\mathbb{R}$-valued rvs with joint probability distribution $P_{X_1X_2}$. The mutual information between $X_1$ and $X_2$ is
$$ I(X_1 \wedge X_2) = \begin{cases} \mathbb{E}_{P_{X_1X_2}}\!\left[\log \dfrac{dP_{X_1X_2}}{d(P_{X_1} \times P_{X_2})}(X_1, X_2)\right], & \text{if } P_{X_1X_2} \ll P_{X_1} \times P_{X_2}, \\[4pt] \infty, & \text{if } P_{X_1X_2} \not\ll P_{X_1} \times P_{X_2}, \end{cases} $$
$$ = D\left( P_{X_1X_2} \,\|\, P_{X_1} \times P_{X_2} \right), \quad \text{the Kullback-Leibler divergence.} $$

When $X_1$ and $X_2$ are finite-valued,
$$ I(X_1 \wedge X_2) = H(X_1) + H(X_2) - H(X_1, X_2) = H(X_1) - H(X_1 \mid X_2) = H(X_2) - H(X_2 \mid X_1) = H(X_1, X_2) - \left[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \right]. $$
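To make the finite-alphabet identity concrete, here is a minimal added Python sketch (not from the talk; the joint pmf is a made-up example) that computes $I(X_1 \wedge X_2)$ as $H(X_1) + H(X_2) - H(X_1, X_2)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as a numpy array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(p_joint):
    """I(X1 ^ X2) = H(X1) + H(X2) - H(X1, X2) for a joint pmf matrix."""
    p1 = p_joint.sum(axis=1)   # marginal of X1
    p2 = p_joint.sum(axis=0)   # marginal of X2
    return entropy(p1) + entropy(p2) - entropy(p_joint.ravel())

# Hypothetical joint pmf on {0,1} x {0,1}
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(mutual_information(P))   # ~0.278 bits; equals 0 iff P factorizes
```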

Channel Coding

Let $\mathcal{X}_1$ and $\mathcal{X}_2$ be finite alphabets, and $W : \mathcal{X}_1 \to \mathcal{X}_2$ be a stochastic matrix.

[Figure: message $m \in \{1,\ldots,M\}$ → encoder $f$: $x_{11},\ldots,x_{1n} = f(m)$ → DMC → $x_{21},\ldots,x_{2n}$ → decoder $\varphi$: $\hat{m}$]

Discrete memoryless channel (DMC):
$$ W^n\left( x_{21},\ldots,x_{2n} \mid x_{11},\ldots,x_{1n} \right) = \prod_{i=1}^{n} W\left( x_{2i} \mid x_{1i} \right). $$

Channel Capacity

[Figure: as above]

Goal: Make the code rate $\frac{1}{n}\log M$ as large as possible while keeping
$$ \max_{m}\ P\left( \varphi(X_{21},\ldots,X_{2n}) \neq m \,\middle|\, (X_{11},\ldots,X_{1n}) = f(m) \right) $$
small, in the asymptotic sense as $n \to \infty$.

[C.E. Shannon, 1948] Channel capacity:
$$ C = \max_{P_{X_1}:\ P_{X_2 \mid X_1} = W} I(X_1 \wedge X_2). $$
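Numerically, the maximization over input distributions can be carried out with the standard Blahut-Arimoto iteration. A minimal added sketch (the binary symmetric channel example and iteration count are our assumptions, not from the talk):

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Approximate C = max over P_X1 of I(X1 ^ X2) for a DMC with
    transition matrix W (rows: inputs, cols: outputs); result in bits."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])   # start from the uniform input
    for _ in range(iters):
        q = p @ W                                # induced output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
        p *= np.exp2(d)                          # multiplicative update
        p /= p.sum()
    # I(X1 ^ X2) at the current input: sum_x p(x) D(W(.|x) || q)
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
    return float(p @ d)

# Binary symmetric channel, crossover 0.1: C = 1 - h(0.1) ~ 0.531 bits
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(blahut_arimoto(W))
```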

Lossy Source Coding

Let $\{X_{1t}\}_{t=1}^{\infty}$ be an $\mathcal{X}_1$-valued i.i.d. source.

[Figure: $x_{11},\ldots,x_{1n}$ → encoder $f$: $j = f(x_{11},\ldots,x_{1n}) \in \{1,\ldots,J\}$ → decoder $\varphi$: $x_{21},\ldots,x_{2n} = \varphi(j)$]

Distortion measure:
$$ d\left( x_{11},\ldots,x_{1n},\ x_{21},\ldots,x_{2n} \right) = \frac{1}{n} \sum_{i=1}^{n} d\left( x_{1i}, x_{2i} \right). $$

Rate Distortion Function

[Figure: as above]

Goal: Make the compression code rate $\frac{1}{n}\log J$ as small as possible while keeping
$$ P\left( \frac{1}{n} \sum_{i=1}^{n} d\left( X_{1i}, X_{2i} \right) \le \Delta \right) $$
large, in the asymptotic sense as $n \to \infty$.

[Shannon, 1948, 1959] Rate distortion function:
$$ R(\Delta) = \min_{P_{X_2 \mid X_1}:\ \mathbb{E}\left[ d(X_1, X_2) \right] \le \Delta} I(X_1 \wedge X_2). $$
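As a concrete instance (added here; not on the slide): for the classical case of a Bernoulli($p$) source with Hamming distortion, the minimization has the closed form $R(\Delta) = h(p) - h(\Delta)$ for $\Delta \le \min(p, 1-p)$, where $h$ is the binary entropy. A tiny sketch:

```python
import numpy as np

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def rate_distortion_bernoulli(p, delta):
    """R(delta) = h(p) - h(delta) for a Bernoulli(p) source under
    Hamming distortion, valid for 0 <= delta < min(p, 1-p); else 0."""
    if delta >= min(p, 1 - p):
        return 0.0
    return h(p) - h(delta)

print(rate_distortion_bernoulli(0.5, 0.1))   # ~0.531 bits per symbol
```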

Simple Binary Hypothesis Testing

Let $\{(X_{1t}, X_{2t})\}_{t=1}^{\infty}$ be an $\mathcal{X}_1 \times \mathcal{X}_2$-valued i.i.d. process generated according to
$$ H_0: P_{X_1X_2} \quad \text{or} \quad H_1: P_{X_1} \times P_{X_2}. $$

Test: decides $H_0$ w.p. $T\left( 0 \mid x_{11},\ldots,x_{1n},\ x_{21},\ldots,x_{2n} \right)$, and $H_1$ w.p. $T\left( 1 \mid x_{11},\ldots,x_{1n},\ x_{21},\ldots,x_{2n} \right) = 1 - T\left( 0 \mid \cdot \right)$.

Stein's lemma [H. Chernoff, 1956]: For every $0 < \epsilon < 1$,
$$ \lim_{n} -\frac{1}{n} \log \inf_{T:\ P_{H_0}(T \text{ says } H_0) \ge 1 - \epsilon} P_{H_1}\left( T \text{ says } H_0 \right) = D\left( P_{X_1X_2} \,\|\, P_{X_1} \times P_{X_2} \right) = I(X_1 \wedge X_2). $$
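As an added numerical check (reusing the made-up joint pmf from the earlier sketch): computing $D(P_{X_1X_2} \,\|\, P_{X_1} \times P_{X_2})$ directly recovers the same value as the mutual information, i.e., the Stein exponent of the optimal type-II error:

```python
import numpy as np

def kl(p, q):
    """D(p || q) in bits; assumes p is absolutely continuous w.r.t. q."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Hypothetical joint pmf P_{X1 X2} and the product of its marginals
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
Q = np.outer(P.sum(axis=1), P.sum(axis=0))   # P_{X1} x P_{X2}

# Stein exponent of the optimal type-II error: D(P || Q) = I(X1 ^ X2)
print(kl(P.ravel(), Q.ravel()))              # ~0.278 bits
```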


Multiterminal Model

[Figure: terminals observing $X_1, X_2, \ldots, X_m$, connected to a communication network carrying $\mathbf{F}$]

Set of terminals: $\mathcal{M} = \{1,\ldots,m\}$.
$X_1,\ldots,X_m$ are finite-valued rvs with known joint distribution $P_{X_1 \ldots X_m}$ on $\mathcal{X}_1 \times \cdots \times \mathcal{X}_m$.
Terminal $i \in \mathcal{M}$ observes data $X_i$.
Multiple rounds of interactive communication on a noiseless channel of unlimited capacity; all terminals hear all communication.

Interactive Communication

Assume: communication occurs in consecutive time slots, in $r$ rounds. The corresponding rvs representing the communication are
$$ \mathbf{F} = \mathbf{F}(X_1,\ldots,X_m) = \left( F_{11},\ldots,F_{1m},\ F_{21},\ldots,F_{2m},\ \ldots,\ F_{r1},\ldots,F_{rm} \right), $$
$$ F_{11} = f_{11}(X_1), \quad F_{12} = f_{12}(X_2, F_{11}), \quad \ldots, \quad F_{ji} = f_{ji}(X_i;\ \text{all previous communication}). $$

Simple communication: $\mathbf{F} = (F_1,\ldots,F_m)$, with $F_i = f_i(X_i)$, $1 \le i \le m$.

A. Yao, "Some complexity questions related to distributive computing," Proc. Annual Symposium on Theory of Computing, 1979.

Applications

[Figure: as above]

- Data exchange: omniscience
- Signal recovery: data compression
- Function computation
- Cryptography: secret key generation

Example: Function Computation [S. Watanabe]

[Figure: Terminal 1 observing $X_1 = (X_{11}, X_{12})$ and Terminal 2 observing $X_2 = (X_{21}, X_{22})$, exchanging $F_1$ and $F_2$]

$X_{11}, X_{12}, X_{21}, X_{22}$ are mutually independent fair bits (each equal to 0 or 1 w.p. 0.5).

Terminals 1 and 2 wish to compute
$$ G = g(X_1, X_2) = \mathbb{1}\left( (X_{11}, X_{12}) = (X_{21}, X_{22}) \right). $$

Simple communication: $\mathbf{F} = (F_1, F_2)$ with $F_1 = (X_{11}, X_{12})$, $F_2 = (X_{21}, X_{22})$.
Communication complexity: $H(\mathbf{F}) = 4$ bits.
No privacy: Terminal 1 or 2, or an observer of $\mathbf{F}$, learns all the data $(X_1, X_2)$.


Example: Function Computation [S. Watanabe] (contd.)

An interactive communication protocol: $\mathbf{F} = (F_{11}, F_{12})$ with $F_{11} = (X_{11}, X_{12})$, $F_{12} = G$.
Complexity: $H(\mathbf{F}) = 2 + h(0.25) \approx 2.81$ bits.
Some privacy: Terminal 2, or an observer of $\mathbf{F}$, learns $X_1$; Terminal 1, or an observer of $\mathbf{F}$, either learns $X_2$ (w.p. 0.25, when $G = 1$) or learns only (w.p. 0.75) that $X_2$ differs from $X_1$.

Can a communication complexity of 2.81 bits be bettered?
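The two entropy figures can be verified by brute force over the 16 equally likely realizations; a minimal added Python sketch (the helper names are ours, not from the talk):

```python
import itertools
import numpy as np
from collections import Counter

def entropy_bits(counter, total):
    """Entropy in bits of the empirical pmf in `counter`."""
    return -sum((c / total) * np.log2(c / total) for c in counter.values())

outcomes = list(itertools.product([0, 1], repeat=4))  # (x11, x12, x21, x22)

simple = Counter()       # F = (X11, X12, X21, X22)
interactive = Counter()  # F = ((X11, X12), G)
for x11, x12, x21, x22 in outcomes:
    g = int((x11, x12) == (x21, x22))
    simple[(x11, x12, x21, x22)] += 1
    interactive[((x11, x12), g)] += 1

print(entropy_bits(simple, 16))       # 4.0 bits
print(entropy_bits(interactive, 16))  # 2 + h(1/4) ~ 2.811 bits
```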

Related Work

Exact function computation:
- Yao 79: Communication complexity.
- Gallager 88: Algorithm for parity computation in a network.
- Giridhar-Kumar 05: Algorithms for computing functions over sensor networks.
- Freris-Kowshik-Kumar 10: Survey: connectivity, capacity, clocks, computation in large sensor networks.
- Orlitsky-El Gamal 84: Communication complexity with secrecy.

Information theoretic function computation:
- Körner-Marton 79: Minimum rate for computing parity.
- Orlitsky-Roche 01: Two-terminal function computation.
- Nazer-Gastpar 07: Computation over noisy channels.
- Ma-Ishwar 08: Distributed source coding for interactive computing.
- Ma-Ishwar-Gupta 09: Multiround function computation in colocated networks.
- Tyagi-Gupta-Narayan 11: Secure function computation.
- Tyagi-Watanabe 13, 14: Secrecy generation, secure computing.

Compressing interactive communication:
- Schulman 92: Coding for interactive communication.
- Braverman-Rao 10: Information complexity of communication.
- Kol-Raz 13, Haeupler 14: Interactive communication over noisy channels.

Mathematical Economics: Mechanism Design

Thomas Marschak and Stefan Reichelstein, "Communication requirements for individual agents in networks and hierarchies," in The Economics of Informational Decentralization: Complexity, Efficiency and Stability: Essays in Honor of Stanley Reiter, John O. Ledyard, Ed., Springer.

Kenneth R. Mount and Stanley Reiter, Computation and Complexity in Economic Behavior and Organization, Cambridge U. Press.

Courtesy: Demos Teneketzis

Common Randomness

[Figure: terminals observing $X_1, X_2, \ldots, X_m$, communication $\mathbf{F}$, local estimates $L_1, L_2, \ldots, L_m$ of $L$]

For $0 \le \epsilon < 1$, given interactive communication $\mathbf{F}$, a rv $L = L(X_1,\ldots,X_m)$ is $\epsilon$-CR for the terminals in $\mathcal{M}$ using $\mathbf{F}$ if there exist local estimates $L_i = L_i(X_i, \mathbf{F})$, $i \in \mathcal{M}$, of $L$ satisfying
$$ P\left( L_i = L,\ i \in \mathcal{M} \right) \ge 1 - \epsilon. $$

Common Randomness: Examples

- Data exchange, omniscience: $L = (X_1,\ldots,X_m)$.
- Signal recovery, data compression: $L = X_i$, for some fixed $i \in \mathcal{M}$.
- Function computation: $L = g(X_1,\ldots,X_m)$ for a given $g$.
- Cryptography: secret CR, i.e., a secret key: $L$ with $I(L \wedge \mathbf{F}) = 0$.


A Basic Operational Question

[Figure: as above]

What is the maximal CR, as measured by $H(L \mid \mathbf{F})$, that can be generated by a given interactive communication $\mathbf{F}$ for a distributed processing task?

Answer in two steps:
- a fundamental structural property of interactive communication;
- an upper bound on the amount of CR achievable with interactive communication.

We shall start with the case of $m = 2$ terminals.

Fundamental Property of Interactive Communication

[Figure: Terminals 1 and 2, observing $X_1$ and $X_2$, linked by a communication network carrying $\mathbf{F}$]

Lemma [U. Maurer], [R. Ahlswede - I. Csiszár]: For interactive communication $\mathbf{F}$ of Terminals 1 and 2 observing data $X_1$ and $X_2$, respectively,
$$ I(X_1 \wedge X_2 \mid \mathbf{F}) \le I(X_1 \wedge X_2). $$
In particular, independent rvs $X_1, X_2$ remain so upon conditioning on an interactive communication.

Fundamental Property of Interactive Communication (contd.)

Proof: For interactive communication $\mathbf{F} = (F_{11}, F_{12}, \ldots, F_{r1}, F_{r2})$, using that $F_{11}$ is a function of $X_1$ and $F_{12}$ is a function of $(X_2, F_{11})$,
$$ I(X_1 \wedge X_2) = I(X_1, F_{11} \wedge X_2) \ge I(X_1 \wedge X_2 \mid F_{11}) = I(X_1 \wedge X_2, F_{12} \mid F_{11}) \ge I(X_1 \wedge X_2 \mid F_{11}, F_{12}), $$
followed by iteration.

An Equivalent Form

For interactive communication $\mathbf{F}$ of Terminals 1 and 2:
$$ I(X_1 \wedge X_2 \mid \mathbf{F}) \le I(X_1 \wedge X_2) \iff H(\mathbf{F}) \ge H(\mathbf{F} \mid X_1) + H(\mathbf{F} \mid X_2). $$
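As an added sanity check, for the interactive protocol of the earlier example ($F_{11} = (X_{11}, X_{12})$, $F_{12} = G$) the two sides coincide, consistent with the data at the two terminals being independent:

```python
import itertools
import numpy as np
from collections import Counter

def ent(counts):
    """Entropy in bits of an empirical pmf given by a Counter of counts."""
    total = sum(counts.values())
    return -sum((c / total) * np.log2(c / total) for c in counts.values())

pF, pFX1, pX1, pFX2, pX2 = (Counter() for _ in range(5))
for x11, x12, x21, x22 in itertools.product([0, 1], repeat=4):
    g = int((x11, x12) == (x21, x22))
    f = ((x11, x12), g)                      # F = (F11, F12)
    x1, x2 = (x11, x12), (x21, x22)
    pF[f] += 1
    pFX1[(f, x1)] += 1; pX1[x1] += 1
    pFX2[(f, x2)] += 1; pX2[x2] += 1

hF = ent(pF)
hF_given_X1 = ent(pFX1) - ent(pX1)           # H(F|X1) = H(F,X1) - H(X1)
hF_given_X2 = ent(pFX2) - ent(pX2)
print(hF, hF_given_X1 + hF_given_X2)         # both ~2.811 bits: equality
```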

Upper Bound on CR for Two Terminals

[Figure: Terminals 1 and 2 with data $X_1, X_2$, communication $\mathbf{F}$, estimates $L_1, L_2$ of $L$]

Using
- $L$ is $\epsilon$-CR for Terminals 1 and 2 with interactive communication $\mathbf{F}$, and
- $H(\mathbf{F}) \ge H(\mathbf{F} \mid X_1) + H(\mathbf{F} \mid X_2)$,

we get
$$ H(L \mid \mathbf{F}) \le H(X_1, X_2) - \left[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \right] + 2\nu(\epsilon), $$
where $\lim_{\epsilon \to 0} \nu(\epsilon) = 0$.

Maximum CR for Two Terminals: Mutual Information

Lemma [I. Csiszár - P. Narayan]: Let $L$ be any $\epsilon$-CR for Terminals 1 and 2, observing data $X_1$ and $X_2$ respectively, achievable with interactive communication $\mathbf{F}$. Then
$$ H(L \mid \mathbf{F}) \le I(X_1 \wedge X_2) = D\left( P_{X_1X_2} \,\|\, P_{X_1} \times P_{X_2} \right). $$

Remark: When $\{(X_{1t}, X_{2t})\}_{t=1}^{\infty}$ is an $\mathcal{X}_1 \times \mathcal{X}_2$-valued i.i.d. process, the upper bound is attained.

Interactive Communication for $m \ge 2$ Terminals

Theorem 1 [I. Csiszár - P. Narayan]: For interactive communication $\mathbf{F}$ of the terminals $i \in \mathcal{M} = \{1,\ldots,m\}$, with Terminal $i$ observing data $X_i$,
$$ H(\mathbf{F}) \ge \sum_{B \in \mathcal{B}} \lambda_B\, H(\mathbf{F} \mid X_{B^c}) $$
for every family $\mathcal{B} = \{B \subsetneq \mathcal{M},\ B \neq \emptyset\}$ and set of weights (a fractional partition) $\{\lambda_B : 0 \le \lambda_B \le 1,\ B \in \mathcal{B}\}$ satisfying
$$ \sum_{B \in \mathcal{B}:\ B \ni i} \lambda_B = 1, \quad i \in \mathcal{M}. $$
Equality holds if $X_1,\ldots,X_m$ are mutually independent.

Special case of: M. Madiman and P. Tetali, "Information inequalities for joint distributions, with interpretations and applications," IEEE Trans. Inform. Theory, June 2010.

CR for $m \ge 2$ Terminals: A Suggestive Analogy

[S. Nitinawarat - P. Narayan] For interactive communication $\mathbf{F}$ of the terminals $i \in \mathcal{M} = \{1,\ldots,m\}$, with Terminal $i$ observing data $X_i$:

$m = 2$:
$$ H(\mathbf{F}) \ge H(\mathbf{F} \mid X_1) + H(\mathbf{F} \mid X_2) \iff I(X_1 \wedge X_2 \mid \mathbf{F}) \le I(X_1 \wedge X_2). $$

$m \ge 2$:
$$ H(\mathbf{F}) \ge \sum_{B \in \mathcal{B}} \lambda_B\, H(\mathbf{F} \mid X_{B^c}) \iff H(X_1,\ldots,X_m \mid \mathbf{F}) - \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}, \mathbf{F}) \le H(X_1,\ldots,X_m) - \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}). $$

An Analogy (contd.)

Does the right side,
$$ H(X_1,\ldots,X_m) - \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}), $$
suggest a measure of mutual dependence among the rvs $X_1,\ldots,X_m$?

Maximum CR for $m \ge 2$ Terminals: Shared Information

Theorem 2 [I. Csiszár - P. Narayan]: Given $0 \le \epsilon < 1$, for an $\epsilon$-CR $L$ for $\mathcal{M}$ achieved with interactive communication $\mathbf{F}$,
$$ H(L \mid \mathbf{F}) \le H(X_1,\ldots,X_m) - \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}) + m\nu $$
for every fractional partition $\lambda$ of $\mathcal{M}$, with $\nu = \nu(\epsilon) = \epsilon \log |\mathcal{L}| + h(\epsilon)$.

Remarks:
- The proof of Theorem 2 relies on Theorem 1.
- When $\{(X_{1t},\ldots,X_{mt})\}_{t=1}^{\infty}$ is an i.i.d. process, the upper bound is attained.

Shared Information

Theorem 2 [I. Csiszár - P. Narayan] gives
$$ H(L \mid \mathbf{F}) \le H(X_1,\ldots,X_m) - \max_{\lambda} \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}) =: SI(X_1,\ldots,X_m). $$

Extensions

Theorems 1 and 2 extend to:
- random variables with densities [S. Nitinawarat - P. Narayan];
- a larger class of probability measures [H. Tyagi - P. Narayan].


Shared Information and Kullback-Leibler Divergence

[I. Csiszár - P. Narayan, C. Chan - L. Zheng]
$$ SI(X_1,\ldots,X_m) = H(X_1,\ldots,X_m) - \max_{\lambda} \sum_{B \in \mathcal{B}} \lambda_B\, H(X_B \mid X_{B^c}). $$

For $m = 2$:
$$ SI(X_1, X_2) = H(X_1, X_2) - \left[ H(X_1 \mid X_2) + H(X_2 \mid X_1) \right] = I(X_1 \wedge X_2) = D\left( P_{X_1X_2} \,\|\, P_{X_1} \times P_{X_2} \right). $$

For $m \ge 2$:
$$ SI(X_1,\ldots,X_m) = \min_{2 \le k \le m}\ \min_{\mathcal{A}_k = (A_1,\ldots,A_k)} \frac{1}{k-1}\, D\!\left( P_{X_1 \ldots X_m} \,\Big\|\, \prod_{i=1}^{k} P_{X_{A_i}} \right), $$
where the inner minimum is over partitions $\mathcal{A}_k = (A_1,\ldots,A_k)$ of $\mathcal{M}$ into $k$ nonempty atoms, and $SI = 0$ iff $P_{X_1 \ldots X_m} = P_{X_A} \times P_{X_{A^c}}$ for some nonempty $A \subsetneq \mathcal{M}$.

Does shared information have an operational significance as a measure of the mutual dependence among the rvs $X_1,\ldots,X_m$?
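The partition characterization lends itself to direct computation for small $m$. A minimal added Python sketch (brute force over set partitions; the three-bit example is ours): for three perfectly correlated fair bits every partition yields 1 bit, so $SI = 1$.

```python
import itertools
import numpy as np

def set_partitions(items):
    """Generate all set partitions of a list."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in set_partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield part + [[first]]

def shared_information(P):
    """SI(X1,...,Xm): min over partitions (A1,...,Ak), k >= 2, of
    D(P || prod_i P_{X_{A_i}}) / (k - 1), in bits.
    P is an m-dimensional joint pmf array (axis i <-> X_{i+1})."""
    m = P.ndim
    def marginal(A):                         # A: sorted tuple of kept axes
        return P.sum(axis=tuple(i for i in range(m) if i not in A))
    best = np.inf
    for part in set_partitions(list(range(m))):
        if len(part) < 2:                    # need at least two atoms
            continue
        blocks = [tuple(sorted(A)) for A in part]
        margs = [marginal(A) for A in blocks]
        div = 0.0
        for x in itertools.product(*(range(s) for s in P.shape)):
            if P[x] > 0:
                q = 1.0
                for A, marg in zip(blocks, margs):
                    q *= marg[tuple(x[i] for i in A)]
                div += P[x] * np.log2(P[x] / q)
        best = min(best, div / (len(part) - 1))
    return float(best)

# Example: three perfectly correlated fair bits
P = np.zeros((2, 2, 2))
P[0, 0, 0] = P[1, 1, 1] = 0.5
print(shared_information(P))                 # 1.0
```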


Omniscience

[Figure: as above]

[I. Csiszár - P. Narayan] For $L = (X_1,\ldots,X_m)$, Theorem 2 gives
$$ H(\mathbf{F}) \ge H(X_1,\ldots,X_m) - SI(X_1,\ldots,X_m), $$
which, for $m = 2$, is
$$ H(\mathbf{F}) \ge H(X_1 \mid X_2) + H(X_2 \mid X_1). \quad \text{[Slepian-Wolf]} $$

Signal Recovery: Data Compression

[Figure: as above]

[S. Nitinawarat - P. Narayan] With $L = X_1$, Theorem 2 gives
$$ H(\mathbf{F}) \ge H(X_1) - SI(X_1,\ldots,X_m), $$
which, for $m = 2$, gives
$$ H(\mathbf{F}) \ge H(X_1 \mid X_2). \quad \text{[Slepian-Wolf]} $$

Secret Common Randomness

[Figure: as above]

Terminals $1,\ldots,m$ generate CR $L$ satisfying the secrecy condition $I(L \wedge \mathbf{F}) = 0$. By Theorem 2,
$$ H(L) = H(L \mid \mathbf{F}) \le SI(X_1,\ldots,X_m). $$

- Secret key generation [I. Csiszár - P. Narayan]
- Secure function computation [H. Tyagi - P. Narayan]

Querying Common Randomness

[Figure: terminals observing $X_1^n, X_2^n, \ldots, X_m^n$, communication $\mathbf{F}$, estimates $L_1, \ldots, L_m$ of $L$]

[H. Tyagi - P. Narayan] A querier observes the communication $\mathbf{F}$ and seeks to resolve the value of the CR $L$ by asking questions of the form "Is $L = l$?", with yes-no answers. The terminals in $\mathcal{M}$ seek to generate $L$ using $\mathbf{F}$ so as to make the querier's burden as onerous as possible.

What is the largest query exponent?

Largest Query Exponent

[Figure: as above]

$$ E^* = \sup\left\{ E : \inf_{q} P\left( q(L \mid \mathbf{F}) \ge 2^{nE} \right) \to 1 \ \text{as } n \to \infty \right\}, $$
where $q(L \mid \mathbf{F})$ is the number of queries made by a querier with strategy $q$ upon observing $\mathbf{F}$.

$$ E^* = SI(X_1,\ldots,X_m). $$

Shared Information and a Hypothesis Testing Problem

$$ SI(X_1,\ldots,X_m) = \min_{2 \le k \le m}\ \min_{\mathcal{A}_k = (A_1,\ldots,A_k)} \frac{1}{k-1}\, D\!\left( P_{X_1 \ldots X_m} \,\Big\|\, \prod_{i=1}^{k} P_{X_{A_i}} \right) $$

This is related to the exponent of the probability of error of the second kind for an appropriate binary composite hypothesis testing problem, involving a restricted CR $L$ and communication $\mathbf{F}$.

H. Tyagi and S. Watanabe, "Converses for secret key agreement and secure computing," IEEE Trans. Information Theory, September 2015.


In Closing...

How useful is the concept of shared information?
A: It has operational meaning in specific cases of distributed processing.

For instance, consider $n$ i.i.d. repetitions (say, in time) of the rvs $X_1,\ldots,X_m$:
- the data at time instant $t$ is $(X_{1t},\ldots,X_{mt})$, $t = 1,\ldots,n$;
- Terminal $i$ observes the i.i.d. data $X_{i1},\ldots,X_{in}$, $i \in \mathcal{M}$.

Shared information-based results are asymptotically tight in $n$:
- minimum rate of communication for omniscience;
- maximum rate of a secret key;
- largest query exponent;
- necessary condition for secure function computation;
- several problems in information theoretic cryptography.

Shared Information: Many Open Questions...

- Significance in network source and channel coding?
- Interactive communication over noisy channels?
- Data-clustering applications? [C. Chan - A. Al-Bashabsheh - Q. Zhou - T. Kaced - T. Liu, 2016]
