IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 4, MAY 1999

Distributed Source Coding for Satellite Communications

Raymond W. Yeung, Senior Member, IEEE, and Zhen Zhang, Senior Member, IEEE

Abstract: Inspired by mobile satellite communications systems, we consider a source coding system which consists of multiple sources, multiple encoders, and multiple decoders. Each encoder has access to a certain subset of the sources, each decoder has access to a certain subset of the encoders, and each decoder reconstructs a certain subset of the sources almost perfectly. The connectivity between the sources and the encoders, the connectivity between the encoders and the decoders, and the reconstruction requirements for the decoders are all arbitrary. Our goal is to characterize the admissible coding rate region. Despite the generality of the problem, we have developed an approach which enables us to study all cases on the same footing. We obtain inner and outer bounds of the admissible coding rate region in terms of $\Gamma^*_N$ and $\bar{\Gamma}^*_N$, respectively, which are fundamental regions in the entropy space recently defined by Yeung. So far, there has not been a full characterization of $\Gamma^*_N$ or $\bar{\Gamma}^*_N$, so these bounds cannot be evaluated explicitly except for some special cases. Nevertheless, we obtain an alternative outer bound which can be evaluated explicitly. We show that this bound is tight for the special cases for which the admissible coding rate region is known. The model we study in this paper is more general than all previously reported models on multilevel diversity coding, and the tools we use are new in multiuser information theory.

Index Terms: Diversity coding, multiterminal source coding, multiuser information theory, satellite communication.

I. INTRODUCTION

A mobile satellite communication system, like Motorola's Iridium™ System and Qualcomm's GlobalStar™ System, provides telephone and data services to mobile users within a certain geographical range through satellite links. For example, the Iridium™ System covers users anywhere in the world, while the GlobalStar™ System covers users between ±70° latitude. At any time, a mobile user is covered by one or more satellites. Through satellite links, the message from a mobile user is transmitted to a certain set of mobile users within the system. In a generic mobile satellite communication system, transmitters and receivers are not necessarily colocated. In principle, a transmitter can transmit information to all the satellites within the line of sight simultaneously; a satellite can combine, encode, and broadcast the information it receives from all the transmitters it covers; and a receiver can combine and decode the information it receives from all the satellites within the line of sight. We are motivated to study a new problem called the distributed source coding problem.

Manuscript received December 12, 1996; revised November 4. The work of R. W. Yeung was supported in part by the Research Grant Council of Hong Kong under Earmarked Grant CUHK 332/96. The work of Z. Zhang was supported in part by the National Science Foundation under Grant NCR. R. W. Yeung is with the Department of Information Engineering, The Chinese University of Hong Kong, N.T., Hong Kong, China (e-mail: whyeung@ie.cuhk.edu.hk). Z. Zhang is with the Communication Sciences Institute, Department of Electrical Engineering Systems, University of Southern California, Los Angeles, CA USA (e-mail: zzhang@milly.usc.edu). Communicated by K. Zeger, Editor At Large.
A distributed source coding system consists of multiple sources, multiple encoders, and multiple decoders. Each encoder has access to a certain subset of the sources, each decoder has access to a certain subset of the encoders, and each decoder reconstructs a certain subset of the sources almost perfectly. The connectivity between the sources and the encoders, the connectivity between the encoders and the decoders, and the reconstruction requirements for the decoders are all arbitrary. The sources, the encoders, and the decoders in a distributed source coding system correspond to the transmitters, the satellites, and the receivers in a mobile satellite communication system, respectively. The set of encoders connected to a source refers to the set of satellites within the line of sight of a transmitter, while the set of decoders connected to an encoder refers to the set of receivers covered by a satellite.

Throughout the paper, we will use a boldfaced letter to denote a vector. The th component of a vector is denoted by unless otherwise specified. For a random variable, we will use to denote the alphabet set of, and to denote a generic outcome of. For a set, we will use to denote the closure of. We will use to denote the probability of an event, and to denote the entropy of a set of random variables in base.

Let us now present the formal description of the problem. A distributed source coding system consists of the following elements:

1) , the index set of the information sources;
2) , the index set of the encoders;
3) , the index set of the decoders;
4) , the set of connections between the sources and the encoders;
5) , the set of connections between the encoders and the decoders;
6) , which specify the reconstruction requirements of the decoders.

The th source is denoted by , . We assume that are independent, and are independent and identically distributed (i.i.d.) copies of a generic random variable , where .
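The formal specification above lost its symbols in transcription, so as a rough illustration only, a small configuration in the spirit of Fig. 1 might be recorded as follows. All names here are hypothetical and are not the paper's notation; the sketch merely collects the six specifying elements in one place.

    from dataclasses import dataclass

    @dataclass
    class DistributedSourceCodingSystem:
        """Hypothetical container for the six elements that specify a system:
        the index sets of sources, encoders, and decoders, the two connection
        relations, and the reconstruction requirements of the decoders."""
        sources: set        # index set of the information sources
        encoders: set       # index set of the encoders
        decoders: set       # index set of the decoders
        src_to_enc: dict    # encoder index -> subset of sources it observes
        enc_to_dec: dict    # decoder index -> subset of encoders it receives
        reconstructs: dict  # decoder index -> subset of sources it must recover

    # A toy instance (not the example of Fig. 1, whose labels were lost).
    system = DistributedSourceCodingSystem(
        sources={1, 2},
        encoders={1, 2, 3},
        decoders={1, 2},
        src_to_enc={1: {1}, 2: {1, 2}, 3: {2}},
        enc_to_dec={1: {1, 2}, 2: {1, 2, 3}},
        reconstructs={1: {1}, 2: {1, 2}},
    )

A specification like this pins down everything the coding theorems below depend on: which encoder sees which sources, which decoder hears which encoders, and what each decoder must reproduce.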

The th encoder is denoted by , , and the th decoder is denoted by , . The sets specify the distributed source coding system as follows: has access to if and only if , has access to if and only if , and reconstructs . Let us illustrate the above notations by a small example. For the distributed source coding system in Fig. 1,

Fig. 1. A distributed source coding system.

To facilitate the description of our model, we define : contains the indices of the sources which are accessed by , and contains the indices of the encoders which are accessed by . Let be the Hamming distortion measure in ; i.e., for any , if , and if . An code is defined by . An -tuple is admissible if for every , there exists for sufficiently large an code such that . Let

is admissible

be the admissible coding rate region (henceforth the coding rate region if there is no ambiguity). The goal of this paper is to characterize . In Section II, we prove an inner bound for . In Section III, we prove an outer bound for . In Section IV, we give geometrical interpretations of and . These regions are defined in terms of and , which are fundamental regions recently defined by Yeung [10]. So far, there has not been a full characterization of either or , so explicit evaluation of and is not possible. Based on the geometrical interpretation of these regions, we obtain an outer bound for called the LP bound (for linear programming bound) which can be evaluated explicitly. This is described in Section V. In this section, we also show that the LP bound is tight for the special cases for which the admissible coding rate region is known. Concluding remarks are in Section VI.

II. INNER BOUND

Before we define , we first introduce some notation from the framework for information inequalities developed in [10]. Let be a finite set of discrete random variables whose joint distribution is unspecified, and let . Note that . Let denote the coordinates of . A vector is said to be constructible if there exists a joint distribution for such that . We then define

is constructible .

To simplify notation in the sequel, for any nonempty , we further define , where we have used juxtaposition to denote the union of two sets. In using the above definitions, we will not distinguish elements and singletons of ; i.e., for a random variable , is the same as . Let be discrete random variables whose joint distribution is unspecified, and let , where , i.e., the set containing all the random variables.
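The notion of a constructible vector can be made concrete with a short computation: given any joint distribution, listing the joint entropies of all nonempty subsets of the random variables yields a point of the entropy space, and every point obtained this way is constructible by definition. The sketch below is a generic illustration in plain Python, not the paper's notation.

    from itertools import combinations
    from math import log2

    def entropy(pmf):
        """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
        return -sum(p * log2(p) for p in pmf.values() if p > 0)

    def entropy_vector(joint, names):
        """Joint entropies H(X_A) for every nonempty subset A of the variables.

        `joint` maps tuples (x_1, ..., x_N) to probabilities; `names` labels the
        coordinates.  The result is a point in the (2^N - 1)-dimensional entropy space.
        """
        n = len(names)
        h = {}
        for r in range(1, n + 1):
            for subset in combinations(range(n), r):
                marginal = {}
                for outcome, p in joint.items():
                    key = tuple(outcome[i] for i in subset)
                    marginal[key] = marginal.get(key, 0.0) + p
                h[tuple(names[i] for i in subset)] = entropy(marginal)
        return h

    # Toy example: X uniform on {0, 1} and Y = X (fully dependent).
    joint = {(0, 0): 0.5, (1, 1): 0.5}
    print(entropy_vector(joint, ["X", "Y"]))
    # {('X',): 1.0, ('Y',): 1.0, ('X', 'Y'): 1.0} -- a constructible vector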

Here is an auxiliary random variable associated with the source , and is an auxiliary random variable associated with , the output of the encoder . The actual meaning of will become clear later. Let be the set of all such that there exists which satisfies the following conditions:

(1) for
(2) for
(3) for
(4) for
(5)

Note that (1)-(3) are hyperplanes, and (4) is an open halfspace in . We then define .

Theorem 1: .

Note that can be defined more conventionally as the set of all such that there exist auxiliary random variables satisfying

(6) for
(7) for
(8) for
(9) for
(10)

Although this alternative definition is more intuitive, the region so defined appears to be totally different from case to case. On the other hand, defining in terms of enables us to study all cases on the same footing. In particular, if is an explicit inner bound on , upon replacing by in the definition of , we immediately obtain an explicit inner bound on for all cases. (Unfortunately, no explicit inner bound on is available at this point; a nontrivial inner bound for has been obtained in [6] and [12].) The introduction of in Section V as an explicit outer bound on is in the same spirit.

Let us give the motivations of the conditions in (1)-(5) before we prove the theorem, although the meaning of these conditions cannot be fully explained until we come to the proof. The condition (1) corresponds to the assumption that the sources are independent. The condition (2) corresponds to the fact that the encoder has access to the sources . The condition (3) corresponds to the requirement that the sources can be reconstructed by the decoder . The condition (4) means that the entropy of the auxiliary random variable is strictly greater than the entropy rate of the source , and the condition (5) means that the coding rate of the encoder is greater than or equal to the entropy of the auxiliary random variable .

We will state the following lemma before we present the proof of the theorem. Since this lemma is a standard result, its proof will be omitted. We first recall the definitions of strong typicality of sequences [2]. A sequence is -typical with respect to a distribution if , where is the number of occurrences of the symbol in . Similarly, a pair of sequences is -typical with respect to a distribution if . In the sequel, the -typical notations in [4] will be adopted.

Lemma 1: Let be -vectors drawn according to . If , then (11), where denotes any function such that for some constant in a neighborhood of .

Proof of Theorem 1: We will prove the theorem by describing a random coding scheme and showing that it has the desired performance. Let and be small positive quantities and be a large integer to be specified later. Let . Then there exist random variables such that the left-hand sides of (1)-(5) are the corresponding Shannon information measures; i.e., (1)-(5) can be written as (6)-(10). The encoding scheme is described in the following steps:

1. Let be the distribution of , and let . For each , independently generate vectors in according to the distribution , where . Let be the set of the vectors generated, and denote these vectors by (12).

2. Let index the vectors in by the set . Since by (4), by taking small enough and sufficiently large, we can make . Define a mapping (13) as follows. For any , if , then ; if , then is equal to the index of in defined above.
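Strong typicality lends itself to a direct empirical check: count the occurrences of each symbol and compare the empirical frequencies with the target distribution. The following sketch is my own illustration of that test (one common variant of the definition), not the exact notation of [2] or [4].

    from collections import Counter

    def is_strongly_typical(sequence, pmf, delta):
        """Check strong (delta-)typicality of `sequence` with respect to `pmf`.

        The empirical frequency of every symbol must be within delta of its
        probability, and symbols of probability zero must not occur at all.
        """
        n = len(sequence)
        counts = Counter(sequence)
        if any(counts[x] > 0 and pmf.get(x, 0.0) == 0.0 for x in counts):
            return False
        return all(abs(counts.get(x, 0) / n - p) <= delta for x, p in pmf.items())

    # Example: a fair-coin distribution and a sequence with 12 ones out of 20.
    pmf = {0: 0.5, 1: 0.5}
    seq = [1] * 12 + [0] * 8
    print(is_strongly_typical(seq, pmf, delta=0.15))  # True, since |0.6 - 0.5| <= 0.15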

We note that for , is a constant vector in , and for , are distinct vectors in (for all , by (13), so is properly defined).

3. Let be the conditional distribution of given . Let index the vectors in by the set . For each , define a mapping (14) randomly as follows. For each , send it through a discrete memoryless channel with transition probability by using the channel times. If the received vector is not in , then ; otherwise, is equal to the index of the received vector in defined above.

4. The encoding function is , where .

The decoding scheme is described in the following steps:

1) For each , define a mapping as follows. Let be the vector in with index (cf. step 3 in the encoding scheme). For any , if there exists a unique set of indices such that , then ; otherwise .

2) For each , define a mapping by if , otherwise , where is an arbitrary constant vector in (cf. step 2 of the encoding scheme).

3) The decoding function is defined as , where is defined as follows. For each , .

We now analyze the performance of this random code. Let be any positive quantity. First, , since if for some , then ; otherwise we have , by taking (15) and (16) sufficiently small. Therefore, (17), by (10). We now introduce the following notations: . Define the events : is the event that is -typical with respect to for all , and is the event that Decoder decodes correctly the index assigned to . By the weak law of large numbers, we see that for sufficiently large , for some (18).
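Much of the scheme above is bookkeeping over indexed codebooks: a set of generated vectors is indexed, an encoder maps a vector to its index (with a default value when the vector falls outside the set), and a decoder looks the index back up, falling back to an arbitrary constant vector when necessary. A toy version of that bookkeeping, using hypothetical names of my own rather than the paper's lost notation:

    import random

    def generate_codebook(num_codewords, length, seed=0):
        """Independently generate binary codewords and index them 1..num_codewords,
        mirroring the i.i.d. generation and indexing in steps 1-2 of the encoding scheme."""
        rng = random.Random(seed)
        return {i + 1: tuple(rng.randint(0, 1) for _ in range(length))
                for i in range(num_codewords)}

    def encode_index(vector, codebook):
        """Map a vector to the index of a matching codeword; 0 is the default
        value used when the vector is not in the indexed set."""
        for index, codeword in codebook.items():
            if codeword == vector:
                return index
        return 0

    def decode_index(index, codebook, default=(0, 0, 0, 0)):
        """Recover the codeword carrying a given index (cf. decoding step 1),
        falling back to a constant vector as in decoding step 2."""
        return codebook.get(index, default)

    codebook = generate_codebook(num_codewords=8, length=4)
    idx = encode_index(codebook[3], codebook)
    print(idx, decode_index(idx, codebook))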

Let be the joint distribution of . By (6) and (7), we have (19). Since is drawn i.i.d. according to the above distribution, we see from the weak law of large numbers that for a sufficiently large , (20). By noting that implies , we see that for any . Further, since are independent events, we have (21), (22). Now for any sufficiently large , we have (23), where the last inequality follows because is always a proper subset of . Then (24) for some . Note that the last equality follows from (14) because , given . Therefore, (25), (26).

The discussion that follows is a variation of the Channel Coding Theorem. We claim that can be obtained by sending through a discrete memoryless channel with transition probability , where is the conditional distribution of given . To see that the claim is true, consider (27) at the top of the following page. Note that in (27), . Assuming without loss of generality that all , (28) for (29), and therefore its reciprocal is well-defined.

We now bound the probability in (23). By the symmetry of the problem, we assume without loss of generality that for . To simplify notation, define the sets , where . Note that is a partition of , and iff . Given , (30). To simplify notation, we assume and denote by ; then (31).

We now prove an identity by considering the information diagram [8] for the random variables , , in Fig. 2. By (8), we see from the diagram that (32), (33).

(27)

(34). By (6), are independent, so (35). From the diagram, we see that this implies (36) (cf. the definition of the mutual information among more than two random variables in [8]). We mark each of these atoms with measure zero by an asterisk. Then we see immediately

that (37). From (30), (31), and (37), we have (38), and (39) when is small enough and is sufficiently large. Finally, from (23) and (39), we obtain (40). Hence, from (17) and (40), is admissible, which implies .

Fig. 2. The information diagram for (Y_j, j ∈ F_m \ Ψ), (Y_j, j ∈ F_m ∩ Ψ), and (Z_l, l ∈ V_m).

III. OUTER BOUND

Let be the set of all such that there exists satisfying the following conditions:

(41) for
(42) for
(43) for
(44) for
(45)

Theorem 2: .

We note that the definition of (recall that ) is very similar to the definition of except that 1) is replaced by , and 2) the inequality in (4) is strict while the inequality in (44) is nonstrict. It is clear that is a subset of . As a consequence of the above discrepancies, there is a gap between and . It is not apparent that the gap between the two regions has zero measure in general.

Proof of Theorem 2: Let be admissible. Then for any , there exists for sufficiently large an code such that (46), (47). We first consider such a code for a fixed . Now for , by Lemma 1, (48). In the above, follows because is a function of , is the vector version of Fano's inequality, where denotes the binary entropy function, and follows from (47). From (46), we also have (49). Thus for this code, we have

(50) for
(51) for
(52) for
(53) for
(54)

The inequality in (53) is in fact an equality, which follows from the i.i.d. assumption of the source. We note the one-to-one correspondence between (50)-(54) and (41)-(45). By letting , we see that there exists such that

(55) for
(56) for
(57) for
(58) for
(59)

It was shown in the recent work of Zhang and Yeung [11] that is a convex cone. Therefore, if , then .
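The step invoking the vector version of Fano's inequality rests on the standard bound relating conditional entropy to error probability. For reference, in generic notation rather than the paper's, the single-letter form and the blocked form read as follows (with h_b the binary entropy function and P_e the average error probability):

    H(X | \hat{X}) \le h_b(P_e) + P_e \log(|\mathcal{X}| - 1)

    H(X^n | \hat{X}^n) \le \sum_{i=1}^{n} H(X_i | \hat{X}_i)
                       \le n \bigl( h_b(P_e) + P_e \log(|\mathcal{X}| - 1) \bigr)

The second line follows by applying the single-letter bound to each symbol and using the concavity of h_b; it is presumably the form behind the step labeled as the vector version of Fano's inequality in the proof above.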

Dividing (55)-(59) by and replacing by , we see that there exists such that

(60) for
(61) for
(62) for
(63) for
(64)

We then let to conclude that there exists which satisfies (41)-(45), completing the proof.

IV. GEOMETRICAL INTERPRETATIONS OF AND

In Sections II and III, and are specified in a way which facilitates analysis. In this section, we will present geometrical interpretations of these regions which give further insight. For a set , define for some , where we write to mean greater than or equal to componentwise. For a set , define to be the projection of the set on the coordinates . Define the sets (65) for (66) for (67) for (68). Note that the independence relation in (1) is interpreted as the hyperplane in . Similarly, the Markov conditions and the functional dependencies in (2) and (3) are interpreted as the hyperplanes , respectively. Then we see that (69). Similarly, we see that (70). The bounds and are specified in terms of and , respectively. So far, there exists no full characterization of either or [11], [12]. Therefore, although these bounds are in single-letter form, they cannot be evaluated explicitly. Nevertheless, in view of the geometrical interpretation of and , one can easily obtain an outer bound on which can be evaluated explicitly. This will be described in the next section.

V. THE LP BOUND

Following [10], we define to be the set of (recall that ) such that for any nonempty , (71), (72), (73), (74). These inequalities are referred to as the basic inequalities, which are linear inequalities in . It is easy to see that . Since is a closed set, . Therefore, upon replacing in the definition of by , we immediately obtain an outer bound on . We call this outer bound the LP bound (for linear programming bound) and denote it by . Thus (75). The geometrical interpretation of in the last section can also be applied to , so it is not difficult to see that can be evaluated explicitly.

The multilevel diversity coding problem (with independent data streams) studied in [9] is a special case of the problem we study in the current paper. For such a problem with levels, there are sources , whose importance decreases in the order . Each encoder has access to all the sources. Further, Decoder belongs to Level , where , and it reconstructs , the most important sources. The problem is specified by .

So far, the coding rate region of the problem we study in this paper has been determined for certain special cases [5], [7], [9], [13], and these are all special cases of multilevel diversity coding. The coding rate region for each of these cases has the form (76), where is some subset of , and are some nonnegative functions depending only on . Further, the necessity of the coding rate region (i.e., the converse coding theorem) for these cases can be proved by invoking the basic inequalities. We claim that the LP bound is tight for the special cases for which the coding rate region has the form in (76) and the converse coding theorem is a consequence of the basic inequalities.
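The basic inequalities that carve out this polyhedral region are the nonnegativity of all conditional entropies and conditional mutual informations; in elemental form they are finite in number and linear in the 2^N - 1 joint-entropy coordinates. The sketch below enumerates that elemental form for N variables in generic Python (my own indexing, not the paper's); a constraint set of exactly this kind is what a linear program over the LP bound would feed to a solver.

    from itertools import combinations

    def elemental_inequalities(n):
        """Elemental Shannon inequalities for n random variables.

        Each inequality is a dict mapping a nonempty subset (frozenset of
        variable indices) to its coefficient; the constraint says that the
        corresponding linear combination of joint entropies is >= 0.
        Two families: H(X_i | all others) >= 0, and I(X_i; X_j | X_K) >= 0.
        """
        ground = frozenset(range(n))
        ineqs = []
        # H(X_i | rest) = H(all) - H(rest) >= 0
        for i in range(n):
            ineq = {ground: 1.0}
            rest = ground - {i}
            if rest:
                ineq[rest] = -1.0
            ineqs.append(ineq)
        # I(X_i; X_j | X_K) = H(iK) + H(jK) - H(ijK) - H(K) >= 0
        for i, j in combinations(range(n), 2):
            others = sorted(ground - {i, j})
            for r in range(len(others) + 1):
                for K in combinations(others, r):
                    K = frozenset(K)
                    ineq = {}
                    for s, c in ((K | {i}, 1.0), (K | {j}, 1.0),
                                 (K | {i, j}, -1.0), (K, -1.0)):
                        if s:  # H(empty set) = 0, so drop that term
                            fs = frozenset(s)
                            ineq[fs] = ineq.get(fs, 0.0) + c
                    ineqs.append(ineq)
        return ineqs

    print(len(elemental_inequalities(3)))  # 9 elemental inequalities for n = 3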

Among these is the -encoder symmetrical multilevel diversity coding (SMDC) problem recently studied by the authors [13]. In the rest of the section, we will prove the tightness of the LP bound for the SMDC problem; the techniques involved in proving the other cases are exactly the same. We further conjecture that the LP bound is tight for any special case as long as the converse coding theorem is a consequence of the basic inequalities, but we cannot prove it. In the following we will use to denote the th component of a vector .

In the SMDC problem: 1) ; 2) such that ; 3) for . Then (78). With the above specifications, we implicitly have . From [13], the coding rate region for the SMDC problem is given by . Comparing with (76), we see that for the SMDC problem . Note that the definition of implies for . This can be seen by letting be the -vector whose components are equal to except that the th component is equal to , so that is lower-bounded by .

Lemma 2: . Proof: Trivial.

In the proof for the necessity of in [13], the authors consider a code with blocklength . Careful examination of the proof for the zero-error case reveals that the same region can be obtained by considering a code with . (This is due to the assumption that the sources are i.i.d.) In the following, we use to denote the collection of random variables such that . Basically, it is proved in [13] that

Proposition 1: For any discrete random variables , if 1) and 2) such that , then . Note that in the above proposition, the quantities are interpreted as constants.

Let us now state another proposition. Proposition 2: For any discrete random variables , if (80).

Lemma 3: Propositions 1 and 2 are equivalent. Proof: We first show that Proposition 1 implies Proposition 2. Proposition 1 says that 1) and 2) imply (77). Together with 3), we have , since , which proves Proposition 2. We now show that Proposition 2 implies Proposition 1. Suppose 1) and 2) are satisfied. We let for , so that 3) is satisfied with equality. Then by Proposition 2, we have . Thus we see that 1) and 2) imply (77), which proves Proposition 1.

The constraints 1), 2), and 3) in Proposition 2 correspond to the sets , , and , respectively, defined in Section IV, which are subsets of . Since the Proof of Proposition 1 in [13] involves only the basic inequalities (i.e., Proposition 1, and hence Proposition 2, are implied by the basic inequalities), using the results in [10], we see that (79). So (77).
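Rate regions of the form in (76), a finite list of constraints each lower-bounding the total rate of some subset of encoders, are straightforward to test membership in. The sketch below is a generic illustration with hypothetical constants; it is not the SMDC region itself, whose specific bounds were lost in transcription.

    def in_rate_region(rates, constraints):
        """Check whether a rate tuple lies in a region of the form in (76).

        `rates` maps encoder index -> rate; `constraints` is a list of pairs
        (subset_of_encoders, lower_bound), one per inequality of the region.
        """
        return all(sum(rates[l] for l in subset) >= bound
                   for subset, bound in constraints)

    # Hypothetical 3-encoder example: every single rate at least 1, every pair at least 3.
    constraints = [({1}, 1.0), ({2}, 1.0), ({3}, 1.0),
                   ({1, 2}, 3.0), ({1, 3}, 3.0), ({2, 3}, 3.0)]
    print(in_rate_region({1: 1.5, 2: 1.5, 3: 1.5}, constraints))  # True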

By projecting onto the coordinates , we have . Then , where the last step follows from Lemma 2. Since is an outer bound on , we have , which implies . Thus is tight. Compared with , it has the advantage that it is more explicit and can easily be evaluated.

VI. CONCLUSION

In this paper, we introduce a new multiterminal source coding problem called the distributed source coding problem. Our model is more general than all previously reported models on multilevel diversity coding [5], [7], [9], [13]. We mention that an even more general model has been formulated in [1]. We have obtained an inner bound and an outer bound on the coding rate region. Both are implicit in the sense that they are specified in terms of and , respectively, which are fundamental regions in the entropy space yet to be determined. Our work is an application of the results in [10] to information theory problems.

We have also obtained an explicit outer bound for the coding rate region. We have shown that this bound is tight for a class of special cases, including all those for which the coding rate region is known. The bound would be tight if , but this has recently been disproved by the authors [11], [12]. Nevertheless, we believe that it is tight for most cases. A problem for future research is to determine the conditions for which it is tight.

We point out that the random code we have constructed in Section III has an arbitrarily small probability of error which does not go away even if we are allowed to compress the sources first by variable-rate codes. This characteristic is undesirable for many applications. A challenging problem for future research is to construct simple zero-error algebraic codes for distributed source coding.

ACKNOWLEDGMENT

The authors wish to thank Reviewer A for the detailed comments on the manuscript.

REFERENCES

[1] R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, "Network information flow theory," IEEE Trans. Inform. Theory, to be published.
[2] T. Berger, "Multiterminal source coding," in The Information Theory Approach to Communications, G. Longo, Ed. New York: Springer-Verlag, 1977.
[3] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.
[4] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic, 1981.
[5] K. P. Hau, "Multilevel diversity coding with independent data streams," M.Phil. thesis, The Chinese University of Hong Kong, May 1995.
[6] F. Matúš and M. Studený, "Conditional independences among four random variables I," Combin., Probab. Comput., vol. 4, 1995.
[7] J. R. Roche, R. W. Yeung, and K. P. Hau, "Symmetrical multilevel diversity coding," IEEE Trans. Inform. Theory, vol. 43, May 1997.
[8] R. W. Yeung, "A new outlook on Shannon's information measures," IEEE Trans. Inform. Theory, vol. 37, May 1991.
[9] R. W. Yeung, "Multilevel diversity coding with distortion," IEEE Trans. Inform. Theory, vol. 41, May 1995.
[10] R. W. Yeung, "A framework for linear information inequalities," IEEE Trans. Inform. Theory, vol. 43, pp. 1924-1934, Nov. 1997.
[11] Z. Zhang and R. W. Yeung, "A non-Shannon-type conditional inequality of information quantities," IEEE Trans. Inform. Theory, vol. 43, Nov. 1997.
[12] Z. Zhang and R. W. Yeung, "On characterization of entropy function via information inequalities," IEEE Trans. Inform. Theory, vol. 44, pp. 1440-1452, July 1998.
[13] R. W. Yeung and Z. Zhang, "On symmetrical multilevel diversity coding," IEEE Trans. Inform. Theory, vol. 45, Mar. 1999.
