On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE


3284 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 7, JULY 2009

On Lossless Coding With Coded Side Information
Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE

Abstract—This paper considers the problem, first introduced by Ahlswede and Körner in 1975, of lossless source coding with coded side information. Specifically, let X and Y be two random variables such that X is desired losslessly at the decoder while Y serves as side information. The random variables are encoded independently, and both descriptions are used by the decoder to reconstruct X. Ahlswede and Körner describe the achievable rate region in terms of an auxiliary random variable. This paper gives a partial solution for an optimal auxiliary random variable, thereby describing part of the rate region explicitly in terms of the distribution of X and Y.

Index Terms—Auxiliary random variables, coded side information, common information, lossless coding, rate region.

Fig. 1. Random variables X and Y are independently encoded and jointly decoded. The decoder wishes to reconstruct almost losslessly only X.

I. INTRODUCTION

In 1975, Ahlswede and Körner [1] introduced the following coding problem (see Fig. 1). Random variables X and Y are independently encoded and jointly decoded. The decoder wishes to reconstruct almost losslessly only X, so the description of Y serves as side information. Letting R_X and R_Y denote the rates used to encode X and Y, respectively, the question becomes: what rate pairs (R_X, R_Y) are achievable? The answer is given in terms of an auxiliary random variable in [1]. Specifically, X can be reconstructed with arbitrarily small probability of error if and only if R_X ≥ H(X|U) and R_Y ≥ I(Y;U) for some random variable U such that X, Y, U form a Markov chain (U depends on X only through Y) and |U| ≤ |Y| + 2, where |X| and |Y| are the alphabet sizes of X and Y, respectively. The bound on |U| is tightened to |U| ≤ |Y| for points on the lower boundary of the rate region in [2]. The intuition behind this solution is quite simple. Random variable U can be thought of as the encoded version of Y; thus R_Y ≥ I(Y;U). Since the useful part of Y is then known to the decoder, the description of X requires rate R_X ≥ H(X|U).
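To make the characterization concrete, the following sketch (our own illustration, not code from the paper; the function name and numpy representation are assumptions) computes the rate pair (H(X|U), I(Y;U)) attained by a given joint pmf and a given test channel p(u|y):

```python
import numpy as np

def ak_rate_pair(p_xy, p_u_given_y):
    """Given a joint pmf p_xy[x, y] and a test channel p_u_given_y[u, y],
    return the Ahlswede-Korner rate pair (H(X|U), I(Y;U)) achieved by the
    auxiliary variable U.  The Markov chain X - Y - U holds by construction,
    since U is generated from Y alone."""
    p_y = p_xy.sum(axis=0)                       # marginal of Y
    p_uy = p_u_given_y * p_y                     # joint pmf of (U, Y)
    p_u = p_uy.sum(axis=1)                       # marginal of U
    p_xu = p_xy @ p_u_given_y.T                  # p(x,u) = sum_y p(x,y) p(u|y)

    def H(p):                                    # entropy with 0 log 0 = 0
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    H_X_given_U = H(p_xu.ravel()) - H(p_u)       # H(X,U) - H(U)
    I_Y_U = H(p_y) + H(p_u) - H(p_uy.ravel())    # H(Y) + H(U) - H(Y,U)
    return H_X_given_U, I_Y_U
```

With U = Y (identity channel) this returns (H(X|Y), H(Y)); with U constant it returns (H(X), 0), the two corner points discussed in Section III.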
The Markov condition is quite straightforward, and the original bound on the alphabet size of U derives from Carathéodory's theorem.

Manuscript received December 26, 2006; revised January 04. Current version published June 24, 2009. This work was supported by the Center for the Mathematics of Information at the California Institute of Technology. The material in this paper was presented in part at the IEEE Information Theory Workshop, Punta del Este, Uruguay, March 2006. D. Marco was with the Department of Electrical Engineering, California Institute of Technology, Pasadena, CA USA (e-mail: danielmarco@gmail.com). M. Effros is with the Department of Electrical Engineering, California Institute of Technology, Pasadena, CA USA (e-mail: effros@caltech.edu). Communicated by W. Szpankowski, Associate Editor for Source Coding. Digital Object Identifier /TIT

Fig. 2. The relationship between the entropies and mutual information of random variables X and Y.

The above method for describing a rate region in terms of auxiliary random variables is rather common; see, for example, [3]-[7]. These characterizations give great intuition into the basic form of a solution. Unfortunately, precise calculation of the rate region requires solving for the optimal auxiliary random variable, which is surprisingly difficult, even for simple sources [2]. It is possible to use this characterization to approximate the rate region to within a given multiplicative factor in polynomial time [8]. Unfortunately, numerical solutions of this type fail to provide much insight into basic questions of theoretical interest. For example, is the point (R_X, R_Y) = (H(X|Y), I(X;Y)) always in the achievable rate region? Is it ever in the achievable rate region? Does achieving R_X = H(X|Y) ever require R_Y = H(Y)? Furthermore, no intuition is provided as to how one should go about designing optimal auxiliary random variables. Ideally, we would like an explicit description comparable to the one given by Slepian and Wolf [9] for their famous problem.
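The numerical approach mentioned above can be illustrated crudely by randomly sampling test channels p(u|y) and recording the rate pairs they induce. This is our own brute-force sketch for intuition only, not the polynomial-time approximation algorithm of the paper's reference [8]; all names are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rate_pairs(p_xy, n_u, trials=200):
    """Sample random test channels p(u|y) with |U| = n_u and record the
    rate pairs (H(X|U), I(Y;U)) they achieve.  The lower envelope of the
    resulting cloud approximates the boundary of the achievable region."""
    def H(p):
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    p_y = p_xy.sum(axis=1 - 1 + 0) if False else p_xy.sum(axis=0)  # marginal of Y
    pairs = []
    for _ in range(trials):
        # each column of ch is a conditional pmf p(.|y) on the n_u symbols of U
        ch = rng.dirichlet(np.ones(n_u), size=p_xy.shape[1]).T
        p_uy = ch * p_y
        p_u = p_uy.sum(axis=1)
        p_xu = p_xy @ ch.T
        rx = H(p_xu.ravel()) - H(p_u)                 # H(X|U)
        ry = H(p_y) + H(p_u) - H(p_uy.ravel())        # I(Y;U)
        pairs.append((rx, ry))
    return pairs
```

Every sampled pair must satisfy the cut-set bound R_X + R_Y ≥ H(X), since I(Y;U) ≥ I(X;U) by the Data Processing Inequality.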
In this paper, we give a partial solution for an optimal auxiliary random variable in Ahlswede and Körner's coding with side information problem. Thus, we describe part of the achievable rate region explicitly in terms of the distribution of X and the conditional distribution of Y given X. As a byproduct of this effort, we are able to provide answers to some of our fundamental questions regarding the relationships between random variables. For example, it is tempting to interpret the Venn diagram of [10, p. 20] (reproduced in Fig. 2) to mean that describing the information that Y holds about X at rate I(X;Y) and describing the remaining uncertainty about X at rate H(X|Y) should always suffice for a complete description of X.

Fig. 3. The three types of components are illustrated: (a) DC, (b) MDC, (c) ZIC. A number next to a line represents the transition probability. A line with no number represents a positive transition probability p(y|x).

This turns out not to be the case. In fact, we show that there exist simple examples where I(X;Y) is arbitrarily small and H(Y) is arbitrarily large, and yet in order to make full use of the information that Y holds about X, one needs to fully describe Y, giving R_Y = H(Y). This shows that, in general, the information contained in Y about X cannot be separated from the other information that Y contains. In the process of deriving the partial solution for the coding with side information problem we define two functionals of the joint distribution of X and Y. One of these functionals turns out to equal the common information defined by Gács and Körner [11]. See Section V for more details.

The remainder of this paper is organized as follows. Section II introduces notation and definitions. Section III provides the main results, namely, a partial explicit description of the achievable rate region for which the structure of optimal auxiliary random variables is found. Section IV provides additional results that are useful for constructing optimal auxiliary random variables. Additionally, it outlines open questions that need to be resolved in order to obtain a complete explicit solution. In Section V, a connection is made between a functional defined in Section III and common information. Section VI offers concluding remarks. Finally, Appendix A contains certain lemmas and proofs.

II. NOTATION AND DEFINITIONS

Let X, Y, U denote discrete random variables with finite alphabets. Let X_i, Y_i, U_i denote subsets of the possible outcomes of X, Y, U, respectively. The index i allows us to distinguish between distinct (but possibly overlapping) subsets. Pairs (X_i, Y_i) and triples (X_i, Y_i, U_i) are called components.
The functions p(x), p(y), p(x, y), p(x|y), and p(y|x) are the naturally defined marginal and conditional probabilities on these alphabets; quantities such as p(X ∈ X_i) and p(x | Y ∈ Y_i) are similarly defined. We use the convention 0 log 0 = 0. Next, we provide three definitions, which are key in the derivations that follow.

Definition 1: (X_i, Y_i) is a disjoint component (DC) if p(x, y) = 0 whenever exactly one of x ∈ X_i and y ∈ Y_i holds.

Definition 2: (X_i, Y_i) is a minimal disjoint component (MDC) if it is a DC that contains no DCs other than itself.

Definition 3: (Y_i, X_i) is a zero information component (ZIC) if p(x|y) is the same for every y ∈ Y_i. We call |Y_i| the size of the ZIC.

Fig. 3 illustrates DCs, MDCs, and ZICs. Note that in order to check that a component is a ZIC, one needs to translate from the transition probabilities p(y|x) to p(x|y). Note further that unlike DCs and MDCs, ZICs are not symmetrical. Specifically, that (Y_i, X_i) is a ZIC does not imply that (X_i, Y_i) is a ZIC. (In fact, if (Y_i, X_i) is a ZIC, then (X_i, Y_i) is a ZIC if and only if the component is an MDC.) MDCs are useful because they allow us to break a large problem into smaller subproblems. The importance of ZICs stems from the fact that for a ZIC, say (Y_i, X_i), knowing that Y ∈ Y_i specifies a conditional distribution on X that is unchanged by the knowledge of which y ∈ Y_i has occurred; that is, p(x|y) = p(x | Y ∈ Y_i) for all y ∈ Y_i. (Note that this does not imply that the resulting conditional distribution is uniform, which ordinarily is not the case.) Two properties of MDCs and ZICs are useful to the ensuing discussion. First, every joint distribution imposes a unique decomposition of its support into MDCs; distinct MDCs are disjoint, and together they partition the support. Second, an MDC can be uniquely decomposed into largest ZICs. Specifically, each component in that decomposition is a ZIC, and there does not exist a ZIC that strictly contains it. This can be shown by identifying ZICs and enlarging them as much as possible. We proceed with two more definitions.

Definition 4: Let the components be a decomposition into DCs or into largest ZICs. We say that random variable U satisfies the decomposition property if there exists a partition

of the alphabet of U such that each partition set is associated with a single component.

Definition 5: Suppose that the given components form a decomposition into DCs and that U satisfies the decomposition property. Then we define the restrictions of X, Y, U to the ith component and call them component random variables. (Note that the restricted measures need not sum to one, so component random variables need not be random variables.)

It is convenient to define the operations of mutual information, entropy, and conditional entropy for component random variables. These operations are defined analogously to their standard definitions. Because the components are disjoint, entropies and mutual informations decompose additively across components, and the Data Processing Inequality holds for component random variables as well. Finally, we define R_{Y,i} to be the rate designated by the encoder for the ith DC, so that R_Y is the sum of the R_{Y,i}.

III. RESULTS

We focus on identifying key points in the achievable rate region. The point¹ (R_X, R_Y) = (H(X), 0) is clearly in the achievable rate region. Likewise, (H(X|Y), H(Y)) is achievable. It is the auxiliary random variables U = constant and U = Y, respectively, that attain these points. The straight line connecting these two points is an upper bound to the lower convex hull of the achievable rate region, as immediately follows from a time-sharing argument. A more interesting question raised in Section I is whether one can operate at rate R_X = H(X|Y) while maintaining R_Y < H(Y). As noted, and as is shown below, the answer is sometimes yes.

We define R*_Y to be the minimal rate R_Y for which R_X = H(X|Y) is achievable and note that R*_Y ≤ H(Y). Theorem 2 provides a formula for computing R*_Y. Corollary 3 and Theorem 4 give necessary and sufficient conditions under which R*_Y takes its largest and smallest possible values, respectively. The following lemma shows that when R_X = H(X|Y), the auxiliary random variable must satisfy the decomposition property. This is needed in the proofs of Theorems 2 and 4.

Lemma 1: Let the components be either the unique decomposition into largest ZICs or the decomposition into MDCs. Any auxiliary random variable U for which H(X|U) = H(X|Y) must satisfy the decomposition property.

Proof: We show the lemma by showing the contrapositive. Namely, we show that if there exists a symbol u and symbols y, y' from distinct partition sets, respectively, for which p(u|y) and p(u|y') are both positive, then H(X|U) > H(X|Y). To do so, we construct an auxiliary random variable and show that the conditional entropy strictly increases. For each symbol we set the conditional probability to the probability of the corresponding symbol under the conditional distribution of Y given U. (Notice that there may be symbols for which the probability is zero; however, the key terms are guaranteed to be positive by the construction since p(u|y) and p(u|y') are positive.) Since X, Y, U form a Markov chain, the Data Processing Inequality implies that H(X|U) ≥ H(X|Y). Thus, it suffices to show that the inequality is strict by the construction. Next, we consider two cases. The first case is that the decomposition is into largest ZICs. Notice that in this case there exists x for which p(x|y) and p(x|y') differ (otherwise equality for all x would imply that we could form a larger ZIC by combining the two sets, which would give a contradiction). Therefore

¹Note that Fig. 4 draws R_X on the vertical axis and R_Y on the horizontal axis. We therefore report rate points as (R_X, R_Y) for consistency.
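The component structure defined in Section II is easy to compute directly from a joint pmf. The sketch below (our own implementation; function names and the matrix representation are assumptions) finds the MDC decomposition as the connected components of the bipartite support graph and checks the ZIC condition, namely that p(x|y) is identical for every y in a candidate set:

```python
import numpy as np

def mdc_decomposition(p_xy):
    """Label the MDCs of the support of p_xy: connected components of the
    bipartite graph with an edge (x, y) whenever p(x, y) > 0.  Returns
    component labels for the x-symbols and the y-symbols."""
    nx, ny = p_xy.shape
    comp_x, comp_y = [-1] * nx, [-1] * ny
    c = 0
    for x0 in range(nx):
        if comp_x[x0] >= 0 or p_xy[x0].sum() == 0:
            continue
        comp_x[x0] = c
        stack = [("x", x0)]
        while stack:                         # depth-first search over the support
            side, i = stack.pop()
            if side == "x":
                for y in range(ny):
                    if p_xy[i, y] > 0 and comp_y[y] < 0:
                        comp_y[y] = c
                        stack.append(("y", y))
            else:
                for x in range(nx):
                    if p_xy[x, i] > 0 and comp_x[x] < 0:
                        comp_x[x] = c
                        stack.append(("x", x))
        c += 1
    return comp_x, comp_y

def is_zic(p_xy, ys):
    """Check whether the y-symbols in `ys` form a zero information component:
    the posterior p(.|y) is the same for every y in ys."""
    rows = [p_xy[:, y] / p_xy[:, y].sum() for y in ys]
    return all(np.allclose(rows[0], r) for r in rows[1:])
```

Note the translation emphasized in Section II: the ZIC test is phrased in terms of p(x|y), not the transition probabilities p(y|x).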

to obtain the desired inequality we let y be an arbitrary element of the relevant set, where the strict inequality follows from the strict concavity of the entropy function, the fact that the two mixing weights are both nonzero, and the previous observation that p(x|y) and p(x|y') differ; the next step follows since the sets partition the alphabet of Y; and the following step derives from the fact that p(x|y) is the same for all y in a given set, by definition of a ZIC. Thus H(X|U) > H(X|Y) when the decomposition is into largest ZICs.

When the decomposition is into MDCs, there exists x such that p(x|y) is positive while p(x|y') is zero. Therefore a similar chain of steps applies, where the first step derives from the disjointness of the components; the second follows since X, Y, U form a Markov chain; the next steps follow since the component is a ZIC, and thus the conditional distribution of X is independent of which y in the component has occurred; this also implies, again by the ZIC property, that p(x|y) = p(x | Y ∈ Y_i) for all y ∈ Y_i. The strict inequality again derives from the strict concavity of the entropy function, the fact that both mixing weights are nonzero, and the observation above; the remaining step holds trivially in one case and by the preceding identity otherwise. This completes the proof.

Theorem 2: Let the components be the unique decomposition into largest ZICs, and let U* indicate which largest ZIC contains Y. Then R*_Y = I(Y;U*) = H(U*).

Proof: We obtain R*_Y constructively. Let U = U* and set p(u|y) = 1 when y belongs to the ZIC indexed by u, and p(u|y) = 0 otherwise. We need to show that R_X = H(X|Y) is attained, and that R_Y can be no smaller when R_X = H(X|Y). The fact that I(Y;U) = H(U) follows since U is a deterministic function of Y. To show that U minimizes I(Y;U) given that H(X|U) = H(X|Y), recall that Lemma 1 shows that any auxiliary random variable for which H(X|U) = H(X|Y) has to satisfy the decomposition property. Let U' be such an auxiliary random variable and consider its induced decomposition. The componentwise steps follow since the components are DCs and thus the restrictions are component random variables, and from Lemma A1 of the Appendix, which shows that the relevant component quantity is extremized by exactly the assignment above, which is precisely how U is defined.

Theorem 2 enables us to improve the previous upper bound to the lower convex hull of the achievable rate region. Specifically,

the improved upper bound is the line connecting the rate points (H(X|Y), R*_Y) and (H(X), 0). Below are a direct corollary to Theorem 2 and a theorem that uses Theorem 2.

Corollary 3: R*_Y = H(Y) if and only if the joint distribution contains no ZICs of size greater than one.

Theorem 4: Let the components be the decomposition into MDCs. Then R*_Y takes its smallest possible value if and only if each MDC is a ZIC.

Proof: We begin by showing the "if" part. Let the decomposition into MDCs consist of MDCs that are also ZICs. We notice that the given decomposition is then also the decomposition into largest ZICs. Thus, applying Theorem 2, we obtain R*_Y = H(U*). Next, since the relevant quantities decompose across component random variables, it suffices to show the claim for each component. Letting y be an arbitrary element of Y_i, we have the required chain of equalities. The first step follows since an MDC that is a ZIC in one direction is also a ZIC in the other, and thus the conditional distribution is independent of the particular symbol. The second derives from the fact that p(x|y) = p(x | Y ∈ Y_i) for all y ∈ Y_i, which can be seen as follows. Let y be an arbitrary element of Y_i. Then the identity holds, where the third equality follows since the component is a ZIC. Finally, the last step is due to the component being an MDC.

Next, we assume R*_Y takes its smallest value and show that the support must decompose as given in the theorem statement. Let the components be the decomposition into MDCs. We wish to show that each MDC is a ZIC. First, observe that for an (optimal) auxiliary random variable that achieves R*_Y, the corresponding rate equality holds. Second, observe that if U is an optimal auxiliary random variable for which R_X = H(X|Y), then H(X|U) = H(X|Y), and thus Lemma 1 implies that U satisfies the decomposition property. Let U be such an optimal auxiliary random variable, let its induced decomposition into DCs be given, and let the corresponding component random variables be defined. The two observations above imply the componentwise identity. The last equation, together with Lemma A2 (Part B) of the Appendix, which shows that the inequality holds componentwise for all i, implies equality for all i. It then follows from the condition for equality in Lemma A2 (Part B) that each component is a ZIC, which implies via Lemma A2 (Part A) that the corresponding component equality also holds. Consequently, the components coincide with the MDCs. Thus, letting each component play its role in Lemma A2 (Part A), we have equality for all i, and this together with the last equation implies the corresponding identity for all i.
Thus, it follows from the condition for equality in Lemma A2 (Part A) that each component is a ZIC, which gives the desired result.

Corollary 3 and Theorem 4 give conditions under which R*_Y reaches its highest and lowest possible values, respectively. Corollary 3 demonstrates that when the joint distribution lacks the special structure required to form ZICs of size larger than one, we cannot transmit the useful part of Y without describing Y completely. The following is an example where I(X;Y) is arbitrarily small and yet R*_Y = H(Y). This shows that there are cases where Y contains very little information about X, and yet extracting this minuscule amount of information requires a complete description of Y.

Example 1: For all y, let the conditional probabilities be very close, but not equal, to one another across distinct y. Then, since X and Y are both approximately uniform, it follows that I(X;Y) is small. Since the posteriors p(.|y) are distinct, the pair induces no ZICs of size greater than one. Consequently, Corollary 3 implies that R*_Y = H(Y).

Fig. 4 illustrates the region for an example pair of random variables for which R*_Y < H(Y). It also shows the lower bound R_X + R_Y ≥ H(X), which follows from the source coding theorem. It is interesting to ask how much of this lower bound (beyond the obvious point (H(X), 0)) can actually be achieved and what auxiliary random variables achieve points on this lower bound. We define the quantity of interest to be the maximal value of R_Y for which this lower bound is achieved with equality. Thus, when it is zero, (H(X), 0) is the only achievable point on that lower bound. Theorem 5 characterizes it precisely.
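The condition in Corollary 3 amounts to asking whether any two side-information symbols share the same posterior. The helper below (our own sketch; the name and tolerance handling are assumptions) checks for such a mergeable pair, which is exactly what fails in Example-1-style distributions:

```python
import numpy as np

def has_mergeable_side_info(p_xy, tol=1e-9):
    """Return True iff some ZIC of size > 1 exists, i.e. iff two y-symbols
    y, y' share the same posterior p(.|y).  When no such pair exists,
    Corollary 3 says R*_Y = H(Y): the useful part of Y cannot be transmitted
    without describing Y completely."""
    posteriors = []
    for y in range(p_xy.shape[1]):
        col = p_xy[:, y]
        s = col.sum()
        if s > 0:
            posteriors.append(col / s)
    for i in range(len(posteriors)):
        for j in range(i + 1, len(posteriors)):
            if np.allclose(posteriors[i], posteriors[j], atol=tol):
                return True      # mergeable pair found
    return False                 # all posteriors distinct
```

In an Example-1-style matrix the posteriors are close but all distinct, so the function returns False and a full-rate description of Y is forced even though I(X;Y) is tiny.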

Fig. 4. The achievable rate region as known thus far.

Theorem 5: Let the components be the decomposition into MDCs, and let U indicate which MDC contains Y. Then the maximal value of R_Y for which the lower bound R_X + R_Y ≥ H(X) is achieved with equality is H(U).

Proof: The proof proceeds in two steps. First, we design an auxiliary random variable for which R_Y is as given in the theorem statement and show that the sum rate equals H(X). Second, we show that if R_Y exceeds that value, then the sum rate must be strictly greater than H(X). Let U indicate the MDC containing Y: set p(u|y) = 1 when y lies in the MDC indexed by u, and p(u|y) = 0 otherwise. Then U is a deterministic function of Y, which implies that the stated rates are attained. This completes the first step of the proof. We show next that for any auxiliary random variable, if R_Y exceeds the stated value, then the sum rate is strictly greater than H(X). Theorem 7 of Section IV shows that if the auxiliary random variable is optimal, then it must have the decomposition property. Let the component random variables be defined accordingly. We write the sum rate componentwise, where the inequality is our case assumption. Then Lemma A2 (Part A) implies that there exists a component that is not a ZIC. Lemma A2 (Part B) then implies a strict componentwise inequality. Using this and the observation that the Data Processing Inequality holds for component random variables gives the desired strict inequality. This completes the proof of the second step and of the theorem as a whole.

From Theorem 5 we find that this maximal R_Y is zero if and only if the support is a single MDC (i.e., in the construction of the proof U is constant; see Example 2 below) and, using Theorem 4, that the bound is achieved over the whole interesting range if and only if all MDCs are ZICs (see Example 3 below). The second observation follows from Theorem 4, since it implies that the corresponding corner point is achievable; the resulting chain of inequalities has its last two inequalities either both strict or both equalities. Notice that all MDCs are ZICs if and only if X and Y are conditionally independent given the auxiliary random variable U defined in Theorem 5. Essentially the same random variable was defined in the context of studying common information in [11], which shows [11, Corollary 1] that the resulting quantity (which equals H(U)) equals I(X;Y) if and only if the conditional independence mentioned above holds. We briefly discuss the relationship between common information and coding with side information in Section V. Examples 2 and 3 below illustrate these concepts. The first is the case where the sum-rate bound can only be achieved when R_Y = 0.
The second is the case where the bound is achievable for all interesting values of R_Y.

Example 2: The distribution p(y|x) is a binary-symmetric channel with crossover probability strictly between zero and one. In this case, there is only one MDC, and thus the maximal R_Y on the sum-rate bound is zero.

Example 3: The distribution is defined by binary-symmetric channels, each with crossover probability one half: the alphabets split into blocks, and within each block every transition probability equals one half. In this case, every MDC is a ZIC.

Corollary 6: Any point on the line connecting (H(X), 0) and the point guaranteed by Theorem 5 is an optimal achievable rate point.

Corollary 6 follows immediately from a time-sharing argument. Example 4 below shows that any point on that line can also be achieved via a direct construction.

Example 4 (Achieving Directly Any Point on the Line): Let the components be the decomposition into MDCs. We define an auxiliary random variable on an enlarged alphabet by an appropriate assignment of conditional probabilities. Let y be an arbitrary element of the relevant set. Then

Fig. 5. Construction of V from U, where B(u_1) = {1, 2}, B(u_2) = {1}, B(u_3) = {2, 3}, B(u_4) = {2}, B(u_5) = {3}, B(u_6) = {3}; v_1 and v_2 are generated from u_1, and v_3 and v_4 are generated from u_3.

where the first step follows since the assignment is the same for all symbols in a set, and both remaining steps follow since, for any fixed component, the relevant conditional probability is the same for all of its symbols. Thus there is an auxiliary random variable achieving the claimed rates. Fig. 5 illustrates this construction. Note that the constructed variable satisfies the decomposition property. Since the resulting rate is a continuous function of the design parameter that equals one endpoint value at one extreme and the other endpoint value at the other, for any intermediate value there exists a parameter choice attaining it, as follows from the Intermediate Value Theorem.

IV. THE REGION H(X|Y) < R_X < H(X)

The lower convex hull of the achievable rate region is still not known for H(X|Y) < R_X < H(X). The following theorem characterizes what we know so far about optimal auxiliary random variables in that region. The optimal auxiliary random variable used in Theorem 5 satisfies the decomposition property. Theorem 7 below shows that any optimal auxiliary random variable for any rate in this region must satisfy this property as well.

Theorem 7: Let the components be the decomposition into MDCs. Any optimal auxiliary random variable must satisfy the decomposition property.

Proof: We show the contrapositive. Namely, we show that if an auxiliary random variable U does not satisfy the decomposition property, then there exists an auxiliary random variable satisfying the decomposition property that achieves strictly better rates, thus implying that U is not optimal. We construct an auxiliary random variable V as an intermediate step. Essentially, V is constructed from U by duplicating each u that is connected to several components so that each of its copies is connected to a single component. More precisely, let B(u) denote the set of components to which u is connected and define V accordingly, where the strict inequality follows from the strict concavity of the entropy function, since U violates the decomposition property and therefore there exists some u for which B(u) has more than one element. The analogous identities hold for the other information quantities.
Thus the rates of V are no worse than those of U. We next build a variable W from V in a manner that maintains the decomposition property while trading one rate for the other to reach the target values. We construct W by constructing a component random variable for each MDC. We choose each component variable so that the designated componentwise rates are met, which gives

as desired. To prove that such a W exists, we must show that the given values are achievable for all i. The following argument shows that for each i we can design a component random variable to achieve any value between the two extremes. The upper bound is achieved when the component variable equals the corresponding component of Y with probability one. The lower bound is achieved when it is constant, by Lemma A1. Any value between these bounds can be achieved by a time-sharing argument. As a result, we can design component random variables to achieve any admissible allocation of rates; since the total is achievable by assumption, the allocation is achievable. For each i we now fix the allocation and assume that each component variable is optimal in the sense that it minimizes its contribution to one rate subject to the given constraint on the other. It remains to show that W achieves strictly better rates than U. For all i let g_i be the curve representing the lower convex hull of the achievable rate region for the ith MDC. Lemma A2 shows that for any component random variable the stated inequality holds, with equality if and only if the component is a ZIC. This implies that the curve g_i lies strictly above the line of slope -1 that originates at the corresponding corner point. Thus, for any allocation satisfying the constraints, inequality (1) holds by the convexity of each g_i. See Fig. 6 for an illustration. Thus the claimed chain of inequalities follows, where the first steps hold since each component variable is optimal while the corresponding restriction of U might not be, the next follows from the definition of g_i, and the last is obtained from (1). Finally, this implies the desired strict improvement.

Fig. 6. The lower convex hull of the ith MDC, denoted by g_i, and a representation of (1).

Lemma 8 below, whose proof is left to the Appendix, provides necessary conditions on the structure of an optimal auxiliary random variable. Theorem 9 builds on this result, showing that the problem of finding an optimal auxiliary random variable for an arbitrary random pair can be solved by collapsing every ZIC of size greater than one into a ZIC of size one in some new random pair, and then finding an optimal auxiliary random variable for the new pair; that auxiliary random variable can be easily transformed into an optimal auxiliary random variable for the original pair that achieves the same rates.
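The reduction previewed above can be sketched concretely: merge every group of y-symbols that share the same posterior p(.|y) into a single symbol. This is our own illustration of the collapsing step (the function name and representation are assumptions, not the paper's construction verbatim):

```python
import numpy as np

def collapse_zics(p_xy, tol=1e-9):
    """Collapse each ZIC of size > 1 to a single symbol: y-symbols with the
    same posterior p(.|y) are merged, and the merged symbol carries their
    total probability.  Returns the joint pmf of the reduced pair."""
    groups = []  # each entry: [shared posterior over x, accumulated p(Y in group)]
    for y in range(p_xy.shape[1]):
        col = p_xy[:, y]
        s = col.sum()
        if s == 0:
            continue
        post = col / s
        for g in groups:
            if np.allclose(g[0], post, atol=tol):
                g[1] += s            # same posterior: merge into this group
                break
        else:
            groups.append([post, s])
    # joint pmf of the reduced pair: p(x, y') = p(x | group y') * p(group y')
    return np.stack([g[0] * g[1] for g in groups], axis=1)
```

Because the merged symbols carried identical posteriors, the collapse preserves everything the decoder can learn about X from Y, which is why an optimal auxiliary random variable for the reduced pair lifts back to one for the original pair.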
Lemma 8: Let the components be a decomposition into ZICs. If U is an optimal auxiliary random variable, then p(u|y) must be the same for all y within a given ZIC, for every u.

The necessary conditions of Lemma 8 are not sufficient. It is not difficult to construct examples of suboptimal auxiliary random variables that satisfy Lemma 8.

Theorem 9: Given an arbitrary random pair with a decomposition into ZICs, construct another random pair by collapsing each ZIC to a single symbol, with the same marginal distribution on the ZIC indices. If an auxiliary random variable is optimal for the reduced pair at a given rate, then the auxiliary random variable whose conditional distribution assigns to each y the value assigned to the ZIC containing y is optimal for the original pair at the same rate.

Proof: We must show both that the lifted auxiliary random variable is optimal and that the two rates coincide or, equivalently, that the corresponding information quantities are equal. We begin by showing the first equality. For any

(2)

Here the first step follows since the two pairs have the same marginal distribution, and the next follows since p(u|y) is the same for all y within a ZIC. Thus the first equality holds. The resulting strict improvement would contradict the optimality of the given auxiliary random variable, and thereby gives the desired result. We construct the required auxiliary random variable as follows. Index its alphabet. Since the given auxiliary random variable is optimal, Lemma 8 implies that p(u|y) is the same for all y within each ZIC, for all u. Set the conditional probabilities accordingly. We show the required equality by noting

(3)

where the first step follows as shown below (3) and the second follows from the definition of the construction. By similar arguments the companion identity holds, where the last step derives from the fact that the marginals agree for all symbols. Thus the equality follows. Next, to prove the remaining identity, note that the corresponding terms are obtained in the same way that the terms in (2) are obtained. Since the two pairs have the same marginal distribution, the claimed implication holds. It remains to show that the lifted auxiliary random variable is optimal for the original pair. The proof is by contradiction. Suppose that it is not optimal. Then there exists an auxiliary random variable achieving strictly better rates. We use it to construct an auxiliary random variable for the reduced pair such that the same improvement holds, which contradicts optimality for the reduced pair and gives the desired result.

The remainder of this section focuses on the minimum alphabet size of optimal auxiliary random variables. The solution of the rate region provided in [1] bounds the alphabet size of the auxiliary random variable by the alphabet size of Y plus two (i.e., |U| ≤ |Y| + 2). In [2] it is shown that |U| ≤ |Y| suffices for optimal auxiliary random variables. Corollary 10 combines this tighter bound with Theorem 9 and shows that |U| need never exceed the number of largest ZICs imposed by the joint distribution. The bound is sometimes tight. This further reduces the space of possible optimal auxiliary random variables.

Corollary 10: Let the components be the decomposition into largest ZICs. For any achievable rate

Fig. 7. The rates of the encoders are R_1 = H(X) - R_0, R_0, and R_2 = H(Y) - R_0 in the case of C_GK(X;Y), and R_1, R_0, R_2 in the case of C_W(X;Y).

point, there exists an auxiliary random variable attaining it whose alphabet size is at most the number of largest ZICs.

Together, Theorems 7 and 9 and Corollary 10 significantly restrict the space of possible optimal auxiliary random variables by reducing the problem to that of finding an optimal auxiliary component random variable for an MDC that has no ZICs of size greater than one, whose alphabet size is no larger than the alphabet size of the component random variable.

V. COMMON INFORMATION

The notion of common information of two random variables has been addressed in [11]-[14], where various definitions have been proposed. In [11], Gács and Körner define common information via functions f and g for which f(X) = g(Y) with probability one and the number of values taken by f(X) (or g(Y)) with positive probability is as large as possible; the Gács-Körner common information, here denoted by C_GK(X;Y), is then given by H(f(X)). By [13, p. 404], C_GK(X;Y) equals the largest rate R_0 for which (H(X) - R_0, R_0, H(Y) - R_0) is an achievable rate triple for the network given in Fig. 7. Furthermore, C_GK(X;Y) ≤ I(X;Y) always holds [11], [13, p. 405]. In [12], Wyner defines common information, here denoted by C_W(X;Y), as the least rate R_0 for which there exist rates R_1 and R_2 summing with R_0 to H(X,Y) such that (R_1, R_0, R_2) is an achievable rate triple for the same network. Wyner shows that C_W(X;Y) equals the infimum of I(X,Y;W), where the infimum is taken over auxiliary random variables W that render X and Y conditionally independent. Wyner also shows that C_W(X;Y) ≥ I(X;Y).

While both definitions consider the rates associated with the scheme in Fig. 7, they answer different questions. Specifically, the Gács-Körner interpretation minimizes the sum rate into each decoder while carrying as much of the load as possible with the central encoder. (Achieving sum rates of H(X) and H(Y) into the decoders is trivial when the central encoder has rate zero, but more difficult when the central encoder is involved.) In contrast, Wyner's interpretation minimizes the sum rate out of the three encoders while carrying the minimal load at the central encoder. (Achieving sum rate H(X,Y) out of the three encoders is trivial when the side encoders carry the full load, but more difficult otherwise.) Thus, C_GK(X;Y) describes the maximal amount of shared information that is useful in describing both X and Y individually, while C_W(X;Y) describes the minimal amount of shared information needed to describe X and Y jointly.

It is interesting to note that the auxiliary random variable used to achieve the point of Theorem 5 meets the definition of the random variable in the definition of C_GK(X;Y). Thus, C_GK(X;Y) = 0 if and only if the support of the pair is a single MDC. Csiszár and Körner call such a distribution indecomposable [13, p. 403]. Further, C_GK(X;Y) = I(X;Y) if and only if all MDCs are ZICs or, equivalently, if and only if there exist functions f and g such that f(X) = g(Y) with probability one and X and Y are conditionally independent given f(X) (or g(Y)) [13, p. 405]. This might be a little surprising, since the middle encoder in Fig. 7 has access to both X and Y, while the side information encoder in Fig. 1 has access to only Y. This shows that the common information is indeed common in the sense that it can be fully extracted from either X or Y separately.

VI. CONCLUSION

This paper considers the problem of lossless source coding with coded side information. Specifically, X and Y are two random variables that are independently encoded and jointly decoded, and only X needs to be reconstructed (losslessly). The solution to this problem, namely the achievable rate region, is given in [1] in terms of an auxiliary random variable. In this paper, we obtain a partial solution for an optimal auxiliary random variable, thus providing part of the rate region explicitly in terms of the distribution of X and the conditional distribution of Y given X. An explicit solution of the rate region remains elusive for rates H(X|Y) < R_X < H(X). A solution in this region hinges on finding a construction for an optimal auxiliary random variable for a single MDC that is not a ZIC. We also show that the alphabet size for this optimal auxiliary random variable need not exceed the number of largest ZICs in the decomposition of this MDC.

APPENDIX

Lemma A1: Let the given component be a DC with corresponding component random variables. Then the stated inequality holds, with equality if and only if the component is a ZIC, for example, when the component contains a single y-symbol.

Proof:

where the inequality follows from the strict concavity of the entropy function and the fact that the mixing weights are nonzero. The inequality is satisfied with equality if and only if, for each x, p(x|y) is the same for all y for which the weight is positive. This is equivalent to saying that the component is a ZIC.

Lemma A2: Let the components be the decomposition into MDCs, and let the auxiliary random variable have the decomposition property. For each i let the component random variables be given. Then

A. the componentwise identity holds, with equality if and only if the component is a ZIC;
B. the componentwise inequality holds, with equality if and only if the component is a ZIC.

Proof: We begin with the proof of Part A. The chain of steps applies, where the inequality follows from the strict concavity of the entropy function and the fact that the mixing weights are nonzero. Since this holds for all symbols, equality holds if and only if the conditional distributions agree for all of them, which means that the component is a ZIC. Next, consider Part B. We prove Part B by proving the equivalent inequality, with equality if and only if the component is a ZIC. This generalizes the Data Processing Inequality to component random variables. Note that the chain of steps applies, where the inequality follows from the strict concavity of the entropy function and the fact that the mixing weights are nonzero. The inequality holds with equality if and only if, for each x, p(x|y) is the same for all y satisfying the accessibility condition. We call this condition (p(x|y) the same for all x and all y for which there exists a u accessible from both) Condition A. All that remains is to show that, since the component is an MDC, Condition A is satisfied if and only if the component is a ZIC. The forward direction follows immediately from the definition of a ZIC. We next show that, since the component is an MDC, Condition A implies that the component is a ZIC. The proof is by contradiction. Specifically, we suppose that the component is not a ZIC and show that this implies that it can be broken into smaller DCs, which gives a contradiction since it is an MDC. The following argument builds up the smaller DCs incrementally. If the component is not a ZIC, then there exist symbols whose posteriors differ. Let these be the first members of the two candidate sets, respectively, and initialize accordingly. Notice that the two sets are disjoint, since Condition A allows only for symbols that are not accessible from the same u. Next, we add to each set all members newly accessible from it. Notice that the corresponding conditionals agree within each set. This observation follows from Condition A, since for every newly added symbol there exists an earlier symbol in the same set from which it is accessible.
It further implies that , since for some . This process continues, iteratively adding to all that are newly accessible from , respectively, and then adding to all that are newly accessible from , respectively. At each iteration, Condition A guarantees that . For example, for any newly added to there exists an element already in for which , which implies that is constant for all in the newly enlarged set, and thus that are disjoint. Similarly, are disjoint. Further, since are finite and the sizes of are nondecreasing from one iteration to the next, the procedure converges. Since the resulting are DCs, we have the desired result.

Proof of Lemma 8: We show the contrapositive. Namely, we show that if there exists some such that for some , then is not optimal. To show that is not optimal, we construct an auxiliary random variable such that . Set for some ; that is, is the set of s that are connected to (notice that need not be empty). By assumption, there exists such that for some . For each , define ; represents the contribution to the probability of from the members of . We are now ready to define . The alphabet of is . For each , there is a corresponding , and we define for all . For each

there is a corresponding , and we define for all , and for all . Notice that satisfies the property given in the lemma with respect to .

We begin by proving that is a valid auxiliary random variable, namely, that for all . The result is immediate when , since for all . For we have If , then this holds trivially, since for all . If , then for any where the inequality follows since for all .

Next, we show that . For any and its corresponding , where the steps follow since is a ZIC, which implies that for all . Hence , as desired. It remains to show that or, equivalently, since , that . Recall that when , its corresponding is in , and for all . Likewise, when , its corresponding is in , and for all . Thus Finally Likewise, for any and its corresponding , for all , which implies that . Thus .

Next, we show that or, equivalently, since , we show that . It suffices to show that for any and its corresponding , for all . Consider an arbitrary . If , then , as needed. If , then Therefore, we wish to show that

where the steps follow because implies , since for all , and from the strict concavity of , since , for all , for some .

REFERENCES

[1] R. F. Ahlswede and J. Körner, "Source coding with side information and a converse for degraded broadcast channels," IEEE Trans. Inf. Theory, vol. IT-21, no. 6, Nov. 1975.
[2] W. Gu, R. Koetter, M. Effros, and T. Ho, "On source coding with coded side information for a binary source with binary side information," in Proc. IEEE Int. Symp. Information Theory (ISIT), Nice, France, Jun. 2007.
[3] A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, vol. IT-22, no. 1, pp. 1–10, Jan. 1976.
[4] T. M. Cover, A. El Gamal, and M. Salehi, "Multiple access channels with arbitrarily correlated sources," IEEE Trans. Inf. Theory, vol. IT-26, no. 6, Nov. 1980.
[5] T. M. Cover and C. S. K. Leung, "An achievable rate region for the multiple-access channel with feedback," IEEE Trans. Inf. Theory, vol. IT-27, no. 3, May 1981.
[6] F. M. J. Willems, "The feedback capacity region of a class of discrete memoryless multiple access channels," IEEE Trans. Inf. Theory, vol. IT-28, no. 1, Jan. 1982.
[7] R. Ahlswede and T. S. Han, "On source coding with side information via a multiple-access channel and related problems in multi-user information theory," IEEE Trans. Inf. Theory, vol. IT-29, no. 3, May 1983.
[8] W. Gu and M. Effros, "On approximating the rate region for source coding with coded side information," in Proc. IEEE Information Theory Workshop, Lake Tahoe, CA, Sep. 2007.
[9] D. Slepian and J. K. Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inf. Theory, vol. IT-19, no. 4, Jul. 1973.
[10] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.
[11] P. Gács and J. Körner, "Common information is far less than mutual information," Probl. Contr. Inf. Theory, vol. 2, 1973.
[12] A. D. Wyner, "The common information of two dependent random variables," IEEE Trans.
Inf. Theory, vol. IT-21, no. 2, Mar. 1975.
[13] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic, 1981.
[14] H. Yamamoto, "Coding theorems for Shannon's cipher system with correlated source outputs, and common information," IEEE Trans. Inf. Theory, vol. 40, no. 1, Jan. 1994.

Daniel Marco (S'02–M'04) received the B.Sc. degree in computer engineering from the Technion–Israel Institute of Technology, Haifa, Israel, in 1999, and the M.S. degree in electrical engineering, the M.S. degree in mathematics, and the Ph.D. degree in electrical engineering, all from the University of Michigan, Ann Arbor, in 2001, 2003, and 2004, respectively. From 2004 to 2006, he was with the California Institute of Technology, Pasadena, as a Postdoctoral Scholar. His research interests lie primarily in information theory, high- and low-resolution quantization theory, rate-distortion theory, coding with side information, and sensor networks.

Michelle Effros (S'93–M'95–SM'03–F'09) received the B.S. degree with distinction in 1989, the M.S. degree in 1990, and the Ph.D. degree in 1994, all in electrical engineering from Stanford University, Stanford, CA. During summers she worked at Hughes Aircraft Company. She joined the faculty at the California Institute of Technology, Pasadena, in 1994 and is currently a Professor of Electrical Engineering. Her research interests include information theory, network coding, data compression, communications, pattern recognition, and image processing.

Prof. Effros received Stanford's Frederick Emmons Terman Engineering Scholastic Award (for excellence in engineering) in 1989, the Hughes Masters Full-Study Fellowship in 1989, the National Science Foundation Graduate Fellowship in 1990, and the AT&T Ph.D.
Scholarship in 1993, the NSF CAREER Award in 1995, the Charles Lee Powell Foundation Award in 1997, the Richard Feynman–Hughes Fellowship in 1997, and an Okawa Research Grant in 2000; she was also cited by Technology Review as one of the world's top 100 young innovators. She is a member of Tau Beta Pi, Phi Beta Kappa, Sigma Xi, and the IEEE Information Theory, Signal Processing, and Communications societies. She served as the Editor of the IEEE Information Theory Society NEWSLETTER from 1995 to 1998, and as a Member of the Board of Governors of the IEEE Information Theory Society from 1998 to 2003 and again since 2008. She has served on the IEEE Signal Processing Society Image and Multi-Dimensional Signal Processing (IMDSP) Technical Committee. She was an Associate Editor for the joint special issue on Networking and Information Theory in the IEEE TRANSACTIONS ON INFORMATION THEORY and the IEEE/ACM TRANSACTIONS ON NETWORKING, and served as Associate Editor for Source Coding for the IEEE TRANSACTIONS ON INFORMATION THEORY from 2004 to 2007.
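The quantities manipulated throughout the paper — the side-information rate I(Y;U) and the source rate H(X|U) achieved by a test channel p(u|y), with X–Y–U a Markov chain — can be evaluated numerically for a toy source. The sketch below is illustrative only (the joint distribution and helper names are invented, not taken from the paper); it checks the two extreme points of the Ahlswede–Körner region, U = Y and U independent of Y.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array; zero entries are skipped."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A toy joint distribution p(x, y); rows index x, columns y.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
py = pxy.sum(axis=0)

def rate_pair(pu_given_y):
    """For a test channel p(u|y) (rows indexed by y), return (I(Y;U), H(X|U)),
    the side-information rate and source rate of the Ahlswede-Korner region.
    Because U is generated from Y alone, X - Y - U is a Markov chain."""
    pu = py @ pu_given_y                 # marginal p(u)
    pyu = py[:, None] * pu_given_y       # joint p(y, u)
    pxu = pxy @ pu_given_y               # joint p(x, u)
    i_yu = entropy(py) + entropy(pu) - entropy(pyu)
    h_x_given_u = entropy(pxu) - entropy(pu)
    return i_yu, h_x_given_u

# U = Y: full-rate side information H(Y); describing X then costs only H(X|Y).
ry_full, rx_full = rate_pair(np.eye(2))
# U independent of Y: zero side-information rate; X costs its full entropy H(X).
ry_none, rx_none = rate_pair(np.full((2, 2), 0.5))
print((ry_full, rx_full), (ry_none, rx_none))
```

Intermediate test channels trace out rate pairs between these two extremes; finding the channels on the lower boundary is exactly the optimization over auxiliary random variables that the paper partially solves.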


Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 4, APRIL

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 4, APRIL IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 4, APRIL 2006 1545 Bounds on Capacity Minimum EnergyPerBit for AWGN Relay Channels Abbas El Gamal, Fellow, IEEE, Mehdi Mohseni, Student Member, IEEE,

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

IEEE Proof Web Version

IEEE Proof Web Version IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 10, OCTOBER 2009 1 A Neyman Pearson Approach to Universal Erasure List Decoding Pierre Moulin, Fellow, IEEE Abstract When information is to be transmitted

More information

Aalborg Universitet. Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes

Aalborg Universitet. Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes Aalborg Universitet Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes Published in: 2004 International Seminar on Communications DOI link to publication

More information

An Outer Bound for the Gaussian. Interference channel with a relay.

An Outer Bound for the Gaussian. Interference channel with a relay. An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il

More information

THIS work is motivated by the goal of finding the capacity

THIS work is motivated by the goal of finding the capacity IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 8, AUGUST 2007 2693 Improved Lower Bounds for the Capacity of i.i.d. Deletion Duplication Channels Eleni Drinea, Member, IEEE, Michael Mitzenmacher,

More information

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets Jing Guo University of Cambridge jg582@cam.ac.uk Jossy Sayir University of Cambridge j.sayir@ieee.org Minghai Qin

More information

LOW-density parity-check (LDPC) codes were invented

LOW-density parity-check (LDPC) codes were invented IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 54, NO 1, JANUARY 2008 51 Extremal Problems of Information Combining Yibo Jiang, Alexei Ashikhmin, Member, IEEE, Ralf Koetter, Senior Member, IEEE, and Andrew

More information

Optimal Power Allocation for Parallel Gaussian Broadcast Channels with Independent and Common Information

Optimal Power Allocation for Parallel Gaussian Broadcast Channels with Independent and Common Information SUBMIED O IEEE INERNAIONAL SYMPOSIUM ON INFORMAION HEORY, DE. 23 1 Optimal Power Allocation for Parallel Gaussian Broadcast hannels with Independent and ommon Information Nihar Jindal and Andrea Goldsmith

More information

ASIGNIFICANT research effort has been devoted to the. Optimal State Estimation for Stochastic Systems: An Information Theoretic Approach

ASIGNIFICANT research effort has been devoted to the. Optimal State Estimation for Stochastic Systems: An Information Theoretic Approach IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 42, NO 6, JUNE 1997 771 Optimal State Estimation for Stochastic Systems: An Information Theoretic Approach Xiangbo Feng, Kenneth A Loparo, Senior Member, IEEE,

More information

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

Shannon meets Wiener II: On MMSE estimation in successive decoding schemes

Shannon meets Wiener II: On MMSE estimation in successive decoding schemes Shannon meets Wiener II: On MMSE estimation in successive decoding schemes G. David Forney, Jr. MIT Cambridge, MA 0239 USA forneyd@comcast.net Abstract We continue to discuss why MMSE estimation arises

More information

Design of Optimal Quantizers for Distributed Source Coding

Design of Optimal Quantizers for Distributed Source Coding Design of Optimal Quantizers for Distributed Source Coding David Rebollo-Monedero, Rui Zhang and Bernd Girod Information Systems Laboratory, Electrical Eng. Dept. Stanford University, Stanford, CA 94305

More information

A Formula for the Capacity of the General Gel fand-pinsker Channel

A Formula for the Capacity of the General Gel fand-pinsker Channel A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore

More information

Common Randomness and Secret Key Generation with a Helper

Common Randomness and Secret Key Generation with a Helper 344 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 46, NO. 2, MARCH 2000 Common Romness Secret Key Generation with a Helper Imre Csiszár, Fellow, IEEE, Prakash Narayan, Senior Member, IEEE Abstract We consider

More information

The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner, Member, IEEE

The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner, Member, IEEE 564 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 1, JANUARY 2010 The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner,

More information

Multiterminal Source Coding with an Entropy-Based Distortion Measure

Multiterminal Source Coding with an Entropy-Based Distortion Measure Multiterminal Source Coding with an Entropy-Based Distortion Measure Thomas Courtade and Rick Wesel Department of Electrical Engineering University of California, Los Angeles 4 August, 2011 IEEE International

More information

Graph Coloring and Conditional Graph Entropy

Graph Coloring and Conditional Graph Entropy Graph Coloring and Conditional Graph Entropy Vishal Doshi, Devavrat Shah, Muriel Médard, Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology Cambridge,

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

Capacity Region of the Permutation Channel

Capacity Region of the Permutation Channel Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes

More information

SHANNON S information measures refer to entropies, conditional

SHANNON S information measures refer to entropies, conditional 1924 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 43, NO. 6, NOVEMBER 1997 A Framework for Linear Information Inequalities Raymond W. Yeung, Senior Member, IEEE Abstract We present a framework for information

More information

THIS paper constitutes a systematic study of the capacity

THIS paper constitutes a systematic study of the capacity 4 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 1, JANUARY 2002 Capacities of TimeVarying MultipleAccess Channels With Side Information Arnab Das Prakash Narayan, Fellow, IEEE Abstract We determine

More information

On Competitive Prediction and Its Relation to Rate-Distortion Theory

On Competitive Prediction and Its Relation to Rate-Distortion Theory IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 12, DECEMBER 2003 3185 On Competitive Prediction and Its Relation to Rate-Distortion Theory Tsachy Weissman, Member, IEEE, and Neri Merhav, Fellow,

More information