1924 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 43, NO. 6, NOVEMBER 1997

A Framework for Linear Information Inequalities

Raymond W. Yeung, Senior Member, IEEE

Abstract: We present a framework for information inequalities, namely, inequalities involving only Shannon's information measures, for discrete random variables. A region in R^{2^n - 1}, denoted by Γ*_n, is identified to be the origin of all information inequalities involving n random variables, in the sense that all such inequalities are partial characterizations of Γ*_n. A product of this framework is a simple calculus for verifying all unconstrained and constrained linear information identities and inequalities which can be proved by conventional techniques. These include all information identities and inequalities of such types in the literature. As a consequence of this work, most identities and inequalities involving a definite number of random variables can now be verified by a software tool called ITIP, which is available on the World Wide Web. Our work suggests the possibility of the existence of information inequalities which cannot be proved by conventional techniques. We also point out the relation between Γ*_n and some important problems in probability theory and information theory.

Index Terms: Entropy, I-Measure, information identities, information inequalities, mutual information.

I. INTRODUCTION

Shannon's information measures refer to entropies, conditional entropies, mutual informations, and conditional mutual informations. By information inequalities, we refer to those involving only Shannon's information measures for discrete random variables. These inequalities play a central role in converse coding theorems for problems in information theory with discrete alphabets. This paper is devoted to a systematic study of these inequalities. We begin our discussion by examining the two examples below, which exemplify what we call the conventional approach to proving such inequalities.
Example 1: This is a version of the well-known data processing theorem. Let X, Y, and Z be random variables such that X -> Y -> Z forms a Markov chain. Then I(X;Z) <= I(Y;Z). In the derivation, the second equality follows from the Markov condition, while the inequality follows because a conditional mutual information is always nonnegative.

Example 2: (This example involves three random variables; it is verified by ITIP in Section VI.) The inequalities in its derivation follow from the nonnegativity of the Shannon's information measures involved.

Manuscript received August 10, 1995; revised February 10. The material in this paper was presented in part at the 1996 IEEE Information Theory Workshop, Haifa, Israel, June 9-13, 1996. The author is with the Department of Information Engineering, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong. Publisher Item Identifier S (97).

In the conventional approach, we invoke certain elementary identities and inequalities in the intermediate steps of a proof. Some frequently invoked identities and inequalities are the chain rules and the nonnegativity of Shannon's information measures. Proving an identity or an inequality using the conventional approach can be quite tricky, because it may not be easy to see which elementary identity or inequality should be invoked next. For certain problems, like Example 1, we may rely on our insight to see how we should proceed in the proof. But of course, most of our insight into problems is developed from hindsight. For other problems, like Example 2 (which involves only three random variables) or even more complicated ones, it may not be easy at all to work out a proof by brute force. The proof of information inequalities can be facilitated by the use of information diagrams1 [25]. However, the use of such diagrams becomes very difficult when the number of random variables is more than four.

1 It was called an I-diagram in [25], but we prefer to call it an information diagram to avoid confusion with an eye diagram in communication theory.
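The data processing theorem in Example 1 is easy to confirm numerically. The sketch below (helper names such as `mutual_info` are ours, not from the paper) builds a random Markov chain X -> Y -> Z from p(x) p(y|x) p(z|y) and checks both the Markov condition and the inequality:

```python
import itertools, math, random

def entropy(p):  # H of a pmf given as dict {outcome: prob}
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def marginal(p, axes):  # marginalize a dict keyed by tuples onto the given axes
    m = {}
    for k, v in p.items():
        key = tuple(k[a] for a in axes)
        m[key] = m.get(key, 0.0) + v
    return m

def mutual_info(p, a, b):  # I(A;B) = H(A) + H(B) - H(A,B)
    return entropy(marginal(p, a)) + entropy(marginal(p, b)) - entropy(marginal(p, a + b))

# Random Markov chain X -> Y -> Z: p(x, y, z) = p(x) p(y|x) p(z|y)
random.seed(1)
def rand_dist(n):
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

px = rand_dist(3)
py_x = [rand_dist(3) for _ in range(3)]
pz_y = [rand_dist(3) for _ in range(3)]
p = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
     for x, y, z in itertools.product(range(3), repeat=3)}

IXZ = mutual_info(p, (0,), (2,))
IYZ = mutual_info(p, (1,), (2,))
# I(X;Z|Y) = H(X,Y) + H(Y,Z) - H(X,Y,Z) - H(Y)
IXZ_Y = (entropy(marginal(p, (0, 1))) + entropy(marginal(p, (1, 2)))
         - entropy(p) - entropy(marginal(p, (1,))))

assert abs(IXZ_Y) < 1e-10   # Markov condition: I(X;Z|Y) = 0
assert IXZ <= IYZ + 1e-10   # data processing: I(X;Z) <= I(Y;Z)
```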

In the conventional approach, elementary identities and inequalities are invoked in a sequential manner. In the new framework that we shall develop in this paper, all identities and inequalities are considered simultaneously.

Before we proceed any further, we would like to make a few remarks. Let f and g be any expressions depending only on Shannon's information measures. We shall call them information expressions, and specifically linear information expressions if they are linear combinations of Shannon's information measures. Likewise, we shall call inequalities involving only Shannon's information measures information inequalities. Now f >= g if and only if f - g >= 0. Therefore, if for any expression we can determine whether it is always nonnegative, then we can determine whether any particular inequality always holds. We also note that f = g if and only if f >= g and f <= g. Therefore, it suffices to study inequalities.

The rest of the paper is organized as follows. In the next section, we first give a brief review of the I-Measure [25], on which a few proofs will be based. In Section III, we introduce the canonical form of an information expression and discuss its uniqueness. We also define a region called Γ*_n which is central to the discussion in this paper. In Section IV, we present a simple calculus for verifying information identities and inequalities which can be proved by conventional techniques. In Section V, we further elaborate on the significance of Γ*_n by pointing out its relations with some important problems in probability theory and information theory. Concluding remarks are given in Section VI.

II. REVIEW OF THE THEORY OF I-MEASURE

In this section, we give a review of the main results regarding the I-Measure. For a detailed discussion of the I-Measure, we refer the reader to [25]. Further results on the I-Measure can be found in [7]. Let X1, ..., Xn be jointly distributed discrete random variables, and let a set variable correspond to each random variable Xi. Define the universal set to be the union of the set variables, and let F_n be the σ-field generated by them.
The atoms of F_n have the form of intersections in which each term is either a set variable or its complement. Let A be the set of all atoms of F_n except for the atom lying outside all of the set variables, which is empty by construction because the universal set is the union of the set variables. Note that |A| = 2^n - 1. To simplify notations, we shall use juxtaposition to denote unions of set variables and the corresponding shorthand for joint random variables. It was shown in [25] that there exists a unique signed measure μ* on F_n which is consistent with all Shannon's information measures via the formal substitution of symbols in (1), i.e., for any (not necessarily disjoint) index sets we have (2). In the special cases, (2) reduces to an entropy (3), a conditional entropy (4), or a mutual information (5); thus (2) covers all the cases of Shannon's information measures. Let the atom measures and the joint entropies be listed as in (6)-(8) under arbitrary one-to-one mappings. Then (9) and (10) define a unique matrix (independent of the distribution) relating the two lists. An important characteristic of this matrix is that it is invertible [25], so we can write (11). In other words, μ* is completely specified by the set of values of all the joint entropies involving X1, ..., Xn, and it follows from (5) that μ* is the unique measure on F_n which is consistent with all Shannon's information measures. Note that μ* in general is not nonnegative. However, if X1, ..., Xn form a Markov chain, μ* is always nonnegative [7]. As a consequence of the theory of the I-Measure, the information diagram was introduced as a tool to visualize the relationship among information measures [25]. Applications of information diagrams can be found in [7], [25], [26].
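That μ* may be signed can be seen on the classical three-variable example Z = X XOR Y with X and Y independent fair bits: the atom measure I(X;Y) - I(X;Y|Z) equals -1. A small numerical sketch (helper names ours, not the paper's):

```python
import itertools, math

def H(p):  # entropy of a pmf given as dict {outcome: prob}
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def marg(p, axes):  # marginalize onto the given coordinate axes
    m = {}
    for k, v in p.items():
        key = tuple(k[a] for a in axes)
        m[key] = m.get(key, 0.0) + v
    return m

# X, Y independent fair bits; Z = X xor Y
p = {(x, y, x ^ y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

# I(X;Y) = H(X) + H(Y) - H(X,Y)
IXY = H(marg(p, (0,))) + H(marg(p, (1,))) - H(marg(p, (0, 1)))
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
IXY_Z = H(marg(p, (0, 2))) + H(marg(p, (1, 2))) - H(p) - H(marg(p, (2,)))
# the measure of the atom common to all three set variables is I(X;Y) - I(X;Y|Z)
atom = IXY - IXY_Z

assert abs(IXY) < 1e-12          # X and Y are independent
assert abs(IXY_Z - 1.0) < 1e-12  # but dependent given Z
assert abs(atom + 1.0) < 1e-12   # the atom has measure -1: mu* is signed
```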

III. THE CANONICAL FORM

In the rest of the paper, we shall assume that X1, ..., Xn are the random variables involved in our discussion. We observe that conditional entropies, mutual informations, and conditional mutual informations can be expressed as linear combinations of joint entropies by using the following identity:

I(X_G; X_G' | X_G'') = H(X_{G ∪ G''}) + H(X_{G' ∪ G''}) - H(X_{G ∪ G' ∪ G''}) - H(X_{G''}),   (12)

where G, G', G'' are subsets of {1, ..., n} and H(X_∅) = 0. Thus any information expression can be expressed in terms of the joint entropies. We call this the canonical form of an information expression. Now for any X1, ..., Xn, their joint entropies correspond to a vector h in R^{2^n - 1}, where we regard the 2^n - 1 joint entropies as the coordinates of h. On the other hand, a vector h in R^{2^n - 1} is said to be constructible if there exist X1, ..., Xn whose joint entropies are given by h. We are then motivated to define

Γ*_n = {h : h is constructible}.

As we shall see, Γ*_n not only gives a complete characterization of all information inequalities, but it is also closely related to some important problems in probability theory and information theory. Thus a complete characterization of Γ*_n is of fundamental importance. To our knowledge, there has not been such a characterization in the literature (see Section V).

Now every information expression can be expressed in canonical form. A basic question to ask is in what sense the canonical form is unique. Toward this end, we shall first establish the following theorem.

Theorem 1: Let f be measurable such that its zero set has zero Lebesgue measure. Then f cannot be identically zero on Γ*_n.

We shall need the following lemma, which is immediate from the discussion in [26, Sec. 6]. The proof is omitted here.

Lemma 1: Let Ψ*_n be the set of constructible atom-measure vectors (cf. (11)). Then the first quadrant of R^{2^n - 1} is a subset of Ψ*_n.

Proof of Theorem 1: If Γ*_n has positive Lebesgue measure, then since the zero set of f has zero Lebesgue measure, Γ*_n cannot be a subset of the zero set, which implies that f cannot be identically zero on Γ*_n. Thus it suffices to prove that Γ*_n has positive Lebesgue measure.
Using the above lemma, we see that the first quadrant of R^{2^n - 1}, which has positive Lebesgue measure, is a subset of Ψ*_n. Therefore Ψ*_n has positive Lebesgue measure. Since Γ*_n is an invertible linear transformation of Ψ*_n, its Lebesgue measure must also be positive. This proves the theorem.

The uniqueness of the canonical form for very general classes of information expressions follows from this theorem. For example, suppose f and g are two polynomials of the joint entropies such that f(h) = g(h) for all h in Γ*_n. Let d = f - g. If d is not the zero function, then its zero set has zero Lebesgue measure. By the theorem, d cannot be identically zero on Γ*_n, which is a contradiction. Therefore d is the zero function, i.e., f = g. Thus we see that the canonical form is unique for polynomial information expressions. We note that the uniqueness of the canonical form for linear information expressions has been discussed in [4] and [2, p. 51, Theorem 3.6].

The importance of the canonical form will become clear in the next section. An application of the canonical form to recognizing the symmetry of an information expression will be discussed in Appendix II-A. We note that any invertible linear transformation of the joint entropies can be used for the purpose of defining the canonical form. Nevertheless, the current definition of the canonical form has the advantage that if S and S' are two sets of random variables such that S is a subset of S', then the joint entropies involving the random variables in S form a subset of the joint entropies involving the random variables in S'.

IV. A CALCULUS FOR VERIFYING LINEAR IDENTITIES AND INEQUALITIES

In this section, we shall develop a simple calculus for verifying all linear information identities and inequalities involving a definite number of random variables which can be proved by conventional techniques. All identities and inequalities in this section are assumed to be linear unless otherwise specified. Although our discussion will primarily be on linear identities and inequalities (possibly with linear constraints), our approach can be extended naturally to nonlinear cases.
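The canonical form itself is mechanical to compute: identity (12) maps any conditional mutual information to a signed combination of joint entropies, and identities are then checked by cancellation of coefficients, as in Section IV-A below. A sketch (our helper names; `Counter.update` is used for addition because `Counter.__add__` silently drops non-positive counts):

```python
from collections import Counter

def canonical_I(A, B, C=frozenset()):
    """Canonical form of I(A;B|C) as joint-entropy coefficients, via identity (12):
       I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C), with H(empty) = 0."""
    A, B, C = frozenset(A), frozenset(B), frozenset(C)
    c = Counter()
    for S, sign in ((A | C, 1), (B | C, 1), (A | B | C, -1), (C, -1)):
        if S:  # H(empty set) = 0 contributes nothing
            c[S] += sign
    return c

def add(*terms):  # sum of coefficient vectors (update keeps negative counts)
    total = Counter()
    for t in terms:
        total.update(t)
    return total

def minus(c):
    return Counter({k: -v for k, v in c.items()})

def is_zero(c):  # an expression is identically zero iff all coefficients vanish
    return all(v == 0 for v in c.values())

# Verify the chain rule I(X;Y,Z) = I(X;Y) + I(X;Z|Y); variables 1=X, 2=Y, 3=Z
lhs = canonical_I({1}, {2, 3})
rhs = add(canonical_I({1}, {2}), canonical_I({1}, {3}, {2}))
assert is_zero(add(lhs, minus(rhs)))  # all canonical coefficients cancel
```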
For nonlinear cases, the amount of computation required is larger. The question of what linear combinations of entropies are always nonnegative was first raised by Han [5].

A. Unconstrained Identities

Due to the uniqueness of the canonical form for linear information expressions as discussed in the preceding section, it is easy to check whether two expressions f and g are identical. All we need to do is to express f - g in canonical form. If all the coefficients are zero, then f and g are identical; otherwise, they are not.

B. Unconstrained Inequalities

Since all information expressions can be expressed in canonical form, we shall only consider inequalities in this form. The following is a simple yet fundamental observation which apparently has not been discussed in the literature: for any information expression f, the inequality f >= 0 always holds if and only if f is nonnegative on Γ*_n. This observation, which follows immediately from the definition of Γ*_n, gives a complete characterization of all unconstrained inequalities (not necessarily linear) in terms of Γ*_n. From this point of view, an unconstrained inequality is simply a partial characterization of Γ*_n. The nonnegativity of all Shannon's information measures forms a set of inequalities which we shall refer to as the basic

inequalities. We observe that in the conventional approach to proving information inequalities, whenever we establish an inequality in an intermediate step, we invoke one of the basic inequalities. Therefore, all information inequalities and conditional information identities which can be proved by conventional techniques are consequences of the basic inequalities. The basic inequalities, however, are not nonredundant. For example, H(X1|X2) >= 0 and I(X1;X2) >= 0, which are both basic inequalities of the random variables X1 and X2, imply H(X1) >= 0, again a basic inequality.

We shall be dealing with linear combinations whose coefficients are nonnegative. We call such linear combinations nonnegative linear combinations. We observe that any Shannon's information measure can be expressed as a nonnegative linear combination of the following two elemental forms of Shannon's information measures: i) H(X_i | X_{N - {i}}); and ii) I(X_i; X_j | X_K), where i ≠ j, K is a subset of N - {i, j}, and N = {1, ..., n}. This can be done by successive applications, if necessary, of the identities (13)-(18). (Note that all the coefficients in these identities are nonnegative.) It is easy to check that the total number of Shannon's information measures of the two elemental forms is equal to

n + (n choose 2) 2^{n-2}.   (19)

The nonnegativity of the two elemental forms of Shannon's information measures forms a proper subset of the set of basic inequalities. We call the inequalities in this smaller set the elemental inequalities. They are equivalent to the basic inequalities, because each basic inequality which is not an elemental inequality can be obtained by adding a certain set of elemental inequalities in view of (13)-(18). The minimality of the elemental inequalities is proved in Appendix I.

If the elemental inequalities are expressed in canonical form, they become linear inequalities in h. Denote this set of inequalities by Gh >= 0, where G is an m x (2^n - 1) matrix with m given by (19), and define

Γ_n = {h : Gh >= 0}.   (20)

Since the elemental inequalities are satisfied by any X1, ..., Xn, we have Γ*_n ⊆ Γ_n. Therefore, if b^T h >= 0 for all h in Γ_n, then b^T h >= 0 for all h in Γ*_n, i.e., the inequality always holds.
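The two elemental forms are easy to enumerate, and their count can be checked against formula (19). A sketch (function name ours):

```python
from itertools import combinations
from math import comb

def elemental_inequalities(n):
    """Elemental forms for X_1..X_n, following the two forms above:
       i)  H(X_i | X_{N - {i}}) >= 0
       ii) I(X_i; X_j | X_K) >= 0, with K a subset of N - {i, j}."""
    N = set(range(1, n + 1))
    forms = []
    for i in sorted(N):                      # form i): n inequalities
        forms.append(("H", i, frozenset(N - {i})))
    for i, j in combinations(sorted(N), 2):  # form ii): C(n,2) * 2^(n-2)
        rest = sorted(N - {i, j})
        for r in range(len(rest) + 1):
            for K in combinations(rest, r):
                forms.append(("I", i, j, frozenset(K)))
    return forms

for n in (2, 3, 4, 5):
    assert len(elemental_inequalities(n)) == n + comb(n, 2) * 2 ** (n - 2)  # (19)
```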
Let, be the column -vector whose th component is equal to all the other components are equal to. Since a joint entropy can be expressed as a nonnegative linear combination of the two elemental forms of Shannon s information measures, each can be expressed as a nonnegative linear combination of the rows of. This implies that is a pyramid in the positive quadrant. Let be any column -vector. Then, a linear combination of joint entropies, is always nonnegative if. This is equivalent to say that the minimum of the problem (Primal) Minimize subject to is zero. Since gives ( is the only corner of ), all we need to do is to apply the optimality test of the simplex method [19] to check whether the point is optimal. We can obtain further insight in the problem from the Duality Theorem in linear programming [19]. The dual of the above linear programming problem is (Dual) Maximize subject to where By the Duality Theorem, the maximum of the dual problem is also zero. Since the cost function in the dual problem is zero, the maximum of the dual problem is zero if only if the feasible region (21) is nonempty. Theorem 2: is nonempty if only if for some, where is a column -vector, i.e., is a nonnegative linear combination of the rows of. Proof: We omit the simple proof that is nonempty if only if for some, where is a column -vector. Let If for some, then. Let (22) Since can be expressed as a nonnegative linear combination of the rows of (23) can also be expressed as a nonnegative linear combinations of the rows of. By (22), this implies for some. Thus always holds (subject to ) if only if it is a nonnegative linear combination of the elemental inequalities (in canonical form).

We now summarize the results in this section. For information expressions f and g, let b^T h be the canonical form of f - g, and take it as the cost function subject to the elemental inequalities. Then apply the optimality test of the simplex method to the point h = 0. If h = 0 is optimal, then f >= g always holds. If not, then f >= g may or may not always hold. If it always holds, it is not implied by the elemental inequalities. In other words, it cannot be proved by conventional techniques, namely, by invoking the elemental inequalities.

Han has previously studied unconstrained information inequalities involving three random variables [5] as well as information inequalities which are symmetrical in all the random variables involved [6], and explicit characterizations of such inequalities were obtained. A discussion of these results is found in Appendix II.

C. Constrained Inequalities

Linear constraints on X1, ..., Xn arise frequentlyly in information theory. Some examples are:

1) X1, X2, and X3 are mutually independent if and only if H(X1,X2,X3) = H(X1) + H(X2) + H(X3);
2) X1, X2, and X3 are pairwise independent if and only if I(X1;X2) = I(X2;X3) = I(X1;X3) = 0;
3) X1 is a function of X2 if and only if H(X1|X2) = 0;
4) X1, X2, X3, X4 form a Markov chain if and only if I(X1;X3|X2) = I(X1,X2;X4|X3) = 0.

In order to facilitate our discussion, we now introduce an alternative set of notations. We do not distinguish between elements and singletons of N, and we write unions of subsets of N as juxtapositions. For any nonempty subset of N, we use the corresponding juxtaposition to denote the joint random variable (refer to Section II for the definition of N), and we define the analogous shorthand for nonempty subsets to simplify notations.

When the constraint set is a subspace of R^{2^n - 1}, we can easily modify the method in the last subsection by taking advantage of the linear structure of the problem. Let the constraints on X1, ..., Xn be given by

Qh = 0,   (24)

where Q is a q x (2^n - 1) matrix (i.e., there are q constraints). Following our discussion in the last subsection, a linear combination of joint entropies b^T h is always nonnegative under the constraint (24) if the minimum of the problem

Minimize b^T h, subject to Gh >= 0 and Qh = 0,

is zero. Let r be the rank of Q.
Since h is in the null space of Q, we can write

h = Q'h',   (25)

where Q' is a (2^n - 1) x (2^n - 1 - r) matrix whose columns form a basis of the orthogonal complement of the row space of Q, and h' is a column (2^n - 1 - r)-vector. Then the elemental inequalities can be expressed as GQ'h' >= 0 (26) in terms of h', and Γ_n becomes

Γ'_n = {h' : GQ'h' >= 0},   (27)

which is a pyramid in R^{2^n - 1 - r} (but not necessarily in the positive quadrant). Likewise, b^T h can be expressed as b^T Q'h'. With the constraints and all expressions in terms of h', b^T h is always nonnegative under the constraint (24) if the minimum of the problem

Minimize b^T Q'h', subject to GQ'h' >= 0,

is zero. In general, a constraint is given by a subset Φ of R^{2^n - 1}; when Φ = R^{2^n - 1}, there is no constraint. Parallel to our discussion in the preceding subsection, we have the following more general observation: under the constraint Φ, for any information expression f, the inequality f >= 0 always holds if and only if f is nonnegative on the intersection of Γ*_n with Φ. Again, this gives a complete characterization of all constrained inequalities in terms of Γ*_n. Thus Γ*_n is in fact the origin of all constrained inequalities, with unconstrained inequalities being a special case. In this and the next subsection, however, we shall confine our discussion to the linear case.

Again, since h' = 0 gives b^T Q'h' = 0 (and 0 is the only corner of Γ'_n), all we need to do is to apply the optimality test of the simplex method to check whether the point h' = 0 is optimal. By imposing the constraints in (24), the number of elemental inequalities remains the same, while the dimension of the problem decreases from 2^n - 1 to 2^n - 1 - r. Again from the Duality Theorem, we see that b^T h is always nonnegative under the constraint if b^T Q' is a nonnegative linear combination of the rows of GQ', i.e., of the elemental inequalities (in terms of h').

We now summarize the results in this section. Let the constraints be given by (24). For expressions f and g, let b^T h be the canonical form of f - g. Then let b^T Q'h' be the cost function subject to the elemental inequalities (in terms of h') and apply the optimality test to the point h' = 0. If h' = 0 is optimal, then f >= g always holds under the constraint; otherwise, it may or may not always hold.
If it always holds, it is not implied by the elemental inequalities. In other words, it cannot be proved by conventional techniques.
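To illustrate the constrained case on Example 1: as coefficient vectors in canonical form, I(Y;Z) - I(X;Z) equals the elemental quantity I(Y;Z|X) minus the constrained-to-zero quantity I(X;Z|Y), so under the Markov constraint the difference reduces to a single elemental inequality. A sketch (helper names ours):

```python
from itertools import combinations

# Canonical-form vectors over the 7 nonempty subsets of {1,2,3}; 1=X, 2=Y, 3=Z
subsets = [frozenset(s) for r in (1, 2, 3) for s in combinations((1, 2, 3), r)]
idx = {s: k for k, s in enumerate(subsets)}

def I_vec(A, B, C=frozenset()):  # canonical form of I(A;B|C) via identity (12)
    A, B, C = frozenset(A), frozenset(B), frozenset(C)
    v = [0] * 7
    for S, sign in ((A | C, 1), (B | C, 1), (A | B | C, -1), (C, -1)):
        if S:
            v[idx[S]] += sign
    return v

target = [p - q for p, q in zip(I_vec({2}, {3}), I_vec({1}, {3}))]  # I(Y;Z)-I(X;Z)
elemental = I_vec({2}, {3}, {1})   # I(Y;Z|X) >= 0, an elemental inequality
constraint = I_vec({1}, {3}, {2})  # I(X;Z|Y) = 0 is the Markov constraint

# I(Y;Z) - I(X;Z) = 1 * I(Y;Z|X) + (-1) * I(X;Z|Y), as coefficient vectors
combo = [e - c for e, c in zip(elemental, constraint)]
assert combo == target
```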

D. Constrained Identities

We impose the constraints in (24) as in the last subsection. As we have pointed out at the beginning of the paper, two information expressions f and g are identical if and only if f >= g and f <= g always hold. Thus we can apply the method in the last subsection to verify all constrained identities that can be proved by conventional techniques. When X1, ..., Xn are unconstrained, the uniqueness of the canonical form for linear information expressions asserts that b^T h is identically zero if and only if b = 0. However, when the constraints in (24) are imposed, the vanishing of b^T h does not imply b = 0. We give a simple example to illustrate this point. Suppose we impose a single constraint; then every information expression can be expressed in terms of the remaining coordinates. Now consider the expression in (28). Note that the coefficients in this expression are nonzero, but from the elemental inequalities we have (29), so the expression vanishes identically under the constraint.

We now discuss a special application of the method described in this subsection. Let us consider the following problem, which is typical in probability theory. Suppose we are given that certain random variables form a Markov chain and that certain pairs are independent, and we ask whether some other pair must always be independent. This problem can be formulated in information-theoretic terms, with the constraints represented by setting the corresponding mutual informations to zero, and we want to know whether they imply that another mutual information vanishes. Problems of this kind can be handled by the method described in this subsection. Our method can prove any independence relation which can be proved by conventional information-theoretic techniques. The advantage of using an information-theoretic formulation of the problem is that we can avoid manipulating the joint distribution directly, which is awkward [8], if not difficult. It may be difficult to devise a calculus to handle independence relations of random variables in a general setting,2 because an independence relation is discrete in the sense that it is either true or false. On the other hand, the problem becomes a continuous one if it is formulated in information-theoretic terms (because mutual informations are continuous functionals), and continuous problems are in general less difficult to handle. From this point of view, the problem of determining independence of random variables is a discrete problem embedded in a continuous problem.

2 A calculus for independence relations has been devised by Massey [9] for the special case when the random variables have a causal interpretation.

V. FURTHER DISCUSSION ON Γ*_n

We have seen that Γ*_n ⊆ Γ_n, but it is not clear whether Γ*_n = Γ_n. If so, then all information inequalities are completely characterized by the elemental inequalities. In the following, we shall attach a subscript when we refer to these regions for a specific n. For n = 2, in I-Measure notations the elemental inequalities state that the measures of the three atoms are nonnegative, and it then follows from Lemma 1 that Γ*_2 = Γ_2.

Inspired by the current work, the characterization of Γ*_3 has recently been investigated by Zhang and Yeung. They have found that Γ*_3 ≠ Γ_3 (and therefore Γ*_n ≠ Γ_n in general), but that the closure of Γ*_3 is equal to Γ_3 [29]. This implies that all unconstrained (linear or nonlinear) inequalities involving three random variables are consequences of the elemental inequalities of the same set of random variables. However, it is not clear whether the same is true for all constrained inequalities. They have also discovered a conditional inequality, (30), involving four random variables which is not implied by the elemental inequalities; under additional conditions, this inequality implies a conditional independence relation which is not implied by the elemental inequalities. However, whether the closure of Γ*_4 equals Γ_4 remained an open problem. Subsequently, they determined that it does not, by discovering an unconstrained inequality involving four random variables which is not implied by the elemental inequalities of the same set of random variables [30]. The existence of these two inequalities indicates that there may be many information inequalities yet to be discovered. Since most converse coding theorems are proved by means of information inequalities, it is plausible that some of these yet-to-be-discovered inequalities are needed to settle certain open problems in information theory. In the remainder of this section, we shall further elaborate on the significance of Γ*_n by pointing out its relations with some important problems in probability theory and information theory.

A. Conditional Independence Relations

For any fixed number of random variables, a basic question is what sets of conditional independence relations are possible. In the recent work of Matúš and Studený [17], this problem is formulated as follows. Recall that N = {1, ..., n}, and consider the family of all couples (ij|K), where ij is the union of two, not necessarily different, singletons of N and K ⊆ N. Having a system of random variables with subsystems X_K, K ⊆ N, we use (ij|K) as the abbreviation of the statement that X_i is conditionally independent of X_j given X_K. For i = j, (ii|K) means that X_i is determined by X_K. The empty subsystem is presumed to be constant. A subfamily of couples is called probabilistically (p-) representable if there exists a system of random variables, called a p-representation, for which exactly the couples in the subfamily hold. The problem is to characterize the class of all p-representable relations. Note that this problem is more general than the application discussed in Section IV-D.

Now a couple (ij|K) is equivalent to the vanishing of the corresponding Shannon's information measure. If that measure is not of elemental form, it can be written as a nonnegative combination of the corresponding elemental forms of Shannon's information measures. We observe that it vanishes if and only if each of the corresponding elemental forms vanishes, and that an elemental form of Shannon's information measure vanishes if and only if the corresponding conditional independence relation holds. Thus it is actually unnecessary to consider a non-elemental couple separately, because it is determined by the other conditional independence relations.

Let us now look at some examples. As pointed out in the last paragraph, the non-elemental couples are actually redundant. Let X be a system of random variables such that one of them is not deterministic and the variables are not functions of each other. Then it is easy to see which couples hold, so the first subfamily considered is p-representable. On the other hand, the second subfamily considered is not p-representable, because its couples imply an additional couple outside the subfamily.
The recent studies on the problem of conditional independence relations were launched by a seminal paper by Dawid [3], in which he proposed four axioms as heuristic properties of conditional independence. In information-theoretic terms, these four axioms can be summarized by a single statement about vanishing conditional mutual informations. Subsequent work on this subject was done by Pearl and his collaborators in the 1980's, and their work is summarized in the book by Pearl [18]. Their work has mainly been motivated by the study of the logic of integrity constraints in databases. Pearl conjectured that Dawid's four axioms completely characterize the conditional independence structure of any joint distribution. This conjecture, however, was refuted by the work of Studený [20]. Since then, Matúš and Studený have written a series of papers on this problem [10]-[17], [20]-[24]. So far, they have solved the problem for three random variables, but the problem for four random variables remains open.

The relation between this problem and Γ*_n is the following. Suppose we want to determine whether a subfamily of couples is p-representable. Each couple corresponds to setting a linear combination of the coordinates of h to zero, and each such equation defines a hyperplane containing the origin in R^{2^n - 1}. Thus the subfamily is p-representable if and only if there exists an h in Γ*_n lying on precisely the hyperplanes corresponding to the couples in the subfamily. Therefore, the problem of conditional independence relations is a subproblem of the problem of characterizing Γ*_n.

B. Optimization of Information Quantities

Consider minimizing an information quantity given the constraints (31)-(33). This problem is equivalent to the following minimization problem: minimize the corresponding cost function subject to h being in Γ*_n and satisfying (34)-(37). As no characterization of Γ*_n is available, this minimization problem cannot be solved. Nevertheless, since Γ*_n ⊆ Γ_n, if we replace Γ*_n by Γ_n in the above minimization problem, it becomes a linear programming problem which renders a lower bound on the solution.

C. Multiuser Information Theory

The framework for information inequalities developed in this paper provides new tools for problems in multiuser information theory.
Consider the source coding problem in Fig. 1, in which the sources are random variables and

Fig. 1. A multiterminal source coding problem.

the blocks on the left and right are encoders and decoders, respectively. The random variables at the encoder outputs are the codewords. Given the sources and the reproduction requirements, we are interested in the admissible region of the rate triple. Evidently, the entropies of the three codewords give the numbers of bits needed by the encoders. From the encoding and decoding requirements, we immediately have certain conditional entropies equal to zero. Now there are five random variables involved in this problem. Then the intersection of the sets of all h satisfying these conditions is the set of all possible vectors of the joint entropies involving the five random variables, given that they satisfy the encoding and decoding requirements of the problem as well as the constraints on the joint entropies involving the sources. The admissible region is then given as the projection of this set onto the three codeword-entropy coordinates. In the same spirit as in the last subsection, an explicit outer bound is given by replacing Γ*_5 by Γ_5. We refer to such an outer bound as an LP (linear programming) bound. This is a new tool for proving converse coding theorems for problems in multiuser information theory. The LP bound has already found applications in the recent work of Yeung and Zhang [28] on a new class of multiterminal source coding problems. We expect that this approach will have impact on other problems in multiuser information theory.

VI. CONCLUDING REMARKS

We have identified the region Γ*_n as the origin of all information inequalities. Our work suggests the possibility of the existence of information inequalities which cannot be proved by conventional techniques, and this has been confirmed by the recent results of Zhang and Yeung [29], [30]. A product of the framework we have developed is a simple calculus for verifying all linear information inequalities involving a definite number of random variables, possibly with linear constraints, which can be proved by conventional techniques; these include all inequalities of such type in the literature.
Based on this calculus, a software tool running on MATLAB called ITIP (Information-Theoretic Inequality Prover) has been developed by Yeung and Yan [27], and it is available on the World Wide Web. The following session from ITIP contains verifications of Examples 1 and 2, respectively, in Section I.

>> ITIP('I(Y;Z) >= I(X;Z)', 'I(X;Z|Y) = 0')
True
>> ITIP('H(X,Y) H(Y) I(Y;X,Z) H(Y|Z) >= 0')
True

We see from (19) that the amount of computation required is moderate when n is not too large. Our work gives a partial answer to Han's question of what linear combinations of entropies are always nonnegative [5]. A complete answer to this question is impossible without a further characterization of Γ*_n. The characterization of Γ*_n is a very fundamental problem in information theory. However, in view of the difficulty of some special cases of this problem [15], [17], [29], [30], it is not very hopeful that this problem can be solved completely in the near future. Nevertheless, partial characterizations of Γ*_n may lead to the discovery of new inequalities which make the solutions of certain open problems in information theory possible.

APPENDIX I
MINIMALITY OF THE ELEMENTAL INEQUALITIES

The elemental inequalities in set-theoretic notations have one of the following two forms: 1) the measure of the atom corresponding to the first elemental form is nonnegative; 2) the measure of the set corresponding to the second elemental form is nonnegative, where i ≠ j and K ⊆ N - {i, j}. They will be referred to as inequalities of the first and second forms, respectively. We are to show that all the elemental inequalities are nonredundant, i.e., none of them is implied by the others. For an inequality of the first form, (38), since it is the only elemental inequality which involves the atom in question, it is clearly not implied by the other elemental inequalities. Therefore, we only need to show that all inequalities of the second form are nonredundant. To show that such an inequality is nonredundant, it suffices to show that there exists a measure on F_n which satisfies all of the other elemental inequalities except for that one. We shall show that the inequality of the second form in (39) is nonredundant.
To facilitate our discussion, we denote by , and we let be the atoms in , where

(40)

We first consider the case when , i.e., . We construct a measure by

if
otherwise (41)

where . In other words, is the only

atom with measure ; all other atoms have measure . Then is trivially true. It is also trivial to check that for any

(42)

and for any such that

(43)

if . On the other hand, if is a proper subset of , then contains at least two atoms, and therefore

(44)

This completes the proof that the -inequality in (39) is nonredundant when . We now consider the case when , or . We construct a measure as follows. For the atoms in , let

(45)

For , if is odd, it is referred to as an odd atom of , and if is even, it is referred to as an even atom of . For any atom , we let

(46)

Consider

(51)

The nonnegativity of the second term above follows from (46). For the first term, is nonempty if and only if

(52)

If this condition is not satisfied, then the first term in (51) becomes , and (50) follows immediately. Let us assume that the condition in (52) is satisfied. Then by simple counting, we see that the number of atoms in is equal to , where . For example, for , there are atoms in , namely

(49)

where or for . We check that . This completes the construction of . We first prove that

(47)

We first consider the case when , i.e., . Consider

Then

where the last equality follows from the binomial formula

(48)

for . This proves (47). Next we prove that satisfies all -inequalities. We note that for any , the atom is not in . Thus contains exactly one atom. If this atom is an even atom of , then the first term in (51) is either or (cf. (45)), and (50) follows immediately. If this atom is an odd atom of , then the first term in (51) is equal to . This happens if and only if and have one common element, which implies that is nonempty. Therefore, the second term in (51) is at least , and hence (50) follows. Finally, we consider the case when . Using the binomial formula in (48), we see that the numbers of odd atoms and even atoms of in (49) are the same.
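The last claim rests on the binomial identity $\sum_{k=0}^{m}(-1)^{k}\binom{m}{k}=0$ for $m \ge 1$: every nonempty finite set has exactly as many odd-sized as even-sized subsets. A quick brute-force check of this fact (a toy illustration, not part of the proof):

```python
# Verify that a ground set of m >= 1 elements has equally many odd-sized
# and even-sized subsets, each count being 2^(m-1).
from itertools import combinations

def odd_even_counts(m):
    elems = range(m)
    subs = [s for r in range(m + 1) for s in combinations(elems, r)]
    odd = sum(1 for s in subs if len(s) % 2 == 1)
    even = len(subs) - odd
    return odd, even

for m in range(1, 8):
    o, e = odd_even_counts(m)
    assert o == e == 2 ** (m - 1)
```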
It remains to prove that satisfies all -inequalities except for (39), i.e., that (50) holds for any such that . Therefore, the first term in (51) is equal to if , and is equal to otherwise. The former is true if and only if , which implies that

is nonempty, or that the second term is at least . Thus in either case (50) is true. This completes the proof that (39) is nonredundant.

APPENDIX II
SOME SPECIAL FORMS OF UNCONSTRAINED INFORMATION INEQUALITIES

In this appendix, we shall discuss some special forms of unconstrained linear information inequalities previously investigated by Han [5], [6]. Explicit necessary and sufficient conditions for these inequalities to always hold have been obtained. The relation between these inequalities and the results in the current paper will also be discussed.

A. Symmetrical Information Inequalities

An information expression is said to be symmetrical if it is identical under every permutation of . For example, for , the expression is symmetrical. This can be seen by permuting and symbolically in the expression. Now let us consider the expression . If we replace and by each other, the expression becomes , which is symbolically different from the original expression. However, both expressions are identical to . Therefore, the two expressions are in fact identical, and the expression is actually symmetrical although it is not readily recognized symbolically. In general, the symmetry of an information expression cannot be recognized symbolically. However, it is readily recognized if the expression is in canonical form. This is due to the uniqueness of the canonical form, as discussed in Section III.

It follows trivially from the elemental inequalities that is a sufficient condition for to always hold. The necessity of this condition can be seen by noting the existence of random variables for each such that for all . This implies that all unconstrained linear symmetrical information inequalities are consequences of the elemental inequalities. We refer the reader to [5] for a more detailed discussion of symmetrical information inequalities.

B. Information Inequalities Involving Three Random Variables

Consider . Let
Consider a linear symmetrical information expression (in canonical form). As seen in Section IV-B, can be expressed as a linear combination of the two elemental forms of Shannon's information measures. It was shown in [5] that every symmetrical expression can be written in the form

where, for , . Note that is the sum of all Shannon's information measures of the first elemental form, and for , is the sum of all Shannon's information measures of the second elemental form conditioning on random variables.

Let

Since is an invertible linear transformation of , all linear information expressions can be written as , where

It was shown in [6] that always holds if and only if the following conditions are satisfied:

(53)

In terms of , the elemental inequalities can be expressed as , where

From the discussion in Section IV-B, we see that always holds if and only if is a nonnegative linear combination of the rows of . We leave it as an exercise for the reader to show that is a nonnegative linear combination of the rows of if and only if the conditions in (53) are satisfied. Therefore, all unconstrained linear inequalities involving three random variables are consequences of the elemental inequalities. This result also implies that is the smallest pyramid containing .
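The observation that symmetry is readily recognized in canonical form can be mechanized. The sketch below uses hypothetical helper names of our own choosing: an expression is a dictionary mapping sets of variable indices to coefficients (its canonical form), conditional mutual informations are expanded via $I(A;B\,|\,C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)$, and symmetry is tested by relabeling the variables under every permutation.

```python
# Recognize symmetry of an information expression via its canonical form,
# which is unique: the expression is symmetrical iff the canonical form is
# invariant under every relabeling of the variables.
from itertools import permutations

def add(expr, coeff, s):
    s = frozenset(s)
    if s:                                   # H of the empty set is 0
        expr[s] = expr.get(s, 0) + coeff
        if expr[s] == 0:
            del expr[s]

def I(a, b, c=()):                          # canonical form of I(A;B|C)
    e = {}
    add(e, 1, set(a) | set(c))
    add(e, 1, set(b) | set(c))
    add(e, -1, set(a) | set(b) | set(c))
    add(e, -1, c)
    return e

def plus(*exprs):                           # sum of canonical forms
    out = {}
    for e in exprs:
        for s, c in e.items():
            add(out, c, s)
    return out

def is_symmetrical(expr, n):
    for p in permutations(range(1, n + 1)):
        relabel = {}
        for s, c in expr.items():
            add(relabel, c, {p[i - 1] for i in s})
        if relabel != expr:
            return False
    return True

# I(X1;X2) + I(X1;X2|X3) is symmetrical in X1, X2 but not in all three
e = plus(I({1}, {2}), I({1}, {2}, {3}))
assert not is_symmetrical(e, 3)
# H(X1) + H(X2) + H(X3) - H(X1,X2,X3) is fully symmetrical
t = {frozenset({1}): 1, frozenset({2}): 1, frozenset({3}): 1,
     frozenset({1, 2, 3}): -1}
assert is_symmetrical(t, 3)
```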

ACKNOWLEDGMENT

The author wishes to acknowledge the help of a few individuals during the preparation of this paper. They include I. Csiszár, B. Hajek, F. Matúš, Y.-O. Yan, E.-h. Yang, and Z. Zhang.

REFERENCES

[1] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley.
[2] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic.
[3] A. P. Dawid, Conditional independence in statistical theory (with discussion), J. Roy. Statist. Soc., Ser. B, vol. 41, pp. 1–31.
[4] T. S. Han, Linear dependence structure of the entropy space, Inform. Contr., vol. 29.
[5] ——, Nonnegative entropy measures of multivariate symmetric correlations, Inform. Contr., vol. 36.
[6] ——, A uniqueness of Shannon's information distance and related nonnegativity problems, J. Combin., Inform. & Syst. Sci., vol. 6, no. 4.
[7] T. Kawabata and R. W. Yeung, The structure of the I-Measure of a Markov chain, IEEE Trans. Inform. Theory, vol. 38, May.
[8] J. L. Massey, Determining the independence of random variables, in 1995 IEEE Int. Symp. on Information Theory (Whistler, BC, Canada, Sept. 1995).
[9] ——, Causal interpretations of random variables, in 1995 IEEE Int. Symp. on Information Theory (Special session in honor of Mark Pinsker on the occasion of his 70th birthday) (Whistler, BC, Canada, Sept. 1995).
[10] F. Matúš, Abstract functional dependency structures, Theor. Comput. Sci., vol. 81.
[11] ——, On equivalence of Markov properties over undirected graphs, J. Appl. Probab., vol. 29.
[12] ——, Ascending and descending conditional independence relations, in Trans. 11th Prague Conf. on Information Theory, Statistical Decision Functions and Random Processes (Academia, Prague, 1992), vol. B.
[13] ——, Probabilistic conditional independence structures and matroid theory: Background, Int. J. General Syst., vol.
22.
[14] ——, Extreme convex set functions with many nonnegative differences, Discr. Math., vol. 135.
[15] ——, Conditional independence among four random variables II, Combin., Prob. Comput., to be published.
[16] ——, Conditional independence structures examined via minors, Ann. Math. Artificial Intell., submitted for publication.
[17] F. Matúš and M. Studený, Conditional independence among four random variables I, Combin., Prob. Comput., to be published.
[18] J. Pearl, Probabilistic Reasoning in Intelligent Systems. San Mateo, CA: Morgan Kaufmann.
[19] G. Strang, Linear Algebra and Its Applications, 2nd ed. New York: Academic.
[20] M. Studený, Attempts at axiomatic description of conditional independence, in Proc. Work. on Uncertainty Processing in Expert Systems, supplement to Kybernetika, vol. 25, nos. 1–3.
[21] ——, Multiinformation and the problem of characterization of conditional independence relations, Probl. Contr. Inform. Theory, vol. 18, pp. 3–16.
[22] ——, Conditional independence relations have no finite complete characterization, in Trans. 11th Prague Conf. on Information Theory, Statistical Decision Functions and Random Processes (Academia, Prague, 1992), vol. B.
[23] ——, Structural semigraphoids, Int. J. Gen. Syst., submitted for publication.
[24] ——, Descriptions of structures of stochastic independence by means of faces and imsets (in three parts), Int. J. Gen. Syst., submitted for publication.
[25] R. W. Yeung, A new outlook on Shannon's information measures, IEEE Trans. Inform. Theory, vol. 37, May.
[26] ——, Multilevel diversity coding with distortion, IEEE Trans. Inform. Theory, vol. 41, Mar.
[27] R. W. Yeung and Y.-O. Yan, ITIP. [Online]. Available WWW: ITIP.
[28] R. W. Yeung and Z. Zhang, Multilevel distributed source coding, in 1997 IEEE Int. Symp. on Information Theory (Ulm, Germany, June 1997).
[29] Z. Zhang and R. W.
Yeung, A non-shannon type conditional information inequality, this issue, pp [30], On the characterization of entropy function via information inequalities, to be published in IEEE Trans. Inform. Theory.


More information

Conditional Independence

Conditional Independence H. Nooitgedagt Conditional Independence Bachelorscriptie, 18 augustus 2008 Scriptiebegeleider: prof.dr. R. Gill Mathematisch Instituut, Universiteit Leiden Preface In this thesis I ll discuss Conditional

More information

Received: 1 September 2018; Accepted: 10 October 2018; Published: 12 October 2018

Received: 1 September 2018; Accepted: 10 October 2018; Published: 12 October 2018 entropy Article Entropy Inequalities for Lattices Peter Harremoës Copenhagen Business College, Nørre Voldgade 34, 1358 Copenhagen K, Denmark; harremoes@ieee.org; Tel.: +45-39-56-41-71 Current address:

More information

The Information Bottleneck Revisited or How to Choose a Good Distortion Measure

The Information Bottleneck Revisited or How to Choose a Good Distortion Measure The Information Bottleneck Revisited or How to Choose a Good Distortion Measure Peter Harremoës Centrum voor Wiskunde en Informatica PO 94079, 1090 GB Amsterdam The Nederlands PHarremoes@cwinl Naftali

More information

(Preprint of paper to appear in Proc Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, Nov , 1990.)

(Preprint of paper to appear in Proc Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, Nov , 1990.) (Preprint of paper to appear in Proc. 1990 Intl. Symp. on Info. Th. and its Applications, Waikiki, Hawaii, ov. 27-30, 1990.) CAUSALITY, FEEDBACK AD DIRECTED IFORMATIO James L. Massey Institute for Signal

More information

Information Theory CHAPTER. 5.1 Introduction. 5.2 Entropy

Information Theory CHAPTER. 5.1 Introduction. 5.2 Entropy Haykin_ch05_pp3.fm Page 207 Monday, November 26, 202 2:44 PM CHAPTER 5 Information Theory 5. Introduction As mentioned in Chapter and reiterated along the way, the purpose of a communication system is

More information

On Cryptographic Properties of the Cosets of R(1;m)

On Cryptographic Properties of the Cosets of R(1;m) 1494 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 47, NO. 4, MAY 2001 On Cryptographic Properties of the Cosets of R(1;m) Anne Canteaut, Claude Carlet, Pascale Charpin, and Caroline Fontaine Abstract

More information

Journal Algebra Discrete Math.

Journal Algebra Discrete Math. Algebra and Discrete Mathematics Number 2. (2005). pp. 20 35 c Journal Algebra and Discrete Mathematics RESEARCH ARTICLE On posets of width two with positive Tits form Vitalij M. Bondarenko, Marina V.

More information

K 4 -free graphs with no odd holes

K 4 -free graphs with no odd holes K 4 -free graphs with no odd holes Maria Chudnovsky 1 Columbia University, New York NY 10027 Neil Robertson 2 Ohio State University, Columbus, Ohio 43210 Paul Seymour 3 Princeton University, Princeton

More information

Index coding with side information

Index coding with side information Index coding with side information Ehsan Ebrahimi Targhi University of Tartu Abstract. The Index Coding problem has attracted a considerable amount of attention in the recent years. The problem is motivated

More information

Foundations of Mathematics MATH 220 FALL 2017 Lecture Notes

Foundations of Mathematics MATH 220 FALL 2017 Lecture Notes Foundations of Mathematics MATH 220 FALL 2017 Lecture Notes These notes form a brief summary of what has been covered during the lectures. All the definitions must be memorized and understood. Statements

More information

IN THIS PAPER, we consider a class of continuous-time recurrent

IN THIS PAPER, we consider a class of continuous-time recurrent IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II: EXPRESS BRIEFS, VOL. 51, NO. 4, APRIL 2004 161 Global Output Convergence of a Class of Continuous-Time Recurrent Neural Networks With Time-Varying Thresholds

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

Example: feasibility. Interpretation as formal proof. Example: linear inequalities and Farkas lemma

Example: feasibility. Interpretation as formal proof. Example: linear inequalities and Farkas lemma 4-1 Algebra and Duality P. Parrilo and S. Lall 2006.06.07.01 4. Algebra and Duality Example: non-convex polynomial optimization Weak duality and duality gap The dual is not intrinsic The cone of valid

More information

Duality in Linear Programming

Duality in Linear Programming Duality in Linear Programming Gary D. Knott Civilized Software Inc. 1219 Heritage Park Circle Silver Spring MD 296 phone:31-962-3711 email:knott@civilized.com URL:www.civilized.com May 1, 213.1 Duality

More information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student

More information

Chapter 1 The Real Numbers

Chapter 1 The Real Numbers Chapter 1 The Real Numbers In a beginning course in calculus, the emphasis is on introducing the techniques of the subject;i.e., differentiation and integration and their applications. An advanced calculus

More information

Longest element of a finite Coxeter group

Longest element of a finite Coxeter group Longest element of a finite Coxeter group September 10, 2015 Here we draw together some well-known properties of the (unique) longest element w in a finite Coxeter group W, with reference to theorems and

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

(Reprint of pp in Proc. 2nd Int. Workshop on Algebraic and Combinatorial coding Theory, Leningrad, Sept , 1990)

(Reprint of pp in Proc. 2nd Int. Workshop on Algebraic and Combinatorial coding Theory, Leningrad, Sept , 1990) (Reprint of pp. 154-159 in Proc. 2nd Int. Workshop on Algebraic and Combinatorial coding Theory, Leningrad, Sept. 16-22, 1990) SYSTEMATICITY AND ROTATIONAL INVARIANCE OF CONVOLUTIONAL CODES OVER RINGS

More information

CHAPTER 2: CONVEX SETS AND CONCAVE FUNCTIONS. W. Erwin Diewert January 31, 2008.

CHAPTER 2: CONVEX SETS AND CONCAVE FUNCTIONS. W. Erwin Diewert January 31, 2008. 1 ECONOMICS 594: LECTURE NOTES CHAPTER 2: CONVEX SETS AND CONCAVE FUNCTIONS W. Erwin Diewert January 31, 2008. 1. Introduction Many economic problems have the following structure: (i) a linear function

More information

THIS paper is aimed at designing efficient decoding algorithms

THIS paper is aimed at designing efficient decoding algorithms IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 7, NOVEMBER 1999 2333 Sort-and-Match Algorithm for Soft-Decision Decoding Ilya Dumer, Member, IEEE Abstract Let a q-ary linear (n; k)-code C be used

More information

Executive Assessment. Executive Assessment Math Review. Section 1.0, Arithmetic, includes the following topics:

Executive Assessment. Executive Assessment Math Review. Section 1.0, Arithmetic, includes the following topics: Executive Assessment Math Review Although the following provides a review of some of the mathematical concepts of arithmetic and algebra, it is not intended to be a textbook. You should use this chapter

More information

The cocycle lattice of binary matroids

The cocycle lattice of binary matroids Published in: Europ. J. Comb. 14 (1993), 241 250. The cocycle lattice of binary matroids László Lovász Eötvös University, Budapest, Hungary, H-1088 Princeton University, Princeton, NJ 08544 Ákos Seress*

More information

Stat 451: Solutions to Assignment #1

Stat 451: Solutions to Assignment #1 Stat 451: Solutions to Assignment #1 2.1) By definition, 2 Ω is the set of all subsets of Ω. Therefore, to show that 2 Ω is a σ-algebra we must show that the conditions of the definition σ-algebra are

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

Boolean degree 1 functions on some classical association schemes

Boolean degree 1 functions on some classical association schemes Boolean degree 1 functions on some classical association schemes Yuval Filmus, Ferdinand Ihringer February 16, 2018 Abstract We investigate Boolean degree 1 functions for several classical association

More information

Diskrete Mathematik und Optimierung

Diskrete Mathematik und Optimierung Diskrete Mathematik und Optimierung Steffen Hitzemann and Winfried Hochstättler: On the Combinatorics of Galois Numbers Technical Report feu-dmo012.08 Contact: steffen.hitzemann@arcor.de, winfried.hochstaettler@fernuni-hagen.de

More information

Critical Reading of Optimization Methods for Logical Inference [1]

Critical Reading of Optimization Methods for Logical Inference [1] Critical Reading of Optimization Methods for Logical Inference [1] Undergraduate Research Internship Department of Management Sciences Fall 2007 Supervisor: Dr. Miguel Anjos UNIVERSITY OF WATERLOO Rajesh

More information

Characterization of Semantics for Argument Systems

Characterization of Semantics for Argument Systems Characterization of Semantics for Argument Systems Philippe Besnard and Sylvie Doutre IRIT Université Paul Sabatier 118, route de Narbonne 31062 Toulouse Cedex 4 France besnard, doutre}@irit.fr Abstract

More information