Cut-Set Bound and Dependence Balance Bound

Lei Xiao (lxiao@nd.edu)

Date: 4 October, 2006

Reading: Elements of Information Theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems [2].

I. THE CUT-SET BOUND

This section follows the discussion in [1, Section 14.10]. Consider $m$ nodes in a memoryless network. Each node $i$ transmits $X^{(i)}$ and receives $Y^{(i)}$. Communication from node $i$ to node $j$ is at rate $R^{(ij)}$. If we partition the network into a set of nodes $S$ and its complement $S^c$, the cut-set bound gives the following constraint on the achievable rates, in the sense that the probability of decoding error goes to zero as the codeword length approaches infinity.

Theorem 1: If the information rates $\{R^{(ij)}\}$ are achievable, then there exists some joint probability distribution $p(x^{(1)}, x^{(2)}, \dots, x^{(m)})$ such that

$$\sum_{i \in S,\ j \in S^c} R^{(ij)} \le I\big(X^{(S)}; Y^{(S^c)} \,\big|\, X^{(S^c)}\big) \qquad (1)$$

for all $S \subseteq \{1, 2, \dots, m\}$.

Thus the total rate of flow of information across a cut is bounded by the conditional mutual information across that cut. It is worth noting that the cut-set bound is usually not achievable, even for simple channels. For instance, the cut-set bound for the multiple access channel takes the same form as the capacity region, but fails to impose the required independence of the input distributions [1, Section 14.10].
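The right-hand side of (1) is a conditional mutual information evaluated under some joint distribution. As a numerical illustration (a sketch of my own, not from the reading; the dict-of-tuples pmf encoding is an arbitrary convention), the following Python computes $I(X; Y \mid Z)$ from a joint pmf and evaluates the bound for a toy one-way cut:

```python
import math

def cond_mutual_info(joint, X, Y, Z):
    """I(X; Y | Z) in bits, from a joint pmf given as a dict mapping
    full outcome tuples to probabilities. X, Y, Z are tuples of
    coordinate indices selecting the variable groups (Z may be empty)."""
    def marg(p, idx):
        out = {}
        for outcome, pr in p.items():
            key = tuple(outcome[i] for i in idx)
            out[key] = out.get(key, 0.0) + pr
        return out

    pxyz = marg(joint, X + Y + Z)
    pxz = marg(joint, X + Z)
    pyz = marg(joint, Y + Z)
    pz = marg(joint, Z)
    total = 0.0
    for outcome, pr in pxyz.items():
        if pr == 0:
            continue
        x = outcome[:len(X)]
        y = outcome[len(X):len(X) + len(Y)]
        z = outcome[len(X) + len(Y):]
        total += pr * math.log2(pr * pz[z] / (pxz[x + z] * pyz[y + z]))
    return total

# Toy cut S = {1}: node 1 sends a uniform bit X through a BSC(0.1)
# to node 2's output Y; the cut-set bound (1) reduces to I(X; Y).
joint = {(x, (x + e) % 2): 0.5 * (0.9 if e == 0 else 0.1)
         for x in (0, 1) for e in (0, 1)}
print(cond_mutual_info(joint, X=(0,), Y=(1,), Z=()))  # ≈ 0.531 bits
```

Here the bound evaluates to $1 - H_b(0.1) \approx 0.531$ bits, the familiar BSC capacity expression, as expected for a single point-to-point cut.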
II. THE DEPENDENCE BALANCE BOUND

We start with the concept of K-information (also known as multiple information), i.e., the mutual information among $K$ random variables, and some of its properties. The dependence balance bound is then derived using the results on K-information.

A. The K-Information

Definition 1: The K-information $I_K$ is defined by

$$I_K(V_1; V_2; \dots; V_K) = \sum_{k=1}^{K} (-1)^{k-1} \sum_{\substack{S \subseteq \{V_1, V_2, \dots, V_K\} \\ |S| = k}} H(S), \qquad (2)$$

where $V_1, V_2, \dots, V_K$ are $K$ random variables (hence the name K-information), $H(S)$ denotes the joint entropy of the random variables in the set $S$, and $|S|$ is the cardinality of the set $S$. The conditional K-information is defined likewise:

$$I_K(V_1; V_2; \dots; V_K \mid V_0) = \sum_{k=1}^{K} (-1)^{k-1} \sum_{\substack{S \subseteq \{V_1, V_2, \dots, V_K\} \\ |S| = k}} H(S \mid V_0). \qquad (3)$$

For the purpose of illustration, a few examples for small $K$ are given as follows:

$$I_1(A) = H(A) \qquad (4a)$$

$$I_2(A; B) = H(A) + H(B) - H(A, B) = I(A; B) \qquad (4b)$$

$$I_3(A; B; C) = H(A) + H(B) + H(C) - H(A, B) - H(A, C) - H(B, C) + H(A, B, C) \qquad (4c)$$

$$\hphantom{I_3(A; B; C)} = I(A; B) + I(C; B) - I(A, C; B) \qquad (4d)$$

Some properties of the K-information:

- K-information is symmetric in its variables. This follows from the symmetry of the entropy function in its arguments.
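As a numerical sanity check on definition (2), the inclusion-exclusion sum can be evaluated directly. The sketch below (my own, not from the reading; the dict-based pmf representation is an arbitrary convention) reproduces $I_2 = I(A;B)$ and exhibits a classic case where $I_3$ is negative:

```python
import itertools
import math

def entropy(joint, idx):
    """Joint entropy H (in bits) of the variables at coordinate indices idx."""
    marg = {}
    for outcome, pr in joint.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + pr
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

def k_information(joint, variables):
    """I_K(V_1; ...; V_K) by the inclusion-exclusion sum in (2);
    `variables` lists the coordinate index of each V_i."""
    total = 0.0
    for k in range(1, len(variables) + 1):
        sign = (-1) ** (k - 1)
        for subset in itertools.combinations(variables, k):
            total += sign * entropy(joint, subset)
    return total

# A, B i.i.d. uniform bits, C = A XOR B: I(A;B) = 0 but I(A;B|C) = 1,
# so I_3(A;B;C) = I(A;B) - I(A;B|C) = -1 bit.
joint = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
print(k_information(joint, (0, 1)))     # → 0.0   (= I(A;B))
print(k_information(joint, (0, 1, 2)))  # → -1.0  (I_3 can be negative)
```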
- In contrast to the mutual information between two random variables, K-information can in general be negative.
- Chaining property:

$$I_K\big((V_1, V_2); V_3; V_4; \dots; V_{K+1} \mid V_0\big) = I_K(V_1; V_3; V_4; \dots; V_{K+1} \mid V_0) + I_K(V_2; V_3; V_4; \dots; V_{K+1} \mid V_0, V_1) \qquad (5)$$

- Recursive relation:

$$I_K(V_1; V_2; \dots; V_K \mid V_0) = I_{K-1}(V_1; V_2; \dots; V_{K-1} \mid V_0) - I_{K-1}(V_1; V_2; \dots; V_{K-1} \mid V_0, V_K) \qquad (6)$$

For the case of $K = 3$, the above recursive relation gives

$$I_3(A; B; C \mid D) = I_2(A; B \mid D) - I_2(A; B \mid C, D). \qquad (7)$$

More discussion of (and references on) K-information can be found in Sunil Srinivasa's tutorial for EE80653 at http://www.nd.edu/~jnl/ee80653/tutorials/sunil.pdf (University of Notre Dame NetID and password required).

B. The Dependence Balance Bound

We consider the constraint on the input distribution of a two-input channel with noiseless feedback to the encoders, as depicted in Fig. 1. More specifically, the following are assumed:

- The two messages $W_1$ and $W_2$ at the inputs of the encoders are statistically independent, uniformly distributed, and of alphabet sizes $M_1$ and $M_2$, respectively.
- The messages are to be transmitted over the channel via $N$ transmissions.
- Each encoder is completely described by $N$ encoding functions, mapping the message ($W_1$ or $W_2$) and the previous channel outputs $Y^{n-1}$ into the next channel input, i.e.,

$$X_{1n} = f_{1n}(W_1, Y^{n-1}) \qquad (8a)$$

$$X_{2n} = f_{2n}(W_2, Y^{n-1}), \qquad n = 1, 2, \dots, N. \qquad (8b)$$
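The encoding maps in (8) can be read as a simulation loop. The sketch below (the channel and the particular encoder rules are hypothetical placeholders of my own, not from [2]) only makes the information pattern explicit: at time $n$, each encoder sees its own message and the common feedback $Y^{n-1}$, never the other encoder's message or inputs directly:

```python
import random

def simulate(N, f1, f2, channel, W1, W2, seed=0):
    """Run N channel uses with the feedback structure of (8):
    each encoder maps (own message, all past outputs) to its next input."""
    rng = random.Random(seed)  # available to stochastic channels
    Y = []  # past channel outputs, fed back noiselessly to both encoders
    for n in range(N):
        x1 = f1(W1, tuple(Y))  # X_{1n} = f_{1n}(W_1, Y^{n-1})
        x2 = f2(W2, tuple(Y))  # X_{2n} = f_{2n}(W_2, Y^{n-1})
        Y.append(channel(x1, x2, rng))
    return Y

# Hypothetical example: noiseless binary adder Y_n = X_{1n} + X_{2n};
# encoders send a message bit first, then react to the feedback.
f1 = lambda w, y: (w >> 0) & 1 if not y else sum(y) % 2
f2 = lambda w, y: (w >> 1) & 1 if not y else y[-1] % 2
out = simulate(4, f1, f2, lambda a, b, rng: a + b, W1=1, W2=2)
print(out)  # → [2, 0, 0, 0]
```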
Fig. 1. Two-input, one-output channel with feedback, as considered in the dependence balance bound: $W_1$ and $W_2$ enter ENC 1 and ENC 2, which produce $X_1^N$ and $X_2^N$; the output $Y^N$ is fed back to both encoders.

Hence we have the Markov relation

$$(W_1, W_2, Y^{n-1}) \to (X_{1n}, X_{2n}) \to Y_n. \qquad (9)$$

- The channel is discrete and memoryless. It takes two inputs $X_1, X_2$ and generates an output $Y$ via the conditional distribution $P(y \mid x_1, x_2)$.

The dependence balance bound states that, for such an encoder pair with feedback, the input distribution must satisfy

$$\sum_{n=1}^{N} \Big( I(X_{1n}; X_{2n} \mid Y^n) - I(X_{1n}; X_{2n} \mid Y^{n-1}) \Big) = -\sum_{n=1}^{N} I_3(X_{1n}; X_{2n}; Y_n \mid Y^{n-1}) \ge 0. \qquad (10)$$

Note that mutual information measures the dependence between two random variables. The term $I(X_{1n}; X_{2n} \mid Y^{n-1})$ can be interpreted as the dependence that is consumed in transmission $n$, i.e., the level of correlation the two separate encoders can exploit to guess each other's input to the channel prior to that transmission; the term $I(X_{1n}; X_{2n} \mid Y^n)$ can then be regarded as the dependence that is produced in that transmission. The dependence balance
bound indicates that each code has to produce at least the amount of dependence it consumes; hence the name dependence balance bound. The dependence balance bound also gives the K-information for $K = 3$ a physical interpretation as the dependence reduction [2].

III. APPLICATION: TWO-WAY CHANNEL

The dependence balance bound was derived in [2] to provide a tight outer bound for the common-output two-way channel, as presented in Fig. 2.

Fig. 2. Single-output two-way configuration: ENC 1 and ENC 2 map $W_1$ and $W_2$ to $X_1$ and $X_2$; both terminals observe the common output $Y$, with DEC 1 producing $\hat{W}_2$ and DEC 2 producing $\hat{W}_1$.

The error probabilities are defined as

$$P_{e1} = \Pr\{\hat{W}_1 \ne W_1\} \qquad (11)$$

$$P_{e2} = \Pr\{\hat{W}_2 \ne W_2\} \qquad (12)$$

It can be shown that [2]

$$\log M_1 \le \sum_{n=1}^{N} I(X_{1n}; Y_n \mid X_{2n}, Y^{n-1}) + N\epsilon_1 \qquad (13)$$

$$\log M_2 \le \sum_{n=1}^{N} I(X_{2n}; Y_n \mid X_{1n}, Y^{n-1}) + N\epsilon_2 \qquad (14)$$
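For a memoryless channel and a candidate single-letter input distribution, the two sides of the dependence balance constraint can be computed directly. The sketch below (the adder-channel example and input distribution are my own, not from [2]) evaluates $I(X_1; X_2 \mid Y) - I(X_1; X_2)$, which by (10), single-letterized with a constant time-sharing variable, must be nonnegative for input distributions induced by feedback codes:

```python
import math

def mi_bits(pxy):
    """I(X; Y) in bits from a joint pmf dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def dependence_balance_gap(p_in, channel):
    """I(X1;X2|Y) - I(X1;X2) for a memoryless channel given as
    channel[(x1, x2)] = {y: prob}; negative means p_in is excluded
    by the (single-letter, constant-T) dependence balance constraint."""
    consumed = mi_bits(p_in)          # dependence consumed: I(X1;X2)
    joint, p_y = {}, {}
    for (x1, x2), p in p_in.items():
        for y, q in channel[(x1, x2)].items():
            joint[(x1, x2, y)] = joint.get((x1, x2, y), 0.0) + p * q
            p_y[y] = p_y.get(y, 0.0) + p * q
    produced = 0.0                    # dependence produced: I(X1;X2|Y)
    for y, py in p_y.items():
        cond = {(x1, x2): pr / py
                for (x1, x2, yy), pr in joint.items() if yy == y}
        produced += py * mi_bits(cond)
    return produced - consumed

# Noiseless binary adder Y = X1 + X2 with positively correlated inputs:
chan = {(a, b): {a + b: 1.0} for a in (0, 1) for b in (0, 1)}
p_in = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
gap = dependence_balance_gap(p_in, chan)
print(gap)  # negative (≈ -0.078): this p(x1, x2) is ruled out
```

For this example the gap is negative, i.e., the code would consume more dependence than it produces, so this correlated input distribution cannot arise from any feedback code on this channel — an illustration of the constraint having bite.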
Note that the two-input channel with feedback must satisfy the dependence balance constraint in (10). Let $S$ be a random variable uniformly distributed over $\{1, \dots, N\}$, and define

$$T = (S, Y^{S-1}) \qquad (15)$$

$$X_1 = X_{1S} \qquad (16)$$

$$X_2 = X_{2S} \qquad (17)$$

$$Y = Y_S \qquad (18)$$

The converse with the dependence balance bound can then be stated as follows.

Theorem 2: For each single-output two-way channel, the capacity region is outer bounded by

$$0 \le R_1 \le I(X_1; Y \mid X_2, T) \qquad (19)$$

$$0 \le R_2 \le I(X_2; Y \mid X_1, T) \qquad (20)$$

for some $p(t, x_1, x_2, y) = p(t, x_1, x_2)\, p(y \mid x_1, x_2)$ satisfying $I(X_1; X_2 \mid T) \le I(X_1; X_2 \mid Y, T)$. By the support lemma, the alphabet size of $T$ can be bounded by three.

IV. COMPARISON OF THE TWO BOUNDS

The cut-set bound and the dependence balance bound differ in a number of aspects. The following is a comparison of the two bounds.

- The cut-set bound is applicable to discrete memoryless networks of any form (for example, with or without feedback, with scalar or vector output channels). The dependence balance bound derived in [2] is only applicable to two-input, single-output channels with noiseless feedback to the two individual encoders. Kramer generalized the dependence balance bound to more than two inputs in [3]; however, a single channel output, fed back to all encoders, remains necessary to apply the dependence balance bound.
- The cut-set bound directly gives an upper bound on the rates, while the dependence balance bound only provides an additional constraint on the possible input distributions, which helps
tighten the outer bound, since there is a smaller region to maximize over. In this sense, the cut-set bound and the dependence balance bound represent constraints on different quantities.
- For the multiple access channel with feedback, the cut-set bound coincides with the capacity region in form [1], but does not restrict the input distribution to a product form. Using the dependence balance bound and a parallel channel, it is shown in [2] that the product-form constraint originally derived in [4] can be recovered by a different approach.

V. RECENT DEVELOPMENTS

Refined and generalized dependence balance bounds were developed and applied to the K-user Gaussian multiple access channel with feedback in [5]. Kramer and Savari studied networks of two-way channels with network coding in [6] and provided a new cut-set bound for such networks.

REFERENCES

[1] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley-Interscience, 1991.
[2] A. P. Hekstra and F. M. J. Willems, "Dependence balance bounds for single-output two-way channels," IEEE Trans. Inform. Theory, vol. 35, pp. 44-53, Jan. 1989.
[3] G. Kramer, "Capacity results for the discrete memoryless network," IEEE Trans. Inform. Theory, vol. 49, pp. 4-21, Jan. 2003.
[4] F. M. J. Willems, "The feedback capacity region of a class of discrete memoryless multiple access channels," IEEE Trans. Inform. Theory, vol. 28, pp. 93-95, Jan. 1982.
[5] G. Kramer and M. Gastpar, "Dependence balance and the Gaussian multiaccess channel with feedback," in Proc. IEEE Information Theory Workshop, Punta del Este, Uruguay, Mar. 2006.
[6] G. Kramer and S. A. Savari, "Cut sets and information flow in networks of two-way channels," in Proc. IEEE International Symposium on Information Theory, Chicago, IL, June 2004.