The Capacity Region of a Class of Discrete Degraded Interference Channels


Nan Liu    Sennur Ulukus
Department of Electrical and Computer Engineering
University of Maryland, College Park, MD 20742
nkancy@umd.edu    ulukus@umd.edu

Abstract: We provide a single-letter characterization for the capacity region of a class of discrete degraded interference channels (DDICs). The class of DDICs considered includes the discrete additive degraded interference channel (DADIC) studied by Benzel [1]. We show that for the class of DDICs studied, encoder cooperation does not increase the capacity region, and therefore, the capacity region of the class of DDICs is the same as the capacity region of the corresponding degraded broadcast channel.

I. INTRODUCTION

In wireless communications, where multiple transmitter-receiver pairs share the same medium, interference is unavoidable. How best to manage interference coming from other users, and how not to cause too much interference to other users while maintaining the quality of communication, is a challenging question of a great deal of practical interest. To better understand the effect of interference on communications, the interference channel (IC) was introduced in [2]. The IC is a simple network consisting of two pairs of transmitters and receivers. Each pair wishes to communicate at a certain rate with negligible probability of error; however, the two communications interfere with each other. To best understand the management of interference, we need to find the capacity region of the IC. However, the problem of finding the capacity region of the IC is essentially open except in some special cases, e.g., a class of deterministic ICs [3], discrete additive degraded interference channels (DADICs) [1], strong ICs [4], [5], and ICs with statistically equivalent outputs [6]-[8]. In this paper, we consider a class of discrete degraded interference channels (DDICs).
In a DDIC, only the bad receiver faces interference, while the good receiver has the ability to decode both messages and thus behaves like the receiver of a multiple access channel. It is this fact that makes the DDIC easier to analyze compared to the general IC, where both receivers face interference. We provide a single-letter characterization for the capacity region of a class of DDICs. The class of DDICs includes the DADICs studied by Benzel [1]. We show that for the class of DDICs studied, encoder cooperation does not increase the capacity region, and therefore, the capacity region of the class of DDICs is the same as the capacity region of the corresponding degraded broadcast channel, which is known. (This work was presented at Allerton 2006, and submitted to the IEEE Transactions on Information Theory. This work was supported by NSF Grants CCR 03-11311, CCF 04-47613 and CCF 05-14846.)

II. SYSTEM MODEL

A discrete memoryless IC consists of two transmitters and two receivers. Transmitter 1 has message W₁ to send to receiver 1, and transmitter 2 has message W₂ to send to receiver 2. Messages W₁ and W₂ are independent. The channel consists of two input alphabets, 𝒳₁ and 𝒳₂, and two output alphabets, 𝒴₁ and 𝒴₂. The channel transition probability is p(y₁, y₂ | x₁, x₂). In this paper, our definition of degradedness is in the stochastic sense, i.e., we say that an IC is a DDIC if there exists a probability distribution p′(y₂ | y₁) such that

p(y₂ | x₁, x₂) = Σ_{y₁ ∈ 𝒴₁} p(y₁ | x₁, x₂) p′(y₂ | y₁)    (1)

for all x₁ ∈ 𝒳₁, x₂ ∈ 𝒳₂ and y₂ ∈ 𝒴₂. However, we note that for any DDIC, we can form another DDIC (physically degraded) by

p(y₁, y₂ | x₁, x₂) = p(y₁ | x₁, x₂) p′(y₂ | y₁)    (2)

which has the same marginals, p(y₁ | x₁, x₂) and p(y₂ | x₁, x₂), as the original DDIC. Since the receivers do not cooperate in an IC, similar to the case of the broadcast channel [9], the capacity region is only a function of the marginals p(y₁ | x₁, x₂) and p(y₂ | x₁, x₂), and the rate pairs in the capacity region can be achieved by the same achievability scheme for different ICs with the same marginals.
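The stochastic-degradedness composition in (1) is simply a matrix product of the degrading channel with the transition probabilities of the first receiver. A minimal numerical sketch, using numpy and a hypothetical binary-alphabet DDIC (the probability values below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical DDIC with |X1| = |X2| = |Y1| = |Y2| = 2 (illustrative values).
# p_y1[(x1, x2)] is the distribution p(y1 | x1, x2).
p_y1 = {(0, 0): np.array([0.9, 0.1]), (0, 1): np.array([0.1, 0.9]),
        (1, 0): np.array([0.1, 0.9]), (1, 1): np.array([0.9, 0.1])}

# Degrading channel p'(y2 | y1) of (1): one column per y1,
# each column a distribution on Y2.
T = np.array([[0.8, 0.2],
              [0.2, 0.8]])

# Marginal p(y2 | x1, x2) obtained by composing with T, as in (1).
p_y2 = {k: T @ v for k, v in p_y1.items()}

# Each marginal is still a valid probability distribution.
for v in p_y2.values():
    assert abs(v.sum() - 1.0) < 1e-12
```

Any DDIC sharing these marginals has the same capacity region, which is why the physically degraded form (2) can be assumed without loss of generality.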
Hence, the capacity results that we obtain for DDICs which satisfy (2) will be valid for any DDIC that has the same marginals, p(y₁ | x₁, x₂) and p(y₂ | x₁, x₂). Thus, without loss of generality, from now on, we may restrict ourselves to studying DDICs that satisfy (2). A DDIC is then characterized by two transition probabilities, p′(y₂ | y₁) and p(y₁ | x₁, x₂). For notational convenience, let T denote the |𝒴₂| × |𝒴₁| matrix of transition probabilities p′(y₂ | y₁), and T_{x₂} denote the |𝒴₁| × |𝒳₁| matrix of transition probabilities p(y₁ | x₁, x₂), for each x₂ ∈ 𝒳₂. Throughout the paper, Δₙ will denote the probability simplex

Δₙ = { (p₁, p₂, …, pₙ) : Σᵢ₌₁ⁿ pᵢ = 1, pᵢ ≥ 0, i = 1, 2, …, n }    (3)

and Jₙ will denote the representation of the symmetric group of permutations of n objects by the n × n permutation matrices. The class of DDICs we consider in this paper satisfies the following conditions:

1) T is input symmetric. Let the input symmetry group be 𝒢.

2) For any x₂, x̃₂ ∈ 𝒳₂, there exists a permutation matrix G ∈ 𝒢, such that

T_{x₂} = G T_{x̃₂}    (4)

3) H(Y₁ | X₁ = x₁, X₂ = x₂) = η, independent of x₁, x₂.

4) p(y₁ | x₁, x₂) satisfies

Σ_{x₂} p(y₁ | x₁, x₂) = |𝒳₂| / |𝒴₁|,  x₁ ∈ 𝒳₁, y₁ ∈ 𝒴₁    (5)

5) Let p_{x₁,x₂} be the |𝒴₁|-dimensional vector of probabilities p(y₁ | x₁, x₂) for a given (x₁, x₂). Then, there exists an x̃₂ ∈ 𝒳₂, such that

{ Σ_{x₁,x₂} a_{x₁,x₂} p_{x₁,x₂} : Σ_{x₁,x₂} a_{x₁,x₂} = 1, a_{x₁,x₂} ≥ 0 } ⊆ ∪_{G ∈ 𝒢} { G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ_{x₁} b_{x₁} = 1, b_{x₁} ≥ 0 }    (6)

The definition of an input symmetric channel is given in [10, Section II.D]. For completeness, we repeat it here. For an m × n stochastic matrix T (an n-input, m-output channel), the input symmetry group 𝒢 is defined as

𝒢 = { G ∈ Jₙ : ∃ Π ∈ Jₘ, T G = Π T }    (7)

i.e., 𝒢 is the set of permutation matrices G such that the column permutation of T with G may be achieved by a corresponding row permutation. T is input symmetric if 𝒢 is transitive, i.e., any element of {1, 2, …, n} can be mapped to every other element of {1, 2, …, n} by some member of 𝒢. 𝒢 being a transitive subgroup means that the output entropy of channel T is maximized when the input distribution is chosen to be the uniform distribution, i.e.,

max_{p ∈ Δₙ} H(T p) = H(T u)    (8)

where u denotes the uniform distribution in Δₙ. This is because, for any p ∈ Δₙ, if we let q = (1/|𝒢|) Σ_{G ∈ 𝒢} G p, then we have

H(T q) = H( T (1/|𝒢|) Σ_{G ∈ 𝒢} G p )    (9)
       = H( (1/|𝒢|) Σ_{G ∈ 𝒢} Π_G T p )    (10)
       ≥ (1/|𝒢|) Σ_{G ∈ 𝒢} H(Π_G T p)    (11)
       = H(T p)    (12)

where (10) follows from the fact that G ∈ 𝒢, and (11) follows from the concavity of the entropy function. Note that for any G ∈ 𝒢,

G q = q    (13)

by the fact that 𝒢 is a group. Since 𝒢 is also transitive, q = u. Condition 2 implies that for any p(x₁), H(Y₁ | X₂ = x₂) does not depend on x₂. Combined with condition 1, condition 2 further implies that H(Y₂ | X₂ = x₂) does not depend on x₂ either. These two facts will be proved and utilized in other proofs later.
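For small alphabets, the input symmetry group (7) can be found by brute force, and the entropy-maximization property (8) checked numerically. A sketch assuming numpy; the binary symmetric channel used here is an illustrative choice, not one of the paper's examples:

```python
import itertools
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def input_symmetry_group(T, tol=1e-9):
    """All n-by-n permutation matrices G with TG = Pi T for some row
    permutation Pi, i.e., the group of eq. (7). Brute force: only
    suitable for small alphabets."""
    m, n = T.shape
    eye_n, eye_m = np.eye(n), np.eye(m)
    members = []
    for cols in itertools.permutations(range(n)):
        G = eye_n[:, list(cols)]
        TG = T @ G
        # does some row permutation Pi of T reproduce TG?
        if any(np.allclose(eye_m[list(rows), :] @ T, TG, atol=tol)
               for rows in itertools.permutations(range(m))):
            members.append(G)
    return members

# Binary symmetric channel: input symmetric, so by (8) the uniform
# input maximizes the output entropy.
T = np.array([[0.9, 0.1],
              [0.1, 0.9]])
group = input_symmetry_group(T)
assert len(group) == 2   # identity and the swap: transitive on {0, 1}

u = np.array([0.5, 0.5])
best = max(entropy(T @ np.array([q, 1 - q])) for q in np.linspace(0, 1, 101))
assert entropy(T @ u) >= best - 1e-9   # uniform input is a maximizer
```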
A sufficient condition for condition 3 to hold is that the vectors p(y₁ | X₁ = x₁, X₂ = x₂), for all (x₁, x₂) ∈ 𝒳₁ × 𝒳₂, are permutations of each other. This is true, for instance, when the channel producing Y₁ is additive [1]. By condition 4, we can show that when X₂ takes the uniform distribution, Y₁ will also be uniformly distributed. Combined with condition 1, condition 4 implies that when X₂ takes the uniform distribution, H(Y₂) is maximized, irrespective of p(x₁). In condition 5, the first line of (6) denotes the set of all convex combinations of the vectors p_{x₁,x₂} for all (x₁, x₂) ∈ 𝒳₁ × 𝒳₂, while the second line denotes all convex combinations, and their permutations with G ∈ 𝒢, of the vectors p_{x₁,x̃₂} for all x₁ ∈ 𝒳₁, but for a fixed x̃₂ ∈ 𝒳₂. Therefore, this condition means that all convex combinations of p_{x₁,x₂} may be obtained by a combination of convex combinations of p_{x₁,x̃₂} for a fixed x̃₂, and permutations in 𝒢. The DADICs considered in [1] satisfy conditions 1-5, as we will show in Section VI-A. The aim of this paper is to provide a single-letter characterization for the capacity region of DDICs that satisfy conditions 1-5, and we will follow the proof technique of [1] with appropriate generalizations.

III. THE OUTER BOUND (CONVERSE)

When we assume that the encoders are able to fully cooperate, i.e., both encoders know both messages W₁ and W₂, we get a corresponding degraded broadcast channel with input x = (x₁, x₂). The capacity region of the corresponding degraded broadcast channel serves as an outer bound on the capacity region of the DDIC. The capacity region of the degraded broadcast channel is known [9], [11], [12], and thus, a single-letter outer bound on the capacity region of the DDIC is

co ∪_{p(u), p(x₁,x₂|u)} { (R₁, R₂) : R₁ ≤ I(X₁, X₂; Y₁ | U), R₂ ≤ I(U; Y₂) }    (14)

where co denotes the closure of the convex hull operation, and the auxiliary random variable U, which satisfies the Markov chain U → (X₁, X₂) → Y₁ → Y₂, has cardinality

bounded by |𝒰| ≤ min(|𝒴₁|, |𝒴₂|, |𝒳₁||𝒳₂|). More specifically, for DDICs that satisfy condition 3, (14) can be written as

co ∪_{p(u), p(x₁,x₂|u)} { (R₁, R₂) : R₁ ≤ H(Y₁ | U) − η, R₂ ≤ I(U; Y₂) }    (15)

Let us define T(c) as

T(c) = max I(U; Y₂)    (16)

where the maximization is over p(u) p(x₁, x₂ | u) such that H(Y₁ | U) = c and |𝒰| ≤ min(|𝒴₁|, |𝒴₂|, |𝒳₁||𝒳₂|), and the entropies are calculated according to the distribution

p(u, x₁, x₂, y₁, y₂) = p(u) p(x₁, x₂ | u) p(y₁ | x₁, x₂) p′(y₂ | y₁)    (17)

Using condition 3, we can show that η ≤ c ≤ log |𝒴₁|. T(c) is concave in c [10], [13], and therefore, (15) can also be written as

∪_{η ≤ c ≤ log |𝒴₁|} { (R₁, R₂) : R₁ ≤ c − η, R₂ ≤ T(c) }    (18)

IV. AN ACHIEVABLE REGION

Based on [7, Theorem 4], the following region is achievable:

co ∪_{p(x₁), p(x₂)} { (R₁, R₂) : R₁ ≤ I(X₁; Y₁ | X₂), R₂ ≤ I(X₂; Y₂) }    (19)

which corresponds to the achievability scheme in which the bad receiver treats the signal intended for the good receiver as pure noise, and the good receiver decodes both messages as if it were the receiver of a multiple access channel. For DDICs that satisfy condition 3, (19) reduces to

co ∪_{p(x₁), p(x₂)} { (R₁, R₂) : R₁ ≤ H(Y₁ | X₂) − η, R₂ ≤ H(Y₂) − H(Y₂ | X₂) }    (20)

We note that (20) remains an achievable region if we choose p(x₂) to be the uniform distribution. Furthermore, by choosing p(x₂) as the uniform distribution, we have

p(y₁) = (1/|𝒳₂|) Σ_{x₁,x₂} p(y₁ | x₁, x₂) p(x₁)    (21)
      = (1/|𝒳₂|) Σ_{x₁} p(x₁) Σ_{x₂} p(y₁ | x₁, x₂)    (22)
      = 1/|𝒴₁|    (23)

where (23) uses condition 4. Thus, when p(x₂) is chosen as the uniform distribution, p(y₁) is the uniform distribution as well. Let us define τ as

τ = max_{p ∈ Δ_{|𝒴₁|}} H(T p)    (24)

Using the fact that the DDIC under consideration satisfies condition 1, i.e., it satisfies (8), we have that when p(x₂) is uniform, and consequently p(y₁) is uniform,

H(Y₂) = τ    (25)

Hence, choosing p(x₂) to be the uniform distribution in (20) yields the following achievable region:

co ∪_{p(x₁)} { (R₁, R₂) : R₁ ≤ (1/|𝒳₂|) Σ_{x₂} H(Y₁ | X₂ = x₂) − η, R₂ ≤ τ − (1/|𝒳₂|) Σ_{x₂} H(Y₂ | X₂ = x₂) }    (26)

Due to condition 2, for any p(x₁) = p and any x₂, x̃₂ ∈ 𝒳₂, there exists a permutation matrix G ∈ 𝒢 such that

H(Y₁ | X₂ = x₂) = H(T_{x₂} p)    (27)
                = H(G T_{x̃₂} p)    (28)
                = H(T_{x̃₂} p)    (29)
                = H(Y₁ | X₂ = x̃₂)    (30)

which means that for any p(x₁), H(Y₁ | X₂ = x₂) does not depend on x₂.
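The chain (21)-(23) can be verified numerically for an additive channel of the DADIC type: with p(x₂) uniform, p(y₁) comes out uniform regardless of p(x₁). A sketch assuming numpy, with an illustrative ternary noise distribution:

```python
import numpy as np

# Additive channel p(y1 | x1, x2) = p1[(y1 - x1 - x2) mod s], as in
# Section VI-A; p1 and p(x1) below are illustrative assumptions.
s = 3
p1 = np.array([0.7, 0.2, 0.1])     # noise distribution
p_x1 = np.array([0.5, 0.3, 0.2])   # arbitrary input distribution
p_x2 = np.full(s, 1 / s)           # uniform, as in (21)-(23)

p_y1 = np.zeros(s)
for x1 in range(s):
    for x2 in range(s):
        for y1 in range(s):
            p_y1[y1] += p_x1[x1] * p_x2[x2] * p1[(y1 - x1 - x2) % s]

# Condition 4 forces Y1 to be uniform, cf. (23).
assert np.allclose(p_y1, 1 / s)
```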
Furthermore, for any p(x₁) = p and any x₂, x̃₂ ∈ 𝒳₂, there exist permutation matrices G ∈ 𝒢 and Π, of order |𝒴₁| and |𝒴₂| respectively, such that

H(Y₂ | X₂ = x₂) = H(T T_{x₂} p)    (31)
                = H(T G T_{x̃₂} p)    (32)
                = H(Π T T_{x̃₂} p)    (33)
                = H(T T_{x̃₂} p)    (34)
                = H(Y₂ | X₂ = x̃₂)    (35)

where (33) follows from the fact that G ∈ 𝒢. (35) means that for any p(x₁), H(Y₂ | X₂ = x₂) does not depend on x₂ either. Hence, the achievable region in (26) can further be written as

co ∪_{p(x₁)} { (R₁, R₂) : R₁ ≤ H(Y₁ | X₂ = x₂) − η, R₂ ≤ τ − H(Y₂ | X₂ = x₂) }    (36)

for any x₂ ∈ 𝒳₂. Since we will use condition 5 later, we choose to write the region of (36) as

co ∪_{p(x₁)} { (R₁, R₂) : R₁ ≤ H(Y₁ | X₂ = x̃₂) − η, R₂ ≤ τ − H(Y₂ | X₂ = x̃₂) }    (37)

where x̃₂ is given in condition 5. Let us define F(c) as

F(c) = min H(Y₂ | X₂ = x̃₂)    (38)

where the minimization is over p(x₁) such that H(Y₁ | X₂ = x̃₂) = c, and the entropies are calculated according to the distribution

p(y₁, y₂, x₁ | x̃₂) = p(x₁) p(y₁ | x₁, x̃₂) p′(y₂ | y₁)    (39)

In (38), we can write min instead of inf by the same reasoning as in [14, Section I]. Note that F(c) is not a function of x̃₂ because of (30) and (35). Again, by condition 3, we can show that η ≤ c ≤ log |𝒴₁|. Hence, the achievable region in (37) can be written as

∪_{η ≤ c ≤ log |𝒴₁|} { (R₁, R₂) : R₁ ≤ c − η, R₂ ≤ τ − F(c) }    (40)

which, by [10, Facts 4 and 5], can further be written as

∪_{η ≤ c ≤ log |𝒴₁|} { (R₁, R₂) : R₁ ≤ c − η, R₂ ≤ τ − envF(c) }    (41)

where envF(·) denotes the lower convex envelope of the function F(·).

V. THE CAPACITY REGION

In this section, we show that the achievable region in (41) contains the outer bound in (18), and thus (18) and (41) are both, in fact, single-letter characterizations of the capacity region of DDICs satisfying conditions 1-5. To show this, it suffices to prove that

T(c) ≤ τ − envF(c),  η ≤ c ≤ log |𝒴₁|    (42)

Let us fix a c ∈ [η, log |𝒴₁|]. Let p*(u), p*(x₁, x₂ | u) be the distributions that achieve the maximum in (16), i.e.,

H(Y₁ | U) = c    (43)
I(U; Y₂) = T(c)    (44)

Using condition 5, for each u ∈ 𝒰, there exists a p_u(x₁) = p_u and a permutation matrix G_u ∈ 𝒢, such that

Σ_{x₁,x₂} p*(x₁, x₂ | U = u) p_{x₁,x₂} = G_u T_{x̃₂} p_u    (45)

Thus, we have

H(Y₁ | U = u) = H(G_u T_{x̃₂} p_u) = H(T_{x̃₂} p_u)    (46)

(46) means that p_u is in the feasible set of the optimization in (38) when c = H(Y₁ | U = u). Hence,

F(H(Y₁ | U = u)) ≤ H(T T_{x̃₂} p_u)    (47)

We have

H(Y₂ | U = u) = H(T G_u T_{x̃₂} p_u)    (48)
              = H(Π_u T T_{x̃₂} p_u)    (49)
              = H(T T_{x̃₂} p_u)    (50)
              ≥ F(H(Y₁ | U = u))    (51)

where (48), (49) and (51) follow from (45), the fact that G_u ∈ 𝒢, and (47), respectively. Thus,

H(Y₂ | U) = Σ_u P(U = u) H(Y₂ | U = u)    (52)
          ≥ Σ_u P(U = u) F(H(Y₁ | U = u))    (53)
          ≥ Σ_u P(U = u) envF(H(Y₁ | U = u))    (54)
          ≥ envF( Σ_u P(U = u) H(Y₁ | U = u) )    (55)
          = envF(H(Y₁ | U))    (56)
          = envF(c)    (57)

where (53) follows from (51), (54) follows from the definition of env, and (55) follows from the convexity of envF(·).
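The function F(c) of (38) can be traced numerically for a binary X₁ by sweeping p(x₁) and recording the pair (H(Y₁ | X₂ = x̃₂), H(Y₂ | X₂ = x̃₂)); the lower boundary of the resulting point cloud is F, and its lower convex envelope gives envF. A sketch assuming numpy; the matrices T_{x̃₂} and T below are illustrative binary channels, not taken from the paper:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

T_x2t = np.array([[0.9, 0.2],    # columns: p(y1 | x1, x2t), assumed
                  [0.1, 0.8]])
T = np.array([[0.75, 0.25],      # degrading channel p'(y2 | y1), assumed
              [0.25, 0.75]])

# Sweep p(x1) and record (H(Y1 | X2 = x2t), H(Y2 | X2 = x2t)).
pts = []
for q in np.linspace(0.0, 1.0, 201):
    p = np.array([q, 1.0 - q])
    pts.append((entropy(T_x2t @ p), entropy(T @ T_x2t @ p)))

# F(c) is the minimum second coordinate among points whose first
# coordinate is c; env F is its lower convex envelope over [eta, log|Y1|].
c_vals, h2_vals = zip(*pts)
assert max(c_vals) <= 1.0 + 1e-9 and max(h2_vals) <= 1.0 + 1e-9
```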
Finally, for η ≤ c ≤ log |𝒴₁|, we have

T(c) = I(U; Y₂)    (58)
     = H(Y₂) − H(Y₂ | U)    (59)
     ≤ τ − envF(c)    (60)

where (60) follows from (57) and the definition of τ in (24). Therefore, we conclude that the single-letter characterization of the capacity region of DDICs satisfying conditions 1-5 is (41), and also (18). To achieve a point (R₁, R₂) on the boundary of the capacity region, if R₁ and R₂ are such that

R₁ = c − η,  R₂ = τ − F(c)    (61)

for some η ≤ c ≤ log |𝒴₁|, transmitters 1 and 2 generate random codebooks according to p*(x₁), which is the minimizer of F(R₁ + η), and p*(x₂), which is the uniform distribution, respectively, and transmit the codewords corresponding to the realizations of their own messages. Receiver 1 performs successive decoding, in the order of message 2 and then message 1. Receiver 2 decodes its own message treating the interference from transmitter 1 as pure noise. To achieve a point (R₁, R₂) on the capacity region where R₁ and R₂ do not satisfy (61), time-sharing should be used. Furthermore, we note that for these DDICs, encoder cooperation cannot increase the capacity region.
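For the erasure example of Section VI-B, the boundary parametrization (61) can be swept directly: each choice of p(x₁) yields a candidate boundary point (H(Y₁ | X₂ = x̃₂) − η, τ − H(Y₂ | X₂ = x̃₂)). A sketch assuming numpy, with illustrative values p = 0.1 and α = 0.3:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

p_noise, alpha = 0.1, 0.3                  # illustrative assumptions
eta = entropy([p_noise, 1 - p_noise])      # condition 3: H(Y1 | x1, x2)
T = np.array([[1 - alpha, 0.0],            # erasure channel, cf. (81)
              [alpha, alpha],
              [0.0, 1 - alpha]])
tau = entropy(T @ np.array([0.5, 0.5]))    # (24)-(25): uniform Y1

# Sweep p(x1) = (q, 1 - q); with x2t = 0, Y1 = X1 xor V.
boundary = []
for q in np.linspace(0.0, 1.0, 101):
    p_y1 = np.array([q * (1 - p_noise) + (1 - q) * p_noise,
                     q * p_noise + (1 - q) * (1 - p_noise)])
    boundary.append((entropy(p_y1) - eta, tau - entropy(T @ p_y1)))

r1, r2 = zip(*boundary)
# R1 peaks at 1 - eta (uniform X1), where R2 drops to 0.
assert abs(max(r1) - (1 - eta)) < 1e-9 and min(r2) > -1e-9
```

The upper-right boundary of this point cloud, after the convex-envelope step in (41) (i.e., time-sharing), gives the capacity region boundary under the stated assumptions.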

VI. EXAMPLES

In this section, we provide three examples of DDICs for which conditions 1-5 are satisfied. The first example is the channel model adopted in [1], for which the capacity region is already known. In the second and third examples, the capacity regions were previously unknown, and using the results of this paper, we are able to determine them.

A. Example 1

A DADIC is defined as [1]

Y₁ = X₁ ⊕ X₂ ⊕ V₁    (62)
Y₂ = X₁ ⊕ X₂ ⊕ V₁ ⊕ V₂    (63)

where

𝒳₁ = 𝒳₂ = 𝒴₁ = 𝒴₂ = S = {0, 1, …, s − 1}    (64)

and ⊕ denotes the modulo-s sum, and V₁ and V₂ are independent noise random variables defined over S with distributions

pᵢ = (pᵢ(0), pᵢ(1), …, pᵢ(s − 1)),  i = 1, 2    (65)

Since Y₂ = Y₁ ⊕ V₂, the matrix T is circulant, and thus input symmetric [10, Section II.D]. Hence, condition 1 is satisfied. It is straightforward to check that conditions 2-5 are also satisfied. For example, when s = 3, we have

T = [ p₂(0) p₂(2) p₂(1)
      p₂(1) p₂(0) p₂(2)
      p₂(2) p₂(1) p₂(0) ]    (66)

and the input symmetry group for T is

𝒢 = { G₀ = [1 0 0; 0 1 0; 0 0 1], G₁ = [0 0 1; 1 0 0; 0 1 0], G₂ = [0 1 0; 0 0 1; 1 0 0] }    (67)

which is transitive (e.g., input 1 can be mapped to input 2 by G₁ and to input 3 by G₂). From (62), we write

T₀ = [ p₁(0) p₁(2) p₁(1); p₁(1) p₁(0) p₁(2); p₁(2) p₁(1) p₁(0) ]    (68)
T₁ = [ p₁(2) p₁(1) p₁(0); p₁(0) p₁(2) p₁(1); p₁(1) p₁(0) p₁(2) ]    (69)
T₂ = [ p₁(1) p₁(0) p₁(2); p₁(2) p₁(1) p₁(0); p₁(0) p₁(2) p₁(1) ]    (70)

Conditions 2-4 are satisfied because

T₁ = G₁ T₀,  T₂ = G₂ T₀    (71)
η = H(V₁)    (72)
Σ_{x₂} p(y₁ | x₁, x₂) = p₁(0) + p₁(1) + p₁(2) = 1    (73)

Next, we check condition 5. We have

{ Σ_{x₁,x₂} a_{x₁,x₂} p_{x₁,x₂} : Σ a_{x₁,x₂} = 1, a_{x₁,x₂} ≥ 0 }    (74)
= { a (p₁(0), p₁(1), p₁(2))ᵀ + b (p₁(2), p₁(0), p₁(1))ᵀ + c (p₁(1), p₁(2), p₁(0))ᵀ : a + b + c = 1, a, b, c ≥ 0 }    (75)

because even though (74) is a convex combination of 9 vectors, due to vectors repeating themselves in the columns of T₀, T₁ and T₂, the set in fact consists of convex combinations of only 3 vectors. On the other hand, for x̃₂ = 0,

{ G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0, G = G₀ }    (76)
= { a (p₁(0), p₁(1), p₁(2))ᵀ + b (p₁(2), p₁(0), p₁(1))ᵀ + c (p₁(1), p₁(2), p₁(0))ᵀ : a + b + c = 1, a, b, c ≥ 0 }    (77)

because (76) is the set of convex combinations of the columns of T₀, with the identity permutation. Thus,

{ Σ_{x₁,x₂} a_{x₁,x₂} p_{x₁,x₂} : Σ a_{x₁,x₂} = 1, a_{x₁,x₂} ≥ 0 }    (78)
= { G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0, G = G₀ }
⊆ ∪_{G ∈ 𝒢} { G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0 }    (79)

and condition 5 is satisfied.

B. Example 2

Next, we consider the following DDIC.
We have 𝒳₁ = 𝒳₂ = 𝒴₁ = {0, 1}, 𝒴₂ = {0, e, 1}, and p(y₁ | x₁, x₂) is characterized by

Y₁ = X₁ ⊕ X₂ ⊕ V    (80)

where V is Bernoulli with parameter p. The channel p′(y₂ | y₁) is an erasure channel with erasure probability α, i.e., the transition probability matrix is

T = [ 1 − α    0
        α      α
        0    1 − α ]    (81)

Thus, the channel is such that the bad receiver cannot receive all the bits that the good receiver receives. More specifically, an α proportion of the time, whether the bit is a 0 or a 1 is unrecognizable, and is thus denoted as an erasure e.
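Conditions 1 and 4 for this example can be sanity-checked numerically. A sketch assuming numpy, with illustrative values p = 0.2 and α = 0.25:

```python
import numpy as np

# Sanity checks on Example 2 (p and alpha are illustrative assumptions).
p, alpha = 0.2, 0.25
T = np.array([[1 - alpha, 0.0],
              [alpha, alpha],
              [0.0, 1 - alpha]])          # erasure channel of (81)
assert np.allclose(T.sum(axis=0), 1.0)    # columns are distributions on Y2

# p(y1 | x1, x2) with Y1 = X1 xor X2 xor V, V ~ Bernoulli(p):
def p_y1(x1, x2):
    return np.array([1 - p, p] if (x1 ^ x2) == 0 else [p, 1 - p])

# Condition 4: the sum over x2 equals |X2| / |Y1| = 1 for every (x1, y1).
for x1 in (0, 1):
    assert np.allclose(p_y1(x1, 0) + p_y1(x1, 1), 1.0)
```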

It is easy to see that T is input symmetric because the input symmetry group

𝒢 = { [1 0; 0 1], [0 1; 1 0] }    (82)

is transitive. Conditions 2-5 are satisfied because p(y₁ | x₁, x₂) is of the same form as in Example 1 of Section VI-A.

C. Example 3

Let a, b, c, d, e, f be non-negative numbers such that a + b + c = 1 and d + e + f = 1/2. We have |𝒳₁| = 4, |𝒳₂| = |𝒴₁| = 3, and |𝒴₂| = 6. The DDIC is described by

T = [ d e f
      e f d
      d f e
      f e d
      e d f
      f d e ]    (83)

together with T₀, T₁ and T₂, the 3 × 4 matrices of p(y₁ | x₁, x₂) for x₂ = 0, 1, 2, each of whose columns is a permutation of (a, b, c)ᵀ ((84)-(86)). It is straightforward to see that T is input symmetric because the input symmetry group

𝒢 = { G₀, G₁, G₂, G₃, G₄, G₅ }    (87)

consisting of all six 3 × 3 permutation matrices, is transitive. Conditions 2-4 are satisfied because

T₁ = G₁ T₀,  T₂ = G₂ T₀    (88)
η = −a log a − b log b − c log c    (89)
Σ_{x₂} p(y₁ | x₁, x₂) = a + b + c = 1    (90)

To show condition 5, we use Figure 1. The set on the first line of (6) in condition 5 is the set of convex combinations of the following six points,

(a, b, c)ᵀ, (a, c, b)ᵀ, (c, a, b)ᵀ, (b, a, c)ᵀ, (b, c, a)ᵀ, (c, b, a)ᵀ    (91)

Fig. 1. Explanation of condition 5 in Example 3.

resulting in all the points within the hexagon in Figure 1, whose vertices are labeled by the orderings abc, acb, cab, bac, bca, cba. The three sets

{ G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0, G = G₀ }    (92)
{ G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0, G = G₁ }    (93)
{ G Σ_{x₁} b_{x₁} p_{x₁,x̃₂} : Σ b_{x₁} = 1, b_{x₁} ≥ 0, G = G₂ }    (94)

correspond to the points in the three shaded areas [abc, cba, bca, cab], [acb, abc, bca, cab], and [bac, cab, abc, bca], respectively. Since the three shaded areas cover the entire hexagon, and G₀, G₁, G₂ ∈ 𝒢, condition 5 is satisfied.
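The input symmetry of the T in (83) can be confirmed by brute force: every column permutation admits a matching row permutation, as required by (7). A sketch assuming numpy, with illustrative values d = 0.25, e = 0.15, f = 0.10 (so d + e + f = 1/2):

```python
import itertools
import numpy as np

# The 6-by-3 matrix T of (83); d, e, f are illustrative assumptions.
d, e, f = 0.25, 0.15, 0.10
T = np.array([[d, e, f],
              [e, f, d],
              [d, f, e],
              [f, e, d],
              [e, d, f],
              [f, d, e]])
assert np.allclose(T.sum(axis=0), 1.0)    # columns are distributions on Y2

# For every column permutation of the 3 inputs, look for a row
# permutation Pi with TG = Pi T, as in (7); transitivity follows
# because all 6 permutations qualify.
count = 0
for cols in itertools.permutations(range(3)):
    TG = T[:, list(cols)]
    if any(np.allclose(T[list(rows), :], TG)
           for rows in itertools.permutations(range(6))):
        count += 1
assert count == 6   # the full symmetric group, hence transitive
```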

VII. CONCLUSION

We provide a single-letter characterization for the capacity region of a class of DDICs, which is more general than the class of DADICs studied by Benzel [1]. We show that for the class of DDICs studied, encoder cooperation does not increase the capacity region, and the best way to manage the interference is through random codebook design and treating the signal intended for the good receiver as pure noise at the bad receiver.

REFERENCES

[1] R. Benzel, "The capacity region of a class of discrete additive degraded interference channels," IEEE Trans. on Information Theory, vol. 25, no. 2, pp. 228-231, March 1979.
[2] C. E. Shannon, "Two-way communication channels," Proc. 4th Berkeley Symp. Math. Stat. Prob., pp. 611-644, 1961.
[3] A. El Gamal and M. Zahedi, "The capacity region of a class of deterministic interference channels," IEEE Trans. on Information Theory, vol. 28, no. 2, pp. 343-346, March 1982.
[4] H. Sato, "The capacity of the Gaussian interference channel under strong interference," IEEE Trans. on Information Theory, vol. 27, pp. 786-788, November 1981.
[5] M. Costa and A. El Gamal, "The capacity region of the discrete memoryless interference channel with strong interference," IEEE Trans. on Information Theory, vol. 33, no. 5, pp. 710-711, September 1987.
[6] A. B. Carleial, "Interference channels," IEEE Trans. on Information Theory, vol. 24, pp. 60-70, January 1978.
[7] H. Sato, "Two-user communication channels," IEEE Trans. on Information Theory, vol. 23, pp. 295-304, May 1977.
[8] R. Ahlswede, "Multi-way communication channels," in Proc. 2nd Int. Symp. Inform. Theory, Tsahkadsor, Armenian S.S.R., Hungarian Acad. Sci., 1971, pp. 23-52.
[9] T. M. Cover and J. A. Thomas, Elements of Information Theory. Wiley-Interscience, 1991.
[10] H. Witsenhausen and A. Wyner, "A conditional entropy bound for a pair of discrete random variables," IEEE Trans. on Information Theory, vol. 21, no. 5, pp. 493-501, September 1975.
[11] T. M. Cover, "Broadcast channels," IEEE Trans. on Information Theory, vol. 18, no. 1, pp. 2-14, January 1972.
[12] R. G. Gallager, "Capacity and coding for degraded broadcast channels," Problemy Peredachi Informatsii, vol. 10, no. 3, pp. 3-14, 1974.
[13] R. Ahlswede and J. Körner, "Source coding with side information and a converse for degraded broadcast channels," IEEE Trans. on Information Theory, vol. 21, pp. 629-637, November 1975.
[14] H. Witsenhausen, "Entropy inequalities for discrete channels," IEEE Trans. on Information Theory, vol. 20, no. 5, pp. 610-616, September 1974.


Cut-Set Bound and Dependence Balance Bound Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems

More information

Symmetric Characterization of Finite State Markov Channels

Symmetric Characterization of Finite State Markov Channels Symmetric Characterization of Finite State Markov Channels Mohammad Rezaeian Department of Electrical and Electronic Eng. The University of Melbourne Victoria, 31, Australia Email: rezaeian@unimelb.edu.au

More information

Joint Source-Channel Coding for the Multiple-Access Relay Channel

Joint Source-Channel Coding for the Multiple-Access Relay Channel Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu

More information

On the Capacity of Interference Channels with Degraded Message sets

On the Capacity of Interference Channels with Degraded Message sets On the Capacity of Interference Channels with Degraded Message sets Wei Wu, Sriram Vishwanath and Ari Arapostathis arxiv:cs/060507v [cs.it] 7 May 006 Abstract This paper is motivated by a sensor network

More information

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

Multicoding Schemes for Interference Channels

Multicoding Schemes for Interference Channels Multicoding Schemes for Interference Channels 1 Ritesh Kolte, Ayfer Özgür, Haim Permuter Abstract arxiv:1502.04273v1 [cs.it] 15 Feb 2015 The best known inner bound for the 2-user discrete memoryless interference

More information

On the Optimality of Treating Interference as Noise in Competitive Scenarios

On the Optimality of Treating Interference as Noise in Competitive Scenarios On the Optimality of Treating Interference as Noise in Competitive Scenarios A. DYTSO, D. TUNINETTI, N. DEVROYE WORK PARTIALLY FUNDED BY NSF UNDER AWARD 1017436 OUTLINE CHANNEL MODEL AND PAST WORK ADVANTAGES

More information

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and

More information

A Simple Converse Proof and a Unified Capacity Formula for Channels with Input Constraints

A Simple Converse Proof and a Unified Capacity Formula for Channels with Input Constraints A Simple Converse Proof and a Unified Capacity Formula for Channels with Input Constraints Youjian Eugene Liu Department of Electrical and Computer Engineering University of Colorado at Boulder eugeneliu@ieee.org

More information

A Summary of Multiple Access Channels

A Summary of Multiple Access Channels A Summary of Multiple Access Channels Wenyi Zhang February 24, 2003 Abstract In this summary we attempt to present a brief overview of the classical results on the information-theoretic aspects of multiple

More information

arxiv: v1 [cs.it] 4 Jun 2018

arxiv: v1 [cs.it] 4 Jun 2018 State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates

More information

Reliable Computation over Multiple-Access Channels

Reliable Computation over Multiple-Access Channels Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback

Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback Sum Capacity of General Deterministic Interference Channel with Channel Output Feedback Achaleshwar Sahai Department of ECE, Rice University, Houston, TX 775. as27@rice.edu Vaneet Aggarwal Department of

More information

Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper

Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper Mohsen Bahrami, Ali Bereyhi, Mahtab Mirmohseni and Mohammad Reza Aref Information Systems and Security

More information

An Achievable Rate for the Multiple Level Relay Channel

An Achievable Rate for the Multiple Level Relay Channel An Achievable Rate for the Multiple Level Relay Channel Liang-Liang Xie and P. R. Kumar Department of Electrical and Computer Engineering, and Coordinated Science Laboratory University of Illinois, Urbana-Champaign

More information

Can Feedback Increase the Capacity of the Energy Harvesting Channel?

Can Feedback Increase the Capacity of the Energy Harvesting Channel? Can Feedback Increase the Capacity of the Energy Harvesting Channel? Dor Shaviv EE Dept., Stanford University shaviv@stanford.edu Ayfer Özgür EE Dept., Stanford University aozgur@stanford.edu Haim Permuter

More information

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers Albrecht Wolf, Diana Cristina González, Meik Dörpinghaus, José Cândido Silveira Santos Filho, and Gerhard Fettweis Vodafone

More information

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,

More information

Interference channel capacity region for randomized fixed-composition codes

Interference channel capacity region for randomized fixed-composition codes Interference channel capacity region for randomized fixed-composition codes Cheng Chang D. E. Shaw & Co, New York, NY. 20 West 45th Street 39th Floor New York, NY 0036 cchang@eecs.berkeley.edu Abstract

More information

Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory

Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Vincent Y. F. Tan (Joint work with Silas L. Fong) National University of Singapore (NUS) 2016 International Zurich Seminar on

More information

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12 Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets

Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets Wei Wu, Sriram Vishwanath and Ari Arapostathis Abstract This paper is motivated by two different scenarios.

More information

On the Secrecy Capacity of the Z-Interference Channel

On the Secrecy Capacity of the Z-Interference Channel On the Secrecy Capacity of the Z-Interference Channel Ronit Bustin Tel Aviv University Email: ronitbustin@post.tau.ac.il Mojtaba Vaezi Princeton University Email: mvaezi@princeton.edu Rafael F. Schaefer

More information

Coding Techniques for Primitive Relay Channels

Coding Techniques for Primitive Relay Channels Forty-Fifth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 26-28, 2007 WeB1.2 Coding Techniques for Primitive Relay Channels Young-Han Kim Abstract We give a comprehensive discussion

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

Secret Key Agreement Using Asymmetry in Channel State Knowledge

Secret Key Agreement Using Asymmetry in Channel State Knowledge Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,

More information

Survey of Interference Channel

Survey of Interference Channel Survey of Interference Channel Alex Dytso Department of Electrical and Computer Engineering University of Illinois at Chicago, Chicago IL 60607, USA, Email: odytso2 @ uic.edu Abstract Interference Channel(IC)

More information

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

Source-Channel Coding Theorems for the Multiple-Access Relay Channel Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access

More information

Online Scheduling for Energy Harvesting Broadcast Channels with Finite Battery

Online Scheduling for Energy Harvesting Broadcast Channels with Finite Battery Online Scheduling for Energy Harvesting Broadcast Channels with Finite Battery Abdulrahman Baknina Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park,

More information

Random Access: An Information-Theoretic Perspective

Random Access: An Information-Theoretic Perspective Random Access: An Information-Theoretic Perspective Paolo Minero, Massimo Franceschetti, and David N. C. Tse Abstract This paper considers a random access system where each sender can be in two modes of

More information

Simultaneous Nonunique Decoding Is Rate-Optimal

Simultaneous Nonunique Decoding Is Rate-Optimal Fiftieth Annual Allerton Conference Allerton House, UIUC, Illinois, USA October 1-5, 2012 Simultaneous Nonunique Decoding Is Rate-Optimal Bernd Bandemer University of California, San Diego La Jolla, CA

More information

Equivalence for Networks with Adversarial State

Equivalence for Networks with Adversarial State Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student

More information

Interference Channel aided by an Infrastructure Relay

Interference Channel aided by an Infrastructure Relay Interference Channel aided by an Infrastructure Relay Onur Sahin, Osvaldo Simeone, and Elza Erkip *Department of Electrical and Computer Engineering, Polytechnic Institute of New York University, Department

More information

Draft. On Limiting Expressions for the Capacity Region of Gaussian Interference Channels. Mojtaba Vaezi and H. Vincent Poor

Draft. On Limiting Expressions for the Capacity Region of Gaussian Interference Channels. Mojtaba Vaezi and H. Vincent Poor Preliminaries Counterexample Better Use On Limiting Expressions for the Capacity Region of Gaussian Interference Channels Mojtaba Vaezi and H. Vincent Poor Department of Electrical Engineering Princeton

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Stefano Rini, Ernest Kurniawan and Andrea Goldsmith Technische Universität München, Munich, Germany, Stanford University,

More information

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014 Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2

More information

On the K-user Cognitive Interference Channel with Cumulative Message Sharing Sum-Capacity

On the K-user Cognitive Interference Channel with Cumulative Message Sharing Sum-Capacity 03 EEE nternational Symposium on nformation Theory On the K-user Cognitive nterference Channel with Cumulative Message Sharing Sum-Capacity Diana Maamari, Daniela Tuninetti and Natasha Devroye Department

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

i.i.d. Mixed Inputs and Treating Interference as Noise are gdof Optimal for the Symmetric Gaussian Two-user Interference Channel

i.i.d. Mixed Inputs and Treating Interference as Noise are gdof Optimal for the Symmetric Gaussian Two-user Interference Channel iid Mixed nputs and Treating nterference as Noise are gdof Optimal for the Symmetric Gaussian Two-user nterference Channel Alex Dytso Daniela Tuninetti and Natasha Devroye University of llinois at Chicago

More information

Information Theory Meets Game Theory on The Interference Channel

Information Theory Meets Game Theory on The Interference Channel Information Theory Meets Game Theory on The Interference Channel Randall A. Berry Dept. of EECS Northwestern University e-mail: rberry@eecs.northwestern.edu David N. C. Tse Wireless Foundations University

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche. June 9, 2015

Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche. June 9, 2015 The Individual Secrecy Capacity of Degraded Multi-Receiver Wiretap Broadcast Channels Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche Lehrstuhl für Technische Universität München, Germany Department

More information

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016 ECE598: Information-theoretic methods in high-dimensional statistics Spring 06 Lecture : Mutual Information Method Lecturer: Yihong Wu Scribe: Jaeho Lee, Mar, 06 Ed. Mar 9 Quick review: Assouad s lemma

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Note that the new channel is noisier than the original two : and H(A I +A2-2A1A2) > H(A2) (why?). min(c,, C2 ) = min(1 - H(a t ), 1 - H(A 2 )).

Note that the new channel is noisier than the original two : and H(A I +A2-2A1A2) > H(A2) (why?). min(c,, C2 ) = min(1 - H(a t ), 1 - H(A 2 )). l I ~-16 / (a) (5 points) What is the capacity Cr of the channel X -> Y? What is C of the channel Y - Z? (b) (5 points) What is the capacity C 3 of the cascaded channel X -3 Z? (c) (5 points) A ow let.

More information

Computation of Information Rates from Finite-State Source/Channel Models

Computation of Information Rates from Finite-State Source/Channel Models Allerton 2002 Computation of Information Rates from Finite-State Source/Channel Models Dieter Arnold arnold@isi.ee.ethz.ch Hans-Andrea Loeliger loeliger@isi.ee.ethz.ch Pascal O. Vontobel vontobel@isi.ee.ethz.ch

More information

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti DIMACS Workshop on Network Information Theory - March 2003 Daniela Tuninetti 1 On Two-user Fading Gaussian Broadcast Channels with Perfect Channel State Information at the Receivers Daniela Tuninetti Mobile

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Generalized Writing on Dirty Paper

Generalized Writing on Dirty Paper Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland

More information