
A Comparison of Two Achievable Rate Regions for the Interference Channel

Hon-Fah Chong, Mehul Motani, and Hari Krishna Garg
Electrical & Computer Engineering, National University of Singapore
Email: {g030596,motani,eleghk}@nus.edu.sg

Abstract: A recent result for the general interference channel is presented. The encoding technique is a generalization of the technique employed by Han and Kobayashi. Coupled with an improvement in the computation of the error probabilities, fewer constraints are necessary to define the achievable rate region. We further compare the two achievable rate regions and discuss whether they are equivalent.

I. INTRODUCTION

An interference channel (IC) models the situation where M unrelated senders try to communicate their separate information to M different receivers via a common channel. There is no cooperation between any of the receivers or senders. Hence, transmission of information from each sender to its corresponding receiver interferes with the communication between the other senders and their receivers. In this paper, we consider only the two-user IC.

The study of the IC was first initiated by Shannon [1], and it was further studied by Ahlswede [2]. Carleial [3] established several fundamental results and also determined an improved achievable rate region for the IC. Han and Kobayashi introduced a superior decoder and established the best achievable rate region to date for the general IC [4]. Except for the Gaussian IC under strong interference ([5], [6], [4]), a class of discrete additive degraded ICs [7], a class of deterministic ICs [8], and the discrete memoryless IC with strong interference [9], the capacity of the general IC remains unknown to date.

The purpose of this paper is to compare two achievable rate regions for the general IC. In Section II, we first give a description of the mathematical model for the discrete memoryless IC.
In Section III, we review the best achievable rate region to date for the general IC, due to Han and Kobayashi. We call this the Han-Kobayashi region. Next, in Section IV, we derive an achievable rate region for the general IC which contains the Han-Kobayashi region. We term this the Chong-Motani-Garg region. Finally, in Section V, we provide comparisons between the Han-Kobayashi region and the Chong-Motani-Garg region.

II. MATHEMATICAL PRELIMINARIES

A. Discrete Memoryless Interference Channel

A two-user discrete interference channel consists of four finite sets X_1, X_2, Y_1, Y_2 and conditional probability distributions p(·,·|x_1, x_2) on Y_1 × Y_2, where (x_1, x_2) ∈ X_1 × X_2. The interference channel is said to be memoryless if

p(y_1^n, y_2^n | x_1^n, x_2^n) = ∏_{i=1}^n p(y_{1i}, y_{2i} | x_{1i}, x_{2i}).  (1)

A (2^{nR_1}, 2^{nR_2}, n) code for an interference channel with independent information consists of two encoding functions and two decoding functions

f_1 : {1, ..., 2^{nR_1}} → X_1^n
f_2 : {1, ..., 2^{nR_2}} → X_2^n
g_1 : Y_1^n → {1, ..., 2^{nR_1}}
g_2 : Y_2^n → {1, ..., 2^{nR_2}}.  (2)

The average probability of error is defined as the probability that the decoded message is not equal to the transmitted message, i.e.,

P_e^{(n)} = Pr(g_1(Y_1^n) ≠ W_1 or g_2(Y_2^n) ≠ W_2)  (3)

where (W_1, W_2) are assumed to be uniformly distributed over {1, ..., 2^{nR_1}} × {1, ..., 2^{nR_2}}. A rate pair (R_1, R_2) is said to be achievable for the interference channel if there exists a sequence of (2^{nR_1}, 2^{nR_2}, n) codes with P_e^{(n)} → 0. We also denote by A_ε^{(n)} the set of jointly ε-typical sequences.

B. Gaussian Interference Channel

The discrete-time additive white Gaussian IC, shown in Fig. 1, is described by

Y_1 = c_{11} X_1 + c_{12} X_2 + Z_1  (4)
Y_2 = c_{21} X_1 + c_{22} X_2 + Z_2  (5)

where the input and output signals are real, the coefficients c_{ij} are real constants, and the noise terms Z_1 and Z_2 are zero-mean Gaussian random variables. Also, the mean square values of X_1 and X_2 cannot exceed P_1 and P_2 respectively. In [3], it was shown that any Gaussian IC can be reduced to its standard form, where c_{11} = c_{22} = 1 and E[Z_1^2] = E[Z_2^2] = 1.
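The reduction to standard form amounts to rescaling the inputs and outputs so that the direct gains and noise variances become 1. A minimal sketch of this rescaling (our own illustration, not code from the paper), assuming general noise variances N_1, N_2 and nonzero direct gains:

```python
import math

def standard_form(c11, c12, c21, c22, P1, P2, N1=1.0, N2=1.0):
    """Rescale a general Gaussian IC to standard form.

    Substituting X1' = (c11/sqrt(N1)) X1 and X2' = (c22/sqrt(N2)) X2
    into Y1/sqrt(N1) and Y2/sqrt(N2) gives unit direct gains and unit
    noise variances, with new cross gains a12, a21 and new power
    constraints P1', P2'.
    """
    a12 = c12 * math.sqrt(N2) / (c22 * math.sqrt(N1))  # cross gain into Y1
    a21 = c21 * math.sqrt(N1) / (c11 * math.sqrt(N2))  # cross gain into Y2
    P1p = c11 ** 2 * P1 / N1                           # rescaled power of X1'
    P2p = c22 ** 2 * P2 / N2                           # rescaled power of X2'
    return a12, a21, P1p, P2p

# A channel already in standard form is left unchanged.
print(standard_form(1.0, 0.4, 0.4, 1.0, 6.0, 6.0))  # → (0.4, 0.4, 6.0, 6.0)
```

The returned tuple gives the parameters of the equivalent standard-form channel Y_1' = X_1' + a_{12} X_2' + Z_1', Y_2' = a_{21} X_1' + X_2' + Z_2' with unit-variance noise.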
The capacity of the Gaussian IC is not known, except for the case of no interference, where c_{12} = c_{21} = 0; the case of strong interference, where c_{12} ≥ 1 and c_{21} ≥ 1; and a mixture of these two cases, where c_{12} = 0 and c_{21} ≥ 1, or c_{21} = 0 and c_{12} ≥ 1.
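In the strong-interference case, the capacity region is known ([4], [6]) to be the set of rate pairs with R_1 ≤ C(P_1), R_2 ≤ C(P_2) and R_1 + R_2 ≤ min{C(P_1 + c_{12}^2 P_2), C(c_{21}^2 P_1 + P_2)}, where C(x) = (1/2) log_2(1 + x). A small numeric sketch of these bounds (our own illustration, with arbitrary example parameters):

```python
import math

def C(x):
    """Gaussian capacity function C(x) = 0.5 * log2(1 + x)."""
    return 0.5 * math.log2(1.0 + x)

def strong_interference_region(P1, P2, c12, c21):
    """Bounds defining the capacity region of a standard-form Gaussian IC
    under strong interference (c12 >= 1 and c21 >= 1)."""
    assert c12 >= 1 and c21 >= 1, "valid only under strong interference"
    return {
        "R1": C(P1),
        "R2": C(P2),
        "R1+R2": min(C(P1 + c12 ** 2 * P2), C(c21 ** 2 * P1 + P2)),
    }

bounds = strong_interference_region(6.0, 6.0, 1.5, 1.5)
print(bounds)
```

Under strong interference both receivers can decode both messages, which is why the sum-rate bound is the smaller of the two multiple-access sum rates.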

Fig. 1. The Gaussian interference channel.

III. HAN-KOBAYASHI RATE REGION

In [4], the authors established an achievable rate region for the general interference channel. They considered 5 auxiliary random variables Q, U_1, W_1, U_2 and W_2, defined on arbitrary finite sets Q, U_1, W_1, U_2 and W_2 respectively. X_1 and X_2 are the input random variables defined on the input alphabet sets X_1 and X_2 respectively, while Y_1 and Y_2 are the output random variables defined on the output alphabet sets Y_1 and Y_2 respectively. Each of the encoders splits its rate into two parts, where the auxiliary random variables U_1 and U_2 serve as cloud centers that can be distinguished by both receivers. Hence, U_1 represents information intended for receiver 1 that can also be decoded by receiver 2, while W_1 represents information intended for receiver 1 that cannot be decoded by receiver 2. The same applies to the auxiliary random variables U_2 and W_2. This is basically an application of the superposition coding technique of Cover [10] and was first applied by Carleial [3] to the Gaussian IC. However, Carleial made use of a sequential decoder, decoding U_1 and U_2 first before decoding W_1 and W_2. Han and Kobayashi applied a more powerful decoding technique known as simultaneous decoding. Receiver 1 decodes U_1 and W_1 simultaneously, while receiver 2 decodes U_2 and W_2 simultaneously. Moreover, they introduced a time-sharing parameter Q instead of using the convex-hull operation. This time-sharing parameter Q also subsumes the time division multiplex/frequency division multiplex strategy introduced by Carleial [3]. We next describe this achievable rate region.

Theorem 1: For a fixed P ∈ P, let R_HK(P) be the set of rate pairs (R_1, R_2) satisfying

R_1 = R_{11} + R_{12}  (7)
R_2 = R_{21} + R_{22}  (8)

for some (R_{11}, R_{12}, R_{21}, R_{22}) satisfying (9)-(22) below. Let P be the set of probability distributions P(·)
that factor as

P(q, u_1, w_1, u_2, w_2, x_1, x_2) = p(q) p(u_1|q) p(w_1|q) p(u_2|q) p(w_2|q) p(x_1|u_1, w_1, q) p(x_2|u_2, w_2, q).  (6)

The rates (R_{11}, R_{12}, R_{21}, R_{22}) must satisfy

R_{11} ≤ I(W_1; Y_1 | U_1 U_2 Q)  (9)
R_{12} ≤ I(U_1; Y_1 | W_1 U_2 Q)  (10)
R_{21} ≤ I(U_2; Y_1 | W_1 U_1 Q)  (11)
R_{11} + R_{12} ≤ I(U_1 W_1; Y_1 | U_2 Q)  (12)
R_{11} + R_{21} ≤ I(W_1 U_2; Y_1 | U_1 Q)  (13)
R_{12} + R_{21} ≤ I(U_1 U_2; Y_1 | W_1 Q)  (14)
R_{11} + R_{12} + R_{21} ≤ I(U_1 W_1 U_2; Y_1 | Q)  (15)
R_{22} ≤ I(W_2; Y_2 | U_2 U_1 Q)  (16)
R_{21} ≤ I(U_2; Y_2 | W_2 U_1 Q)  (17)
R_{12} ≤ I(U_1; Y_2 | W_2 U_2 Q)  (18)
R_{21} + R_{22} ≤ I(U_2 W_2; Y_2 | U_1 Q)  (19)
R_{22} + R_{12} ≤ I(W_2 U_1; Y_2 | U_2 Q)  (20)
R_{21} + R_{12} ≤ I(U_2 U_1; Y_2 | W_2 Q)  (21)
R_{21} + R_{22} + R_{12} ≤ I(U_2 W_2 U_1; Y_2 | Q).  (22)

Then the following set

∪_{P ∈ P} R_HK(P)  (23)

is an achievable rate region for the discrete memoryless IC.

The bounds (7)-(22) can be simplified by using Fourier-Motzkin elimination [4, Theorem 4.1]. Moreover, X_1 is a deterministic function of U_1, W_1 and Q, and X_2 is a deterministic function of U_2, W_2 and Q, i.e.,

X_1 = f_1(U_1, W_1, Q)  (24)
X_2 = f_2(U_2, W_2, Q).  (25)

Then, we can describe the Han-Kobayashi region R_HK as follows.

Theorem 2: For a fixed P ∈ P, let R_HK(P) be the set of (R_1, R_2) satisfying

R_1 ≤ ρ_1  (26)
R_2 ≤ ρ_2  (27)
R_1 + R_2 ≤ ρ_{12}  (28)
2R_1 + R_2 ≤ ρ_{10}  (29)
R_1 + 2R_2 ≤ ρ_{20}  (30)

where

ρ_1 = σ_1 + I(X_1; Y_1 | U_1 U_2 Q)  (31)
ρ_2 = σ_2 + I(X_2; Y_2 | U_1 U_2 Q)  (32)
ρ_{12} = σ_{12} + I(X_1; Y_1 | U_1 U_2 Q) + I(X_2; Y_2 | U_1 U_2 Q)  (33)
ρ_{10} = σ_1 + 2I(X_1; Y_1 | U_1 U_2 Q) + I(X_2; Y_2 | U_1 U_2 Q) + [σ_1 − I(U_1; Y_2 | U_2 Q)]^+
  + min{ I(U_2; Y_2 | U_1 Q), I(U_2; Y_2 | Q) + [I(U_1; Y_2 | U_2 Q) − σ_1]^+, I(U_2; Y_1 | U_1 Q), I(U_1 U_2; Y_1 | Q) − σ_1 }  (34)
ρ_{20} = σ_2 + 2I(X_2; Y_2 | U_1 U_2 Q) + I(X_1; Y_1 | U_1 U_2 Q) + [σ_2 − I(U_2; Y_1 | U_1 Q)]^+
  + min{ I(U_1; Y_1 | U_2 Q), I(U_1; Y_1 | Q) + [I(U_2; Y_1 | U_1 Q) − σ_2]^+, I(U_1; Y_2 | U_2 Q), I(U_1 U_2; Y_2 | Q) − σ_2 }  (35)
σ_1 = min{ I(U_1; Y_1 | U_2 Q), I(U_1; Y_2 | X_2 Q) }  (36)
σ_2 = min{ I(U_2; Y_2 | U_1 Q), I(U_2; Y_1 | X_1 Q) }  (37)
σ_{12} = min{ I(U_1 U_2; Y_1 | Q), I(U_1 U_2; Y_2 | Q), I(U_1; Y_1 | U_2 Q) + I(U_2; Y_2 | U_1 Q), I(U_1; Y_2 | U_2 Q) + I(U_2; Y_1 | U_1 Q) }.  (38)

Here, [x]^+ = 0 if x ≤ 0 and [x]^+ = x if x > 0. Then, we have

R_HK = ∪_{P ∈ P} R_HK(P).  (39)

Even though the resulting bounds for Theorem 2 may seem very complicated, we give a simplified version in Section V. This is to ease comparison with the Chong-Motani-Garg region, which will be derived in the next section.

IV. CHONG-MOTANI-GARG RATE REGION

We now establish a potentially new achievable rate region for the interference channel. We make the observation that, since each receiver is not interested in the message of the non-intended transmitter, constraints (11) and (18) are unnecessary to drive the probability of error to zero. Moreover, instead of 5 auxiliary random variables, we only consider 3 auxiliary random variables Q, U_1 and U_2, defined on arbitrary finite sets Q, U_1 and U_2. Again, the auxiliary random variables U_1 and U_2 will serve as cloud centers that can be distinguished by both receivers. For transmitter 1, instead of generating two independent codebooks with codewords U_1^n(j) and W_1^n(k), for each codeword U_1^n(j) we generate a codebook with codewords W_1^n(j, k), where j ∈ {1, 2, ..., 2^{nR_{12}}} and k ∈ {1, 2, ..., 2^{nR_{11}}}. Due to our code construction, the constraints (10), (14), (17) and (21) are unnecessary to drive the probability of error to zero.
Let P_1 be the set of probability distributions P_1(·) that factor as

P_1(q, u_1, u_2, x_1, x_2) = p(q) p(u_1, x_1 | q) p(u_2, x_2 | q).  (40)

Theorem 3: For a fixed P_1 ∈ P_1, let R_CMG(P_1) be the set of (R_1, R_2) satisfying

R_1 = R_{11} + R_{12}  (41)
R_2 = R_{21} + R_{22}  (42)

where

R_{11} ≤ I(X_1; Y_1 | U_1 U_2 Q)  (43)
R_{11} + R_{21} ≤ I(U_2 X_1; Y_1 | U_1 Q)  (44)
R_{11} + R_{12} ≤ I(X_1; Y_1 | U_2 Q)  (45)
R_{11} + R_{12} + R_{21} ≤ I(U_2 X_1; Y_1 | Q)  (46)
R_{22} ≤ I(X_2; Y_2 | U_1 U_2 Q)  (47)
R_{22} + R_{12} ≤ I(U_1 X_2; Y_2 | U_2 Q)  (48)
R_{21} + R_{22} ≤ I(X_2; Y_2 | U_1 Q)  (49)
R_{12} + R_{21} + R_{22} ≤ I(U_1 X_2; Y_2 | Q).  (50)

Then the set given by

R_CMG = ∪_{P_1 ∈ P_1} R_CMG(P_1)  (51)

is an achievable rate region for the discrete memoryless IC.

A. Proof of Theorem 3

Codebook Generation: Generate a codeword Q^n of length n, generating each element i.i.d. according to ∏_{i=1}^n p(q_i). For the codeword Q^n, generate 2^{nR_{12}} independent codewords U_1^n(j), j ∈ {1, 2, ..., 2^{nR_{12}}}, generating each element i.i.d. according to p(u_{1i}|q_i). For the codeword Q^n and each of the codewords U_1^n(j), generate 2^{nR_{11}} independent codewords X_1^n(j, k), k ∈ {1, 2, ..., 2^{nR_{11}}}, generating each element i.i.d. according to p(x_{1i}|q_i, u_{1i}(j)). For the codeword Q^n, generate 2^{nR_{21}} independent codewords U_2^n(l), l ∈ {1, 2, ..., 2^{nR_{21}}}, generating each element i.i.d. according to p(u_{2i}|q_i). For the codeword Q^n and each of the codewords U_2^n(l), generate 2^{nR_{22}} independent codewords X_2^n(l, m), m ∈ {1, 2, ..., 2^{nR_{22}}}, generating each element i.i.d. according to p(x_{2i}|q_i, u_{2i}(l)).

Encoding: For encoder 1, to send the message pair (j, k), send the corresponding codeword X_1^n(j, k). For encoder 2, to send the message pair (l, m), send the corresponding codeword X_2^n(l, m).

Decoding: Receiver 1 determines the unique (ĵ, k̂) and some l̂ such that

(U_1^n(ĵ), X_1^n(ĵ, k̂), U_2^n(l̂), Y_1^n) ∈ A_ε^{(n)}.  (52)

Receiver 2 determines the unique (l̂, m̂) and some ĵ such that

(U_2^n(l̂), X_2^n(l̂, m̂), U_1^n(ĵ), Y_2^n) ∈ A_ε^{(n)}.  (53)
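The nested codebook construction for transmitter 1 can be sketched concretely. The following toy illustration over binary alphabets (our own example with arbitrary placeholder distributions, not code from the paper) generates the time-sharing sequence, the cloud-center codebook and, for each cloud center, a satellite codebook:

```python
import random

random.seed(0)
n = 8                    # block length
R12, R11 = 0.25, 0.25    # common and private rates of user 1 (bits/symbol)

# Time-sharing sequence Q^n, each element drawn i.i.d. (placeholder p(q)).
q = [random.choice([0, 1]) for _ in range(n)]

# Cloud centers: 2^{nR12} codewords U_1^n(j), elements ~ placeholder p(u1|q).
num_clouds = 2 ** round(n * R12)
U1 = [[random.choice([0, 1]) for _ in range(n)] for _ in range(num_clouds)]

# Satellites: for each U_1^n(j), 2^{nR11} codewords X_1^n(j, k) whose
# elements depend on u_{1i}(j) (the q-dependence is omitted for brevity).
num_satellites = 2 ** round(n * R11)
X1 = [[[(u1_i + random.choice([0, 1])) % 2 for u1_i in U1[j]]
       for _ in range(num_satellites)]
      for j in range(num_clouds)]

# Encoder 1 sends X_1^n(j, k) for the message pair (j, k).
j, k = 2, 1
codeword = X1[j][k]
print(len(codeword))  # → 8
```

The key point of the construction is the nesting: a satellite codeword X_1^n(j, k) only exists relative to its cloud center U_1^n(j), which is what makes constraint (10) unnecessary.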

Analysis of the Probability of Error: We consider only the decoding error probability for receiver 1; the same analysis applies to receiver 2. By the symmetry of the random code construction, the conditional probability of error does not depend on which pair of indices is sent, so the conditional probability of error is the same as the unconditional probability of error. Without loss of generality, we assume that (j, k) = (1, 1) and (l, m) = (1, 1) was sent. We have an error if the correct codewords {U_1^n(1), X_1^n(1, 1), U_2^n(1)} are not jointly typical with the received sequence. An error is also declared if incorrect codewords {U_1^n(ĵ), X_1^n(ĵ, k̂), U_2^n(l̂)} with ĵ ≠ 1 or k̂ ≠ 1 are jointly typical with the received sequence. However, no error is declared if {U_1^n(1), X_1^n(1, 1), U_2^n(l̂)} with l̂ ≠ 1 are jointly typical with the received sequence. Define the event

E_{jkl} = {(U_1^n(j), X_1^n(j, k), U_2^n(l), Y_1^n) ∈ A_ε^{(n)}}.  (54)

Then, by the union of events bound,

P_e^{(n)} = P(E_{111}^c ∪ ∪_{(j,k) ≠ (1,1), l} E_{jkl})
≤ P(E_{111}^c) + Σ_{j ≠ 1, k = 1, l = 1} P(E_{j11}) + Σ_{j = 1, k ≠ 1, l = 1} P(E_{1k1}) + Σ_{j ≠ 1, k ≠ 1, l = 1} P(E_{jk1})
  + Σ_{j ≠ 1, k = 1, l ≠ 1} P(E_{j1l}) + Σ_{j = 1, k ≠ 1, l ≠ 1} P(E_{1kl}) + Σ_{j ≠ 1, k ≠ 1, l ≠ 1} P(E_{jkl})
≤ P(E_{111}^c) + 2^{nR_{12}} 2^{−n(I(X_1; Y_1 | U_2 Q) − 4ε)} + 2^{nR_{11}} 2^{−n(I(X_1; Y_1 | U_1 U_2 Q) − 4ε)} + 2^{n(R_{11} + R_{12})} 2^{−n(I(X_1; Y_1 | U_2 Q) − 4ε)}
  + 2^{n(R_{12} + R_{21})} 2^{−n(I(U_2 X_1; Y_1 | Q) − 4ε)} + 2^{n(R_{11} + R_{21})} 2^{−n(I(U_2 X_1; Y_1 | U_1 Q) − 4ε)} + 2^{n(R_{11} + R_{12} + R_{21})} 2^{−n(I(U_2 X_1; Y_1 | Q) − 4ε)}.  (55)

Since ε > 0 is arbitrary, the conditions of Theorem 3 imply that each term tends to 0 as n → ∞. The above bound shows that the average probability of error, averaged over all choices of codebooks in the random code construction, is arbitrarily small. Hence there exists at least one code C with arbitrarily small probability of error. Since we can choose a fixed P_1 ∈ P_1 such that

P_1(q, u_1, u_2, x_1, x_2) = Σ_{w_1 ∈ W_1, w_2 ∈ W_2} P(q, u_1, u_2, w_1, w_2, x_1, x_2)  (56)

we readily see that R_HK(P) ⊆ R_CMG(P_1), and hence R_HK ⊆ R_CMG. The bounds (41)-(50) can again be simplified using Fourier-Motzkin elimination [11, Theorem 3].
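Fourier-Motzkin elimination, used here and in Section III to remove the split rates, eliminates one variable at a time by pairing each inequality that lower-bounds it with each inequality that upper-bounds it. A generic sketch of one elimination step (our own illustration, not the derivation from [11]):

```python
from fractions import Fraction

def fm_eliminate(ineqs, var):
    """One Fourier-Motzkin step: eliminate variable index `var` from a
    system of inequalities a . x <= b, given as (coeff_list, bound) pairs."""
    lower, upper, rest = [], [], []
    for a, b in ineqs:
        c = a[var]
        if c > 0:
            upper.append((a, b))    # gives an upper bound on x[var]
        elif c < 0:
            lower.append((a, b))    # gives a lower bound on x[var]
        else:
            rest.append((a, b))     # does not involve x[var]
    out = list(rest)
    for al, bl in lower:            # pair every lower bound with every upper bound
        for au, bu in upper:
            sl, su = au[var], -al[var]   # positive scales cancelling x[var]
            a = [sl * x + su * y for x, y in zip(al, au)]
            out.append((a, sl * bl + su * bu))
    return out

# Eliminate R11 from: R11 <= 3, -R11 <= 0, R11 + R12 <= 5, -R11 + R12 <= 1.
ineqs = [([1, 0], Fraction(3)), ([-1, 0], Fraction(0)),
         ([1, 1], Fraction(5)), ([-1, 1], Fraction(1))]
for a, b in fm_eliminate(ineqs, 0):
    print(a, "<=", b)
```

The output contains the projected constraints on R_12 alone (including a trivial 0 <= 3); in the paper this projection is carried out on (9)-(22) and on (43)-(50) to obtain Theorems 2, 4 and 5.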
Theorem 4: For a fixed P_1 ∈ P_1, let R_CMG(P_1) be the set of (R_1, R_2) satisfying

R_1 ≤ I(X_1; Y_1 | U_2 Q)  (57)
R_2 ≤ I(X_2; Y_2 | U_1 Q)  (58)
R_1 + R_2 ≤ I(X_1 U_2; Y_1 | Q) + I(X_2; Y_2 | U_1 U_2 Q)  (59)
R_1 + R_2 ≤ I(X_1; Y_1 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | Q)  (60)
R_1 + R_2 ≤ I(X_1 U_2; Y_1 | U_1 Q) + I(X_2 U_1; Y_2 | U_2 Q)  (61)
2R_1 + R_2 ≤ I(X_1 U_2; Y_1 | Q) + I(X_1; Y_1 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | U_2 Q)  (62)
R_1 + 2R_2 ≤ I(X_2; Y_2 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | Q) + I(X_1 U_2; Y_1 | U_1 Q).  (63)

Then we have

R_CMG = ∪_{P_1 ∈ P_1} R_CMG(P_1).  (64)

V. COMPARING THE HAN-KOBAYASHI AND CHONG-MOTANI-GARG RATE REGIONS

To facilitate comparison of the Han-Kobayashi rate region with the Chong-Motani-Garg rate region, we simplify the bounds in Theorem 2. In fact, many of the bounds are redundant, and the region can be reduced as in the following theorem.

Theorem 5: For a fixed P_1 ∈ P_1, let R_HK(P_1) be the set of (R_1, R_2) satisfying

R_1 ≤ min{I(U_1; Y_1 | U_2 Q), I(U_1; Y_2 | X_2 Q)} + I(X_1; Y_1 | U_1 U_2 Q)  (65)
R_2 ≤ min{I(U_2; Y_2 | U_1 Q), I(U_2; Y_1 | X_1 Q)} + I(X_2; Y_2 | U_1 U_2 Q)  (66)
R_1 + R_2 ≤ I(X_1 U_2; Y_1 | Q) + I(X_2; Y_2 | U_1 U_2 Q)  (67)
R_1 + R_2 ≤ I(X_1; Y_1 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | Q)  (68)
R_1 + R_2 ≤ I(X_1 U_2; Y_1 | U_1 Q) + I(X_2 U_1; Y_2 | U_2 Q)  (69)
2R_1 + R_2 ≤ I(X_1 U_2; Y_1 | Q) + I(X_1; Y_1 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | U_2 Q)  (70)
R_1 + 2R_2 ≤ I(X_2; Y_2 | U_1 U_2 Q) + I(X_2 U_1; Y_2 | Q) + I(X_1 U_2; Y_1 | U_1 Q).  (71)

Finally, we have

R_HK = ∪_{P_1 ∈ P_1} R_HK(P_1).  (72)

The proof for the bounds on R_1 + R_2 can be found in [11]. In addition, given the constraints (65)-(69), it can easily be shown that all the other constraints in Theorem 2, except for (70)-(71), are redundant. Comparing the Han-Kobayashi region given by Theorem 5 with the Chong-Motani-Garg region given by Theorem 4, we again see that R_HK ⊆ R_CMG. Moreover, the only difference lies in the bounds for R_1 and R_2. In [11], Kramer asked whether there are ICs for which there exists P_1 ∈ P_1 such that R_HK(P_1) ⊂ R_CMG(P_1). For the Gaussian IC, when we set |Q| = 1, we can easily determine

Fig. 2. Two rate regions (the Chong-Motani-Garg region and the Han-Kobayashi region) for the Gaussian IC when P_1 = P_2 = 6, c_{12} = c_{21} = 0.4 and α = β = 0.5.

parameters where R_HK(P_1) ⊂ R_CMG(P_1). We assume U_1, U_2, X_1 and X_2 are Gaussian random variables where

E[U_1^2] / E[X_1^2] = α,  E[U_2^2] / E[X_2^2] = β  (73)

such that α ∈ [0, 1], β ∈ [0, 1], E[X_1^2] = P_1 and E[X_2^2] = P_2. From Fig. 2, we see that when P_1 = P_2 = 6, c_{12} = c_{21} = 0.4 and α = β = 0.5, R_HK(P_1) ⊂ R_CMG(P_1). However, this is only for particular choices of P_1 ∈ P_1. Our numerical simulations seem to indicate that R_HK = R_CMG for the Gaussian IC. Currently, we have still been unable to prove either that R_HK = R_CMG or that R_HK ⊂ R_CMG for the general IC, or even for the Gaussian IC. We conjecture that R_HK = R_CMG for the Gaussian IC, but that there may be other ICs where R_HK ⊂ R_CMG.

VI. CONCLUSION

We have reviewed two rate regions for the general IC and made comparisons between the Han-Kobayashi rate region and the Chong-Motani-Garg rate region. We have reduced the bounds for the two rate regions to their simplest forms for comparison. Even though the Chong-Motani-Garg rate region always includes the Han-Kobayashi rate region, it is not known whether the two rate regions are equivalent for all ICs.

REFERENCES

[1] C. E. Shannon, "Two-way communication channels," in Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability, vol. 1. Berkeley, CA: Univ. California Press, 1961, pp. 611-644.
[2] R. Ahlswede, "The capacity region of a channel with two senders and two receivers," Annals Probabil., vol. 2, no. 5, pp. 805-814, 1974.
[3] A. B. Carleial, "Interference channels," IEEE Trans. Inform. Theory, vol. 24, no. 1, pp. 60-70, Jan. 1978.
[4] T. S. Han and K. Kobayashi, "A new achievable rate region for the interference channel," IEEE Trans. Inform. Theory, vol. 27, no. 1, pp. 49-60, Jan. 1981.
[5] A. B. Carleial, "A case where interference does not reduce capacity," IEEE Trans.
Inform. Theory, vol. 21, no. 5, pp. 569-570, Sept. 1975.
[6] H. Sato, "The capacity of the Gaussian interference channel under strong interference," IEEE Trans. Inform. Theory, vol. 27, no. 6, pp. 786-788, Nov. 1981.
[7] R. Benzel, "The capacity region of a class of discrete additive degraded interference channels," IEEE Trans. Inform. Theory, vol. 25, no. 2, pp. 228-231, March 1979.
[8] A. A. El Gamal and M. H. M. Costa, "The capacity region of a class of deterministic interference channels," IEEE Trans. Inform. Theory, vol. 28, no. 2, pp. 343-346, March 1982.
[9] M. H. M. Costa and A. A. El Gamal, "The capacity region of the discrete memoryless interference channel with strong interference," IEEE Trans. Inform. Theory, vol. 33, no. 5, pp. 710-711, Sept. 1987.
[10] T. M. Cover, "An achievable rate region for the broadcast channel," IEEE Trans. Inform. Theory, vol. 21, pp. 399-404, July 1975.
[11] G. Kramer, "Review of rate regions for interference channels," in Proc. International Zurich Seminar, Feb. 2006.