IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 61, NO. 1, JANUARY 2015

On the Capacity Bounds for Poisson Interference Channels

Lifeng Lai, Member, IEEE, Yingbin Liang, Member, IEEE, and Shlomo Shamai (Shitz), Fellow, IEEE

Abstract—The Poisson interference channel, which models optical communication systems with multiple transceivers in the shot-noise-limited regime, is investigated. Conditions for the strong interference regime are characterized, and the corresponding capacity region is derived, which is the same as that of the compound Poisson multiple-access channel with each receiver decoding both messages. For the cases when the strong interference conditions are not satisfied, inner and outer bounds on the capacity region are derived. The inner bound is derived by approximating the Poisson interference channel by a binary interference channel and then evaluating the corresponding Han–Kobayashi region. The outer bounds are obtained via various techniques, including noise reduction, genie-aided schemes, and channel transformation. The Poisson Z-interference channel is then studied. The sum rate capacity is obtained when the cross-link coefficient is either sufficiently small or sufficiently large.

Index Terms—Capacity region, interference channel, Poisson channel, sum rate capacity, Z-interference channel.

I. INTRODUCTION

IN OPTICAL communications, intensity modulation and direct detection (IM/DD) are widely used. In the thermal-noise-limited regime, one can use a Gaussian distribution to model the noise. However, in the shot-noise-limited regime, the Poisson channel is a suitable model [3]. The point-to-point Poisson channel has been intensively studied, both in continuous time [4]–[7] and in discrete time [8]–[10]. However, the study of multiuser Poisson channels is limited, with few exceptions on the multiple-access channel (MAC) [11], on the

Manuscript received May 2013; revised June 2014; accepted October 2014. Date of publication November 2014; date of current version December 2014. L.
Lai was supported in part by the National Science Foundation (NSF) under a CAREER Award CCF, and in part by the NSF under Grant CCF. Y. Liang was supported in part by an NSF CAREER Award CCF, and in part by the NSF under Grants CCF and CCF. S. Shamai was supported in part by the NSF under Grant CCF, in part by the Israel Science Foundation (ISF), and in part by the European Commission within the framework of the Network of Excellence in Wireless Communications (NEWCOM#). This paper was presented at the 2012 IEEE International Symposium on Information Theory [1] and the 2013 IEEE/CIC International Conference on Communications in China [2].

L. Lai is with the Department of Electrical and Computer Engineering, Worcester Polytechnic Institute, Worcester, MA 01609 USA (e-mail: llai@wpi.edu). Y. Liang is with the Department of Electrical Engineering and Computer Science, Syracuse University, Syracuse, NY 13244 USA (e-mail: yliang6@syr.edu). S. Shamai is with the Department of Electrical Engineering, Technion–Israel Institute of Technology, Haifa 32000, Israel (e-mail: sshlomo@ee.technion.ac.il).

Communicated by W. Yu, Associate Editor for Shannon Theory.

Color versions of one or more of the figures in this paper are available online.

Digital Object Identifier 10.1109/TIT

broadcast channel [12], and on the Poisson channel with side information at the transmitter [13]. The optical interference channel was discussed in [14], in which several open questions were posed.

In this paper, we study the Poisson interference channel, which models simultaneous transmission over multiple optical communication links. The Poisson interference channel differs from the standard Gaussian interference channel, well studied in [15]–[18], in the following aspects. Unlike the Gaussian channel, the Poisson channel is not scale-invariant. Hence, one cannot arbitrarily multiply the input signal by a constant and later divide the output by the same constant while preserving the mutual information between the input and the output.
Furthermore, in the Gaussian interference channel, once the receiver decodes a signal, it can completely cancel the effect of this signal. This is not the case for the Poisson channel. As a result, certain techniques used to derive various capacity bounds for the Gaussian interference channel are not applicable to the Poisson interference channel. On the other hand, the Poisson channel still has the infinite-divisibility and independent-increments properties that the Gaussian channel has [19]. This similarity allows us to apply certain ideas designed for the Gaussian interference channel to the Poisson interference channel, with appropriate modifications.

In this paper, we first identify the conditions for the strong interference regime and characterize the corresponding capacity region, which is the same as that of the compound Poisson MAC with each receiver decoding both messages. The basic idea involves carefully constructing a new Poisson process based on the decoded signal to enhance the receivers, and then performing a proper thinning operation, in which we randomly and independently erase points from the Poisson process. We then study the case in which the strong interference conditions are not satisfied. We derive an inner bound on the capacity region by first approximating the Poisson interference channel with a binary interference channel, via the approach introduced by Wyner [4], and then evaluating the Han–Kobayashi region [20] for the binary interference channel under certain input distributions. Furthermore, we derive several outer bounds on the capacity region. These bounds are based on reducing the noise level at the receiver, providing various side information to the receivers, and channel transformation. We further study the Poisson Z-interference channel, in which one of the cross-link coefficients is zero. For this channel, we determine the sum rate capacity when the remaining cross-link coefficient is either sufficiently small or
Personal use is permitted, but republication/redistribution requires IEEE permission.

sufficiently large. We show that when the remaining cross-link coefficient is sufficiently small, the sum capacity is achieved by treating the interference (if any) as noise. In this case, to maximize the sum rate, the transmitter whose corresponding receiver does not experience interference does not necessarily transmit at the maximum possible rate. This is in contrast to the corresponding case of the Gaussian Z-interference channel, in which the user not experiencing interference always transmits at the full rate in order to achieve the sum rate capacity. When the cross-link coefficient is sufficiently large, the sum capacity is achieved by letting the receiver experiencing interference decode both messages.

The remainder of the paper is organized as follows. In Section II, we introduce the Poisson interference channel model. In Section III, we derive conditions for the strong interference regime. In Section IV, we show that the capacity region of the Poisson interference channel remains unchanged if we approximate it with a binary-input binary-output discrete memoryless channel. We derive inner and outer bounds on the capacity region for the general Poisson interference channel in Section V. In Section VI, we study the sum capacity of the Poisson Z-interference channel. We present numerical examples in Section VII. Finally, we conclude the paper with a few remarks in Section VIII.

II. SYSTEM MODEL

A. Poisson Interference Channel

We study the two-user Poisson interference channel, in which the inputs X_1(t) and X_2(t) from the transmitters satisfy

0 ≤ X_i(t) ≤ A_i, for all t and for i = 1, 2,   (1)

which means that the input of the Poisson channel is nonnegative and less than a threshold due to safety considerations.
The output Y_1(t), 0 ≤ t ≤ T, at receiver 1 is a doubly stochastic Poisson process with instantaneous rate aX_1(t) + bX_2(t) + λ_1, and the output Y_2(t), 0 ≤ t ≤ T, at receiver 2 is another doubly stochastic Poisson process with instantaneous rate cX_1(t) + dX_2(t) + λ_2. More precisely, for all 0 ≤ t < s ≤ T,

Pr{Y_1(s) − Y_1(t) = k | X_1(0,T], X_2(0,T]} = (Λ_1^k / k!) e^{−Λ_1}, k = 0, 1, ...,
Pr{Y_2(s) − Y_2(t) = k | X_1(0,T], X_2(0,T]} = (Λ_2^k / k!) e^{−Λ_2}, k = 0, 1, ...,

in which

Λ_1 = ∫_t^s (aX_1(τ) + bX_2(τ) + λ_1) dτ,
Λ_2 = ∫_t^s (cX_1(τ) + dX_2(τ) + λ_2) dτ,

and X_i(0,T] denotes {X_i(t) : 0 < t ≤ T}. We use Y_1 = P(aX_1(0,T] + bX_2(0,T] + λ_1) and Y_2 = P(cX_1(0,T] + dX_2(0,T] + λ_2) to denote the output processes at receivers 1 and 2, respectively. Figure 1 illustrates the Poisson interference channel under consideration. It is easy to see that the received signals at the receivers are nondecreasing step functions.

(In this paper, only the peak power constraint is considered. Certain results can be extended to the case with additional average power constraints.)

Fig. 1. The Poisson interference channel.

We use M_i to denote the message of transmitter i. A code (T, M_1, M_2, μ_1, μ_2) is defined to include:
1) two message sets M_i = {1, ..., M_i} for i = 1, 2;
2) two sets of waveforms x_{i,m_i}(·) for m_i ∈ {1, ..., M_i} and i = 1, 2, each of which satisfies the power constraint (1);
3) two encoders f_i at transmitter i, for i = 1, 2, each of which maps a message m_i to a waveform x_{i,m_i}(·);
4) two decoders D_i at receiver i, for i = 1, 2: Y_i(0,T] → M̂_i ∈ {1, ..., M_i}, each of which maps a received signal Y_i(0,T] to a message m̂_i;
5) two average error probabilities μ_i = Pr{M̂_i ≠ M_i} for i = 1, 2.

A rate pair (R_1, R_2) is said to be achievable if, for any ε > 0, there exists a sequence of codes (T, M_1, M_2, μ_1, μ_2) with M_i ≥ 2^{T R_i} and μ_i ≤ ε for i = 1, 2. The capacity region C is the closure of the set of all achievable rate pairs. We are also interested in the sum rate capacity, defined as

C_sum = max_{(R_1, R_2) ∈ C} {R_1 + R_2}.

In order to simplify the notation, we define p̄ = 1 − p for 0 ≤ p ≤ 1, and define the following functions, which are used throughout the paper:

ψ(α, β; γ, λ) := (α + β + λ) ln(α + β + λ) − (γ + β + λ) ln(γ + β + λ),   (2)
φ(p, A, λ) := p̄ λ ln λ + p(A + λ) ln(A + λ) − (Ap + λ) ln(Ap + λ).   (3)

In addition, we use Ỹ = Y ∘ p to denote the thinning operation with parameter p, in which we obtain a process Ỹ by randomly and independently erasing points from a Poisson process Y with probability p̄ (thus keeping each point of Y with probability p).

In the definition above, if we require both decoders to decode both messages, we then have the Poisson compound MAC, whose capacity region is clearly outer-bounded by that of the Poisson interference channel.
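As a quick numerical sanity check, the two functions in (2) and (3) and the thinning operation can be coded directly (a sketch; the function and variable names are ours, not from the paper):

```python
import math
import random

def psi(alpha, beta, gamma, lam):
    """psi(alpha, beta; gamma, lambda) of eq. (2)."""
    return ((alpha + beta + lam) * math.log(alpha + beta + lam)
            - (gamma + beta + lam) * math.log(gamma + beta + lam))

def phi(p, A, lam):
    """phi(p, A, lambda) of eq. (3), with pbar = 1 - p."""
    pbar = 1.0 - p
    return (pbar * lam * math.log(lam)
            + p * (A + lam) * math.log(A + lam)
            - (A * p + lam) * math.log(A * p + lam))

def thin(points, p, rng=None):
    """Thinning Y o p: keep each point independently with probability p."""
    rng = rng or random.Random(0)
    return [t for t in points if rng.random() < p]
```

Note that φ(0, A, λ) = φ(1, A, λ) = 0, matching the fact that a deterministic input carries no information, and that ψ(γ, β; γ, λ) = 0 for any β.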

Fig. 2. The Poisson Z-interference channel.

Fig. 3. The channel transformations at the receivers: (a) transformation at receiver 1; (b) transformation at receiver 2.

B. Poisson Z-Interference Channel

In this paper, we will also consider the Poisson Z-interference channel shown in Figure 2. In the Poisson Z-interference channel, one of the receivers (say receiver 1) does not experience any interference. One can obtain the Poisson Z-interference channel model by setting b = 0 in the Poisson interference channel discussed in Section II-A. For the Poisson Z-interference channel, we will focus on the sum rate capacity.

C. Differences Between the Poisson and Gaussian Interference Channels

In the following, we summarize the differences between the Poisson and Gaussian interference channels, which helps to explain why the analysis of the Poisson interference channel requires the development of new techniques. There are two main differences between the Poisson interference channel and the Gaussian interference channel. First, the Poisson channel is not scale-invariant. More specifically, for a constant a ≠ 1, the process P(aX_1(0,T] + λ_1) is not a scaled version of P(X_1(0,T] + λ_1/a); as a result, I(X_1(0,T]; P(aX_1(0,T] + λ_1)) is in general different from I(X_1(0,T]; P(X_1(0,T] + λ_1/a)). Second, in the Poisson interference channel, even if X_1(0,T] is known, it is not possible to construct P(bX_2(0,T] + λ_1) from P(aX_1(0,T] + bX_2(0,T] + λ_1); that is, it is not possible to completely cancel the effect of X_1(0,T] from the output even if its value is known. Hence, complete interference cancellation is not possible in the Poisson interference channel. On the other hand, the Poisson channel also has the infinite-divisibility and independent-increments properties that the Gaussian channel has. This similarity allows us to apply certain ideas from the Gaussian interference channel to the Poisson interference channel.

III.
CAPACITY REGION FOR THE STRONG INTERFERENCE CHANNEL

In this section, we characterize the capacity region of the Poisson interference channel in the so-called strong interference regime. We obtain the capacity region result by showing that, if the channel parameters satisfy certain conditions, the capacity region of the Poisson interference channel is the same as that of a compound Poisson MAC. The central step of our proof is to show that, under these conditions, any rate pair (R_1, R_2) achievable for the Poisson interference channel is also achievable for the Poisson compound MAC. Since any rate pair achievable for the compound MAC is clearly achievable for the interference channel, the equivalence of the two capacity regions then follows. The argument for the equivalence of these two channels is slightly different from that for the Gaussian interference channel [16], due to the differences between the Gaussian and Poisson channels discussed in Section II. As detailed in the proof, it involves enhancement and thinning arguments.

Theorem 3.1: If

a/c ≤ λ_1/λ_2 ≤ b/d   (4)

and

a/c ≤ 1 ≤ b/d,   (5)

the capacity region of the Poisson interference channel is the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ min{ Eψ(aA_1 X_1^b, bA_2 X_2^b; aA_1 p, λ_1), Eψ(cA_1 X_1^b, dA_2 X_2^b; cA_1 p, λ_2) },
R_2 ≤ min{ Eψ(bA_2 X_2^b, aA_1 X_1^b; bA_2 q, λ_1), Eψ(dA_2 X_2^b, cA_1 X_1^b; dA_2 q, λ_2) },
R_1 + R_2 ≤ min{ Eψ(aA_1 X_1^b + bA_2 X_2^b, 0; aA_1 p + bA_2 q, λ_1), Eψ(cA_1 X_1^b + dA_2 X_2^b, 0; cA_1 p + dA_2 q, λ_2) },   (6)

in which the expectation is computed for binary input distributions with P[X_1^b = 1] = 1 − P[X_1^b = 0] = p and P[X_2^b = 1] = 1 − P[X_2^b = 0] = q.

Proof: As discussed above, it suffices to show that, if the conditions of the theorem are satisfied, any rate pair achievable in the Poisson interference channel is also achievable in the Poisson compound MAC. We show this via enhancement and thinning arguments. Figure 3(a) shows the operations involved at receiver 1. Suppose that a rate pair (R_1, R_2) is achievable for the Poisson interference channel.
Hence, there exists a code (T, M_1, M_2, μ_1, μ_2) such that (1/T) log M_i ≥ R_i − ε for any ε > 0 and sufficiently large T. Furthermore, decoder 1 can decode X_1(0,T] with an average error probability less than μ_1. Once X_1(0,T] is successfully decoded, decoder 1 generates a doubly stochastic Poisson process P((bc/d − a)X_1(0,T] + bλ_2/d − λ_1), which, given X_1(0,T], is independent of all other processes considered in the problem and has instantaneous rate (bc/d − a)X_1(t) + bλ_2/d − λ_1. We can do this as long as bc/d ≥ a and bλ_2/d ≥ λ_1. Decoder 1 then adds this process to Y_1 and obtains Ỹ_1 = Y_1 + P((bc/d − a)X_1(0,T] + bλ_2/d − λ_1). Due to the properties of the Poisson process [21], Ỹ_1 is a doubly stochastic Poisson process with instantaneous rate (bc/d)X_1(t) + bX_2(t) + bλ_2/d. Finally, decoder 1 constructs the process Ỹ_2 = Ỹ_1 ∘ (d/b). This can be done if d/b ≤ 1. Again, due to the properties of the Poisson process [21], Ỹ_2 is a doubly stochastic Poisson process with instantaneous rate (d/b)((bc/d)X_1(t) + bX_2(t) + bλ_2/d) = cX_1(t) + dX_2(t) + λ_2. Hence, the joint statistics of (Ỹ_2(t), X_1(t), X_2(t)) are the same as those of (Y_2(t), X_1(t), X_2(t)). Since decoder 2 is able to decode X_2(0,T] based on Y_2 with an error probability less than μ_2, decoder 1 is able to decode X_2(0,T] based on Ỹ_2 with an error probability less than μ_2 as well, provided it decodes X_1 successfully. Hence, receiver 1 is able to decode X_2 based on Y_1 with an error probability less than μ_1 + μ_2. To carry out the operations above, we require that

bc/d ≥ a,  bλ_2/d ≥ λ_1,  d/b ≤ 1.   (7)

Similarly, Figure 3(b) illustrates the operations involved at receiver 2. If the following conditions are satisfied, decoder 2 is able to decode both X_1, with an error probability less than μ_1 + μ_2, and X_2, with an error probability less than μ_2, based on Y_2:

bc/a ≥ d,  cλ_1/a ≥ λ_2,  a/c ≤ 1.   (8)

It is clear that conditions (7) and (8) together are equivalent to (4) and (5). Hence, if these conditions are satisfied, the capacity region of the Poisson interference channel is the same as that of the compound MAC.

The Poisson MAC has been studied in [11]. Using [11, eq. (2.6)], we know that the capacity region of the compound MAC is contained in the union of all rate pairs that satisfy

R_1 ≤ min{ (1/T)∫_0^T Eψ(aA_1 X_1(t), bA_2 X_2(t); aA_1 p, λ_1) dt, (1/T)∫_0^T Eψ(cA_1 X_1(t), dA_2 X_2(t); cA_1 p, λ_2) dt },
R_2 ≤ min{ (1/T)∫_0^T Eψ(bA_2 X_2(t), aA_1 X_1(t); bA_2 q, λ_1) dt, (1/T)∫_0^T Eψ(dA_2 X_2(t), cA_1 X_1(t); dA_2 q, λ_2) dt },
R_1 + R_2 ≤ min{ (1/T)∫_0^T Eψ(aA_1 X_1(t) + bA_2 X_2(t), 0; aA_1 p + bA_2 q, λ_1) dt, (1/T)∫_0^T Eψ(cA_1 X_1(t) + dA_2 X_2(t), 0; cA_1 p + dA_2 q, λ_2) dt }.
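The intensity bookkeeping in the receiver-1 argument above is easy to check mechanically (a sketch; the function names are ours, not from the paper):

```python
def enhanced_rate(x1, x2, a, b, c, d, lam1, lam2):
    """Rate of Y1tilde = Y1 + P((bc/d - a)X1 + b*lam2/d - lam1) at a time
    instant where X1(t) = x1 and X2(t) = x2."""
    y1_rate = a * x1 + b * x2 + lam1
    added_rate = (b * c / d - a) * x1 + b * lam2 / d - lam1
    assert added_rate >= 0, "requires bc/d >= a and b*lam2/d >= lam1"
    return y1_rate + added_rate        # = (bc/d)x1 + b*x2 + b*lam2/d

def thinned_rate(rate, p):
    """Thinning with keep-probability p multiplies a Poisson rate by p."""
    assert 0.0 <= p <= 1.0
    return p * rate
```

For any admissible pair (x1, x2), thinned_rate(enhanced_rate(...), d/b) equals c*x1 + d*x2 + lam2, the intensity seen by receiver 2, which is the key step of the proof.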
Furthermore, following [11], we know that binary quickly varying inputs achieve the equalities above. We use X_i^b to denote the input to the binary MAC obtained by this approximation. Using [11, eqs. (3.4), (3.9)], we conclude that the capacity region of the compound MAC is given by (6).

IV. DISCRETE-TIME BINARY CHANNEL APPROXIMATION

In this section, we show that a Poisson interference channel can be approximated by a discrete-time binary-input binary-output (BIBO) memoryless interference channel that has the same capacity region (although other quantities, such as the error exponent, might differ). This result will be used for deriving capacity bounds in the following sections.

Following [4], we approximate Poisson interference channels by BIBO discrete memoryless channels as follows. We fix some small constant Δ > 0 and divide the transmission time into intervals, each of duration Δ. We further require that:
1) the transmitted signal X_i(t) takes only the values 0 and A_i and is constant over the interval ((n−1)Δ, nΔ] for every n = 1, 2, ...;
2) receiver i observes only the samples Y_i(nΔ), n = 1, 2, ..., or, alternatively, the increments Ŷ_{i,n} = Y_i(nΔ) − Y_i((n−1)Δ). Furthermore, receiver i interprets Ŷ_{i,n} ≥ 2 (a rare event when Δ is small) as being the same as Ŷ_{i,n} = 1. Thus, receiver i obtains the output

Y_{i,n} = { 1, if Ŷ_{i,n} ≥ 1;  0, if Ŷ_{i,n} = 0. }   (9)

Although the capacity regions of both the general Poisson interference channel and the binary interference channel are unknown, the following result shows that this approximation does not reduce the capacity region of the Poisson interference channel.

Theorem 4.1: Let {x_{i,m_i}(·)}, i = 1, 2, be any code with parameters (T, M_1, M_2, μ_1, μ_2) for the Poisson interference channel, and let ε > 0 be arbitrary.
Then there must exist a code {x̂_{i,m_i}(·)} with parameters (T, M_1, M_2, μ̂_1, μ̂_2) satisfying μ̂_i ≤ μ_i + ε and the following properties, for some N and Δ = T/N:
a) the decoder mapping is D̂_i : {Y_{i,k}}_{k=1}^N → {1, 2, ..., M_i};
b) for each m_i, the waveform x̂_{i,m_i}(t) is constant on the interval ((n−1)Δ, nΔ], 1 ≤ n ≤ N, and x̂_{i,m_i}(t) takes only the values 0 or A_i.

Proof: The proof adapts Wyner's proof of the optimality of the binary approximation for the point-to-point Poisson channel [4] to take into account the multiuser nature of the interference channel. Details can be found in Appendix A.

With this approximation, the Poisson interference channel can be converted into a binary discrete-time memoryless channel. In the following, we use X_i^b and Y_i^b, i = 1, 2, to denote the inputs and outputs of the corresponding binary interference channel, respectively. Based on Theorem 4.1, the transition probabilities V_1(Y_1^b | X_1^b, X_2^b) (to receiver 1) and V_2(Y_2^b | X_1^b, X_2^b) (to receiver 2) of the binary interference channel are given by:

V_1(0|0,0) = 1 − V_1(1|0,0) = exp(−λ_1 Δ) = 1 − λ_1 Δ + O(Δ^2),
V_1(0|1,0) = 1 − V_1(1|1,0) = exp(−(aA_1 + λ_1)Δ) = 1 − (aA_1 + λ_1)Δ + O(Δ^2),
V_1(0|0,1) = 1 − V_1(1|0,1) = exp(−(bA_2 + λ_1)Δ) = 1 − (bA_2 + λ_1)Δ + O(Δ^2),

V_1(0|1,1) = 1 − V_1(1|1,1) = exp(−(aA_1 + bA_2 + λ_1)Δ) = 1 − (aA_1 + bA_2 + λ_1)Δ + O(Δ^2),
V_2(0|0,0) = 1 − V_2(1|0,0) = exp(−λ_2 Δ) = 1 − λ_2 Δ + O(Δ^2),
V_2(0|1,0) = 1 − V_2(1|1,0) = exp(−(cA_1 + λ_2)Δ) = 1 − (cA_1 + λ_2)Δ + O(Δ^2),
V_2(0|0,1) = 1 − V_2(1|0,1) = exp(−(dA_2 + λ_2)Δ) = 1 − (dA_2 + λ_2)Δ + O(Δ^2),
V_2(0|1,1) = 1 − V_2(1|1,1) = exp(−(cA_1 + dA_2 + λ_2)Δ) = 1 − (cA_1 + dA_2 + λ_2)Δ + O(Δ^2).

V. CAPACITY BOUNDS

In this section, we derive inner and outer bounds on the capacity region for general Poisson interference channels that do not satisfy the strong interference conditions (4) and (5).

A. Inner Bound for General Poisson Interference Channels

After the binary approximation, we can obtain an achievable region for the Poisson interference channel by evaluating the Han–Kobayashi region [20] of the corresponding binary discrete-time interference channel for certain input distributions. We evaluate the simplified Han–Kobayashi region [22] of the binary interference channel using |Q| = 1 (i.e., no time-sharing), with W_1 and W_2 binary random variables such that P_{W_1}(1) = 1 − P_{W_1}(0) = p_1 and P_{W_2}(1) = 1 − P_{W_2}(0) = p_2. We further assume that X_1^b is generated from W_1 via a binary symmetric test channel with parameter q_1, and X_2^b is generated from W_2 via a binary symmetric test channel with parameter q_2. Using this input distribution, we can compute the various mutual information terms involved in the Han–Kobayashi region. To obtain an achievable rate region for the original Poisson interference channel from the achievable region of this binary interference channel, we then apply the proper normalization by 1/Δ. The computation of the various mutual information terms under this input distribution can be found in Appendix C.

B. Outer Bounds for General Poisson Interference Channels

In this section, we derive several outer bounds on the capacity region of the Poisson interference channel when one or both of the strong interference conditions (4) and (5) are not satisfied.
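The binary-approximation transition probabilities above are straightforward to tabulate numerically. The sketch below (our own helper, using the exact exponential expressions rather than the O(Δ^2) expansions) covers one receiver; receiver 1 corresponds to g1 = a·A_1, g2 = b·A_2, lam = λ_1:

```python
import math
from itertools import product

def bibo_no_count_probs(g1, g2, lam, delta):
    """V(Y=0 | x1, x2) for one receiver of the binary approximation:
    over an interval of length delta, a Poisson process of rate
    g1*x1 + g2*x2 + lam produces no point with probability exp(-rate*delta)."""
    return {(x1, x2): math.exp(-(g1 * x1 + g2 * x2 + lam) * delta)
            for x1, x2 in product((0, 1), (0, 1))}
```

Here V(1|x1,x2) = 1 − V(0|x1,x2), and for small delta, 1 − V(0|x1,x2) ≈ (g1·x1 + g2·x2 + lam)·delta, matching the first-order expansions listed above.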
The following table lists the main propositions, the corresponding scenarios, and the main technique used in each case.

Proposition        Condition                         Technique
Proposition 5.1    (4) does not hold, (5) holds      noise reduction
Proposition 5.2    (4) holds, (5) does not hold      channel transformation
Proposition 5.4    neither (4) nor (5) holds         genie-aided
Proposition 5.5    both (4) and (5) hold partially   genie-aided

1) Outer Bound When (5) Holds But (4) Does Not Hold: The first outer bound applies when (5) is satisfied but (4) is not. In this case, we have either λ_1/λ_2 > b/d or a/c > λ_1/λ_2. We consider the case λ_1/λ_2 > b/d; the other case is similar. We derive an outer bound based on reducing the intensity of the noise. Consider an enhanced Poisson interference channel whose output process at receiver 1 is Ỹ_1 = P(aX_1(0,T] + bX_2(0,T] + (b/d)λ_2) and whose output process at receiver 2 is the same as Y_2. Since λ_1/λ_2 > b/d, we have (b/d)λ_2 < λ_1. It is clear that any rate pair (R_1, R_2) achievable for the original Poisson interference channel is also achievable for the channel specified by (X_1, X_2) → (Ỹ_1, Y_2), because one can generate a pair (Ŷ_1, Y_2) whose joint statistics with (X_1, X_2) are the same as those of (Y_1, Y_2, X_1, X_2) by adding to Ỹ_1 an independently generated process P(λ_1 − (b/d)λ_2). Now the channel (X_1, X_2) → (Ỹ_1, Y_2) is a strong interference channel, since condition (4) is satisfied with the reduced noise. As a result, we have the following proposition.

Proposition 5.1: When (5) is satisfied and λ_1/λ_2 > b/d, the capacity region of the Poisson interference channel is contained in the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ min{ Eψ(aA_1 X_1^b, bA_2 X_2^b; aA_1 p, bλ_2/d), Eψ(cA_1 X_1^b, dA_2 X_2^b; cA_1 p, λ_2) },
R_2 ≤ min{ Eψ(bA_2 X_2^b, aA_1 X_1^b; bA_2 q, bλ_2/d), Eψ(dA_2 X_2^b, cA_1 X_1^b; dA_2 q, λ_2) },
R_1 + R_2 ≤ min{ Eψ(aA_1 X_1^b + bA_2 X_2^b, 0; aA_1 p + bA_2 q, bλ_2/d), Eψ(cA_1 X_1^b + dA_2 X_2^b, 0; cA_1 p + dA_2 q, λ_2) },   (10)

with the expectation computed for binary input distributions with P[X_1^b = 1] = 1 − P[X_1^b = 0] = p and P[X_2^b = 1] = 1 − P[X_2^b = 0] = q.
2) Outer Bound When (4) Holds But (5) Does Not Hold: The second outer bound applies when (4) is satisfied but (5) is not. In this case, we have either 1 > b/d ≥ a/c or b/d ≥ a/c > 1. We consider the case 1 > b/d ≥ a/c; the other case is similar. We derive an outer bound using arguments similar to those of Theorem 3.1. If condition (4) and 1 > b/d ≥ a/c are satisfied, all operations in Figure 3(b) remain valid, and hence under these conditions receiver 2 is able to decode both X_1 and X_2. As a result, the capacity region of the Poisson interference channel remains the same if we require receiver 2 to decode both messages. Hence, we have the following result.

Proposition 5.2: When (4) is satisfied and 1 > b/d ≥ a/c, the capacity region of the original Poisson interference channel is contained in the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ min{ Eψ(aA_1 X_1^b, 0; aA_1 p, λ_1), Eψ(cA_1 X_1^b, dA_2 X_2^b; cA_1 p, λ_2) },
R_2 ≤ Eψ(dA_2 X_2^b, cA_1 X_1^b; dA_2 q, λ_2),   (11)
R_1 + R_2 ≤ Eψ(cA_1 X_1^b + dA_2 X_2^b, 0; cA_1 p + dA_2 q, λ_2),

with the expectation computed for binary input distributions with P[X_1^b = 1] = 1 − P[X_1^b = 0] = p and P[X_2^b = 1] = 1 − P[X_2^b = 0] = q.

In the bound above, the first term on R_1 follows from the fact that R_1 is upper-bounded by the rate achievable over an interference-free link; the second term comes from the fact that receiver 2 is also able to decode X_1, as discussed above.

3) Outer Bound When Neither (5) Nor (4) Holds: The third outer bound applies when a/c > b/d. We derive it via a genie-aided argument. We first consider the situation when a/c > 1 and a/c ≥ λ_1/λ_2. We derive this bound by providing extra information to receiver 1. Suppose a genie generates a doubly stochastic Poisson process Y′ = P((ad/c − b)X_2(0,T] + (aλ_2 − cλ_1)/c), which is independent of all other processes given X_2(0,T], and provides it to receiver 1 (see Figure 4 for an illustration).

Fig. 4. The channel transformation at receiver 1.

Based on Y_1 and this additional information, receiver 1 can generate the Poisson process Ỹ_2 = (Y_1 + Y′) ∘ (c/a). It is easy to verify that Ỹ_2 is a doubly stochastic Poisson process with instantaneous rate cX_1(t) + dX_2(t) + λ_2, and hence the joint statistics of (Ỹ_2, X_1, X_2) are the same as those of (Y_2, X_1, X_2). Since receiver 2 can decode X_2 from Y_2 with an error probability less than μ_2, receiver 1 is also able to decode X_2 from Ỹ_2. As a result, the capacity region of the original Poisson interference channel must be contained in the capacity region of the new MAC in which receiver 1 has the two observation processes (Y_1, Y′). We first have the following lemma.

Lemma 5.3: The capacity region of the MAC (X_1, X_2) → (Y_1, Y′) is the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ Eψ(aA_1 X_1^b, bA_2 X_2^b; aA_1 p, λ_1),
R_2 ≤ Eψ(bA_2 X_2^b, aA_1 X_1^b; bA_2 q, λ_1) + Eψ((ad/c − b)A_2 X_2^b, 0; (ad/c − b)A_2 q, (aλ_2 − cλ_1)/c),
R_1 + R_2 ≤ Eψ(aA_1 X_1^b + bA_2 X_2^b, 0; aA_1 p + bA_2 q, λ_1) + Eψ((ad/c − b)A_2 X_2^b, 0; (ad/c − b)A_2 q, (aλ_2 − cλ_1)/c),

with the expectation computed for binary input distributions with P[X_1^b = 1] = 1 − P[X_1^b = 0] = p and P[X_2^b = 1] = 1 − P[X_2^b = 0] = q.

Proof: The proof follows closely those of [11] and [23]; details are provided in Appendix B.

Based on the above lemma, we have the following proposition.

Proposition 5.4: When a/c > 1 and a/c ≥ λ_1/λ_2, the capacity region of the Poisson interference channel is contained in the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ Eψ(aA_1 X_1^b, bA_2 X_2^b; aA_1 p, λ_1),
R_2 ≤ min{ Eψ(dA_2 X_2^b, 0; dA_2 q, λ_2), Eψ(bA_2 X_2^b, aA_1 X_1^b; bA_2 q, λ_1) + Eψ((ad/c − b)A_2 X_2^b, 0; (ad/c − b)A_2 q, (aλ_2 − cλ_1)/c) },
R_1 + R_2 ≤ Eψ(aA_1 X_1^b + bA_2 X_2^b, 0; aA_1 p + bA_2 q, λ_1) + Eψ((ad/c − b)A_2 X_2^b, 0; (ad/c − b)A_2 q, (aλ_2 − cλ_1)/c),   (12)

with the expectation computed for binary input distributions as above.

Proof: As a/c > 1 and a/c ≥ λ_1/λ_2, all bounds in Lemma 5.3 hold. In addition, R_2 ≤ Eψ(dA_2 X_2^b, 0; dA_2 q, λ_2), since R_2 cannot exceed the capacity of the interference-free point-to-point channel from transmitter 2 to receiver 2, i.e., the channel X_2 → P(dX_2 + λ_2).

For the case when a/c > b/d, i.e., d/b > c/a, we reverse the roles of receiver 1 and receiver 2 in the argument above. Namely, a genie provides receiver 2 with extra information enabling it to decode X_1. Then the capacity region of the original Poisson interference channel is contained in the capacity region of the corresponding enhanced MAC.

4) Outer Bound When (5) and (4) Hold Partially: The genie-aided outer bound developed in Section V-B3 can be extended to the case in which (5) and (4) hold partially. Suppose a genie provides a doubly stochastic Poisson process Ŷ = P(δ_1 X_2(0,T] + δ_2) to receiver 1, where, given X_2(0,T], Ŷ is independent of all other processes of interest in this problem. Obviously, this does not decrease the capacity region of the Poisson interference channel. Since receiver 1 can decode X_1(0,T] with an error probability less than μ_1 from the process Y_1, receiver 1 can also generate a process Ỹ = P(δ_3 X_1(0,T]) that is independent of all other processes given X_1(0,T]. Now we construct the process Ŷ_2 = (Y_1 + Ŷ + Ỹ) ∘ (1/δ). Then Ŷ_2 is a doubly stochastic Poisson process with instantaneous rate

((a + δ_3)/δ) X_1(t) + ((b + δ_1)/δ) X_2(t) + (λ_1 + δ_2)/δ.
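With the parameter choices made next (δ_3 = cδ − a, δ_1 = dδ − b, δ_2 = λ_2δ − λ_1), this rate bookkeeping can be checked numerically (a sketch; the function and variable names are ours, not from the paper):

```python
def genie_rate(x1, x2, a, b, c, d, lam1, lam2, delta):
    """Instantaneous rate of Yhat2 = (Y1 + Yhat + Ytilde) thinned with
    keep-probability 1/delta, where Yhat has rate d1*X2 + d2 and
    Ytilde has rate d3*X1, with d3 = c*delta - a, d1 = d*delta - b,
    d2 = lam2*delta - lam1."""
    d3 = c * delta - a
    d1 = d * delta - b
    d2 = lam2 * delta - lam1
    assert min(d1, d2, d3) >= 0 and delta >= 1, \
        "requires delta >= max(a/c, b/d, lam1/lam2, 1)"
    total = (a + d3) * x1 + (b + d1) * x2 + (lam1 + d2)
    return total / delta   # thinning by keep-probability 1/delta
```

For every admissible (x1, x2), genie_rate returns c·x1 + d·x2 + lam2, confirming that Ŷ_2 has the same law as Y_2.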

Now if we set

δ_3 = cδ − a,  δ_1 = dδ − b,  δ_2 = λ_2 δ − λ_1,

then the process Ŷ_2 has the same joint statistics with (X_1, X_2) as (Y_2, X_1, X_2). Hence, from Ŷ_2, receiver 1 is able to decode X_2 with an error probability less than μ_2. To carry out the operations above, we require that δ ≥ max{a/c, b/d, λ_1/λ_2}. As a result, receiver 1 is able to decode both X_1 and X_2 from (Y_1, Ŷ). Using the result in Lemma 5.3, we have the following proposition.

Proposition 5.5: For general parameters, the capacity region of the Poisson interference channel is contained in the convex hull of ∪_{p,q} R_{p,q}, with R_{p,q} being the set of rate pairs satisfying

R_1 ≤ Eψ(aA_1 X_1^b, bA_2 X_2^b; aA_1 p, λ_1),
R_2 ≤ min{ Eψ(dA_2 X_2^b, 0; dA_2 q, λ_2), Eψ(bA_2 X_2^b, aA_1 X_1^b; bA_2 q, λ_1) + Eψ((dδ − b)A_2 X_2^b, 0; (dδ − b)A_2 q, λ_2 δ − λ_1) },
R_1 + R_2 ≤ Eψ(aA_1 X_1^b + bA_2 X_2^b, 0; aA_1 p + bA_2 q, λ_1) + Eψ((dδ − b)A_2 X_2^b, 0; (dδ − b)A_2 q, λ_2 δ − λ_1),   (13)

with the expectation computed for binary input distributions with P[X_1^b = 1] = 1 − P[X_1^b = 0] = p and P[X_2^b = 1] = 1 − P[X_2^b = 0] = q, and δ any parameter that satisfies δ ≥ max{a/c, b/d, λ_1/λ_2}.

One can follow the same steps and derive another outer bound by providing genie-aided information to receiver 2.

VI. POISSON Z-INTERFERENCE CHANNEL

In this section, we study the Poisson Z-interference channel illustrated in Figure 2. The sum capacity of the Gaussian Z-interference channel has been determined in [24] for all parameter ranges. The argument relies on the ability, in the Gaussian channel, to completely cancel the interference once the interfering signal is decoded. However, such an argument does not work for the Poisson channel. In the following, we characterize the sum capacity of the Poisson Z-interference channel in certain parameter ranges by studying the corresponding discrete-time memoryless binary Z-interference channel, thus leveraging the capacity results for the discrete-time memoryless Z-interference channel [25].
We first determine the sum capacity of the Poisson Z-interference channel when c is sufficiently small, i.e., in the weak interference regime.

Theorem 6.1: For the Poisson Z-interference channel, if

c < min{a, aλ_2/λ_1},   (14)

then the sum capacity is given by

C_sum = max_{0≤p≤1} [ φ(p, aA_1, λ_1) + (u/e)(1 + dA_2/u)^{1 + u/(dA_2)} − u(1 + u/(dA_2)) ln(1 + dA_2/u) ],

with u = pcA_1 + λ_2. Furthermore, the sum capacity is achieved by receiver 2 treating the signal of transmitter 1 as noise.

Fig. 5. Construction of a degraded Z-interference channel.

Proof: First, it follows from Theorem 4.1 that the capacity region of the Poisson Z-interference channel is the same as the capacity region of the following discrete-time memoryless binary Z-interference channel (after normalizing by 1/Δ) as Δ tends to zero. Again, we use X_i^b and Y_i^b, i = 1, 2, to denote the inputs and outputs of the corresponding binary Z-interference channel. Using the binary approximation discussed in Section V-A, the transition probabilities V_1(Y_1^b | X_1^b) (to receiver 1; it is independent of X_2^b) and V_2(Y_2^b | X_1^b, X_2^b) (to receiver 2) of the binary Z-interference channel are given by:

V_1(0|0) = 1 − V_1(1|0) = 1 − λ_1 Δ + O(Δ^2),   (15)
V_1(0|1) = 1 − V_1(1|1) = 1 − (aA_1 + λ_1)Δ + O(Δ^2),   (16)
V_2(0|0,0) = 1 − V_2(1|0,0) = 1 − λ_2 Δ + O(Δ^2),   (17)
V_2(0|0,1) = 1 − V_2(1|0,1) = 1 − (dA_2 + λ_2)Δ + O(Δ^2),   (18)
V_2(0|1,0) = 1 − V_2(1|1,0) = 1 − (cA_1 + λ_2)Δ + O(Δ^2),   (19)
V_2(0|1,1) = 1 − V_2(1|1,1) = 1 − (cA_1 + dA_2 + λ_2)Δ + O(Δ^2).   (20)

Using these transition probabilities, in the following we show that if c < min{a, aλ_2/λ_1}, then there exists a distribution W(Y_2^b | X_2^b, Y_1^b) such that

V_2(y_2^b | x_1^b, x_2^b) = Σ_{y_1^b} V_1(y_1^b | x_1^b) W(y_2^b | x_2^b, y_1^b), for all (x_1^b, x_2^b, y_2^b).   (21)

We can characterize W(Y_2^b | X_2^b, Y_1^b) by the parameters

q_1 = W(1|0,0) = 1 − W(0|0,0),  q_2 = W(1|1,0) = 1 − W(0|1,0),
q_3 = W(1|0,1) = 1 − W(0|0,1),  q_4 = W(1|1,1) = 1 − W(0|1,1).

The construction is illustrated in Figure 5 (in the figure, we show only the transition probabilities to 1). To satisfy

the conditions in (21), the q_i, i = 1, ..., 4, should be chosen such that (ignoring O(Δ²) terms):

V_2(1|0,0) = λ_2Δ = q_1(1 − λ_1Δ) + q_3λ_1Δ,
V_2(1|0,1) = (dA_2 + λ_2)Δ = q_2(1 − λ_1Δ) + q_4λ_1Δ,
V_2(1|1,0) = (cA_1 + λ_2)Δ = q_1(1 − (aA_1 + λ_1)Δ) + q_3(aA_1 + λ_1)Δ,
V_2(1|1,1) = (cA_1 + dA_2 + λ_2)Δ = q_2(1 − (aA_1 + λ_1)Δ) + q_4(aA_1 + λ_1)Δ.

From these equations, we obtain

q_1 = (aλ_2 − cλ_1)Δ/a,
q_2 = (adA_2 + aλ_2 − cλ_1)Δ/a,
q_3 = c/a + (aλ_2 − cλ_1)Δ/a,
q_4 = c/a + (adA_2 + aλ_2 − cλ_1)Δ/a.

Now one can verify that if (14) holds, then we have 0 < q_i < 1 (we need strict inequalities to take care of the O(Δ²) terms) when Δ is sufficiently small. Since the capacity region of the Z-interference channel depends only on the marginal distributions at the two receivers, the capacity region of the channel specified in (15)–(20) is the same as the capacity region of the channel specified in Figure 5. The channel specified in Figure 5 satisfies the Markov chain X_1^b → (X_2^b, Y_1^b) → Y_2^b. The sum capacity of a discrete-time memoryless Z-interference channel that satisfies this Markov chain property has been determined in [25]. Applying the result in [25], we have the sum capacity of the Poisson Z-interference channel given by

C_sum = lim_{Δ→0} (1/Δ) max_{p,q} {I(X_1^b; Y_1^b) + I(X_2^b; Y_2^b)} = max_{p,q} {I_{X_1;Y_1} + I_{X_2;Y_2}},   (22)

in which p = Pr{X_1^b = 1}, q = Pr{X_2^b = 1}, and

I_{X_1;Y_1} ≜ lim_{Δ→0} (1/Δ) I(X_1^b; Y_1^b),  I_{X_2;Y_2} ≜ lim_{Δ→0} (1/Δ) I(X_2^b; Y_2^b).

This sum capacity is achieved by treating the other user's signal (if any) as noise. It is easy to verify that (22) can be written as

C_sum = max_{p,q} [φ(p, aA_1, λ_1) + φ(q, dA_2, pcA_1 + λ_2)].   (23)

To solve this optimization problem, we observe that φ(p, aA_1, λ_1) does not depend on q. For any p, by a simple calculation it can be shown that the optimal value of q that maximizes φ(q, dA_2, pcA_1 + λ_2) is given by

q* = (u/(dA_2)) [ (1 + dA_2/u)^{1 + u/(dA_2)} / e − 1 ],   (24)

where u = pcA_1 + λ_2.

Fig. 6. The transformation for the Poisson Z-interference channel.

Fig. 7. The sum capacity for the Poisson Z-interference channel.
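The degradation algebra above can be verified directly: composing V_1 with the q_i reproduces V_2 up to O(Δ²) (in fact exactly, since the q_i solve the linearized system). A sketch with illustrative parameters satisfying (14) (variable names are ours):

```python
# Verify V2(y2|x1,x2) = sum_{y1} V1(y1|x1) W(y2|x2,y1) to first order in Delta.
a, c, d = 2.0, 1.0, 1.0                     # c < min{a, a*lam2/lam1} holds
A1, A2, lam1, lam2 = 1.0, 1.0, 1.0, 1.0
D = 1e-4                                    # the slot width Delta

V1 = {0: lam1 * D, 1: (a * A1 + lam1) * D}  # P[Y1 = 1 | x1]
V2 = {(0, 0): lam2 * D, (0, 1): (d * A2 + lam2) * D,
      (1, 0): (c * A1 + lam2) * D, (1, 1): (c * A1 + d * A2 + lam2) * D}

q1 = (a * lam2 - c * lam1) * D / a
q2 = (a * d * A2 + a * lam2 - c * lam1) * D / a
q3 = c / a + (a * lam2 - c * lam1) * D / a
q4 = c / a + (a * d * A2 + a * lam2 - c * lam1) * D / a
W = {(0, 0): q1, (1, 0): q2, (0, 1): q3, (1, 1): q4}   # P[Y2 = 1 | x2, y1]

assert all(0 < q < 1 for q in (q1, q2, q3, q4))
for x1 in (0, 1):
    for x2 in (0, 1):
        composed = (1 - V1[x1]) * W[(x2, 0)] + V1[x1] * W[(x2, 1)]
        assert abs(composed - V2[(x1, x2)]) < 10 * D * D   # agreement up to O(Delta^2)
```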
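The maximizer (24) and the closed form in Theorem 6.1 can be cross-checked numerically; a small self-check sketch (the helper names are ours, and the value u = 1.455 corresponds to the later example with p = 0.455 and c = A_1 = λ_2 = 1):

```python
import math

def phi(q, A, lam):
    # phi(q, A, lam) = q(A+lam)ln(A+lam) + (1-q)lam ln(lam) - (qA+lam)ln(qA+lam)
    return (q * (A + lam) * math.log(A + lam) + (1 - q) * lam * math.log(lam)
            - (q * A + lam) * math.log(q * A + lam))

def q_star(A, u):
    # Eq. (24) with A = d*A2 and u = p*c*A1 + lam2
    return (u / A) * ((1 + A / u) ** (1 + u / A) / math.e - 1)

def rate_closed_form(A, u):
    # User 2's term in (25): (u+A)(1+A/u)^{u/A}/e - u(1+u/A)ln(1+A/u)
    return ((u + A) * (1 + A / u) ** (u / A) / math.e
            - u * (1 + u / A) * math.log(1 + A / u))

A, u = 1.0, 1.455
q = q_star(A, u)
assert 0 < q < 1
assert abs(phi(q, A, u) - rate_closed_form(A, u)) < 1e-9   # closed form = phi at q*
# q* beats a fine grid, i.e. it is indeed the maximizer over q
assert all(phi(x / 1000, A, u) <= phi(q, A, u) + 1e-12 for x in range(1, 1000))
```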
After detailed calculation, by plugging (24) into (23), we have

C_sum = max_p [φ(p, aA_1, λ_1) + (u + dA_2)(1 + dA_2/u)^{u/(dA_2)}/e − u(1 + u/(dA_2)) ln(1 + dA_2/u)],   (25)

which concludes the proof.

The condition in Theorem 6.1 can also be understood based on the transformation illustrated in Figure 6. From the figure, we can see that the capacity regions of these two channels are equivalent if the conditions in Theorem 6.1 are satisfied. Furthermore, we have the Markov chain X_1 → (X_2, Y_1) → Y_2 in the transformed channel.

In all numerical tests we have tried, the objective function in (25) is concave. However, we are not able to show analytically whether the function to be optimized is concave, because its form is too complicated.

For the Gaussian Z-interference channel with weak interference, in order to achieve the sum capacity, the transmitter whose corresponding receiver does not experience interference should transmit at the full possible rate [24]. The following example shows that this is no longer the case for the Poisson Z-interference channel. In this example, we set A_1 = A_2 = c = d = λ_1 = λ_2 = 1 and a = 2.1. It is easy to verify that these choices of parameters satisfy the conditions in Theorem 6.1. Figure 7 plots the values of R_1, R_2, and R_sum as functions of p. To maximize R_1, i.e., to maximize the rate of the user not experiencing interference,

we should choose p = 0.455. However, to maximize the sum rate, we should choose p = 0.439. Hence, for the Poisson Z-interference channel with weak interference, to achieve the sum capacity, the user not experiencing interference does not necessarily transmit at the maximum rate. It is also easy to show that the optimal value p*_sum that maximizes the sum rate is always less than or equal to the optimal value of p that maximizes R_1 of user 1 only.

We next characterize the sum capacity when the interference is sufficiently strong, i.e., in the strong interference regime.

Theorem 6.2: Define c* as the minimal value of c such that

min_p {φ(p, cA_1, dA_2 + λ_2) − φ(p, aA_1, λ_1)} = 0.   (26)

For the Poisson Z-interference channel, if c ≥ c*, the capacity region is the convex hull of ∪_{p,q} R_{pq} with

R_{pq} = {(R_1, R_2) : R_1 ≤ φ(p, aA_1, λ_1), R_2 ≤ (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2)},

and the sum capacity is given by

C_sum = max_{p,q} {φ(p, aA_1, λ_1) + (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2)}.   (27)

Remark 6.3: We note that in (27), φ(p, aA_1, λ_1) is not a function of q, while (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2) is a concave function of q for a given p. Hence, the optimization problem in (27) can be converted to a one-dimensional optimization problem by first solving for the optimal value of q, as a function of p, that maximizes (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2), and then optimizing the resulting objective function over p.

Proof: We consider the binary Z-interference channel obtained via the binary approximation. From Theorem 4.1, the capacity region of the Poisson Z-interference channel is the same as that of the binary Z-interference channel. For any p, q, we have

I_{X_1;Y_1} ≜ lim_{Δ→0} (1/Δ) I(X_1^b; Y_1^b) = φ(p, aA_1, λ_1),
I_{X_1;Y_2} ≜ lim_{Δ→0} (1/Δ) I(X_1^b; Y_2^b) = φ(p, cA_1, qdA_2 + λ_2),
I_{X_1;Y_2|X_2} ≜ lim_{Δ→0} (1/Δ) I(X_1^b; Y_2^b | X_2^b) = (1−q)φ(p, cA_1, λ_2) + qφ(p, cA_1, dA_2 + λ_2),   (28)
I_{X_2;Y_2|X_1} ≜ lim_{Δ→0} (1/Δ) I(X_2^b; Y_2^b | X_1^b) = (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2).

Converse: We first have a simple outer bound on the capacity region.
For any given p, q, define R_{pq} to be the set of (R_1, R_2) satisfying the following conditions:

R_1 ≤ I_{X_1;Y_1},   (29)
R_2 ≤ I_{X_2;Y_2|X_1}.   (30)

It is easy to see that the convex hull of ∪_{p,q} R_{pq} is an outer bound on the capacity region.

Achievability: If we additionally require receiver 2 to decode the message from transmitter 1, then any rate pair achievable with this additional requirement is also achievable for the Poisson Z-interference channel. For any given p, q, let R^a_{pq} be the set of (R_1, R_2) satisfying the following conditions:

R_1 ≤ I_{X_1;Y_1},   (31)
R_1 ≤ I_{X_1;Y_2|X_2},   (32)
R_2 ≤ I_{X_2;Y_2|X_1},   (33)
R_1 + R_2 ≤ I_{X_1,X_2;Y_2}.   (34)

As the first link is interference-free, if R_1 satisfies (31), receiver 1 is able to decode the first message. The channel from the two transmitters to receiver 2 can be viewed as a multiple-access channel. Therefore, as long as the bounds in (32)–(34) are satisfied, receiver 2 can decode both messages, and hence the second message. Hence, the convex hull of ∪_{p,q} R^a_{pq} is an inner bound on the capacity region of the Poisson Z-interference channel.

Now we simplify R^a_{pq}. First, we have

∂φ(p, A, λ)/∂λ = (1−p) ln λ + p ln(A + λ) − ln(pA + λ) ≤ 0,   (35)

which implies that φ(p, A, λ) is a non-increasing function of λ. Hence, φ(p, cA_1, λ_2) ≥ φ(p, cA_1, dA_2 + λ_2). Coupled with (28), this implies that for any p, we have

min_q I_{X_1;Y_2|X_2} = φ(p, cA_1, dA_2 + λ_2).

As a result, if min_p {φ(p, cA_1, dA_2 + λ_2) − φ(p, aA_1, λ_1)} ≥ 0, we have

I_{X_1;Y_2|X_2} ≥ I_{X_1;Y_1} for all p, q,   (36)

which implies that (32) is redundant. In addition, from (31) and (33), we have

R_1 + R_2 ≤ I_{X_1;Y_1} + I_{X_2;Y_2|X_1} ≤(a) I_{X_1;Y_2} + I_{X_2;Y_2|X_1} = I_{X_1,X_2;Y_2},

where (a) follows because

I_{X_1;Y_2} = φ(p, cA_1, qdA_2 + λ_2) ≥(b) φ(p, cA_1, dA_2 + λ_2) ≥(c) φ(p, aA_1, λ_1) = I_{X_1;Y_1},

in which (b) is true due to (35), and (c) is true if (26) holds. This implies that (34) is redundant. As a result, the region R^a_{pq} is the same as R_1 ≤ I_{X_1;Y_1}, R_2 ≤ I_{X_2;Y_2|X_1}.
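The monotonicity step (35) is easy to sanity-check numerically; a small sketch (the function names are ours):

```python
import math

def phi(p, A, lam):
    return (p * (A + lam) * math.log(A + lam) + (1 - p) * lam * math.log(lam)
            - (p * A + lam) * math.log(p * A + lam))

def dphi_dlam(p, A, lam):
    # Eq. (35): (1-p)ln(lam) + p ln(A+lam) - ln(pA+lam) <= 0 by concavity of ln
    return (1 - p) * math.log(lam) + p * math.log(A + lam) - math.log(p * A + lam)

for p in (0.1, 0.3, 0.5, 0.9):
    for A in (0.5, 1.0, 5.0):
        for lam in (0.2, 1.0, 4.0):
            h = 1e-6
            fd = (phi(p, A, lam + h) - phi(p, A, lam - h)) / (2 * h)
            assert abs(fd - dphi_dlam(p, A, lam)) < 1e-6   # matches finite difference
            assert dphi_dlam(p, A, lam) <= 0               # phi non-increasing in lam
```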

Fig. 8. Capacity region bounds when λ_1 = 0.5, λ_2 = 1, A_1 = 1, A_2 = 1, a = 1, b = 0.3, c = 0.2, d = 1.

Fig. 9. Capacity region bounds when λ_1 = 1, λ_2 = 0.2, A_1 = 1, A_2 = 1, a = 1, b = 1, c = 0.3, d = 0.1.

Since the simplified R^a_{pq} is the same as R_{pq}, the sum rate (actually the entire region) can be achieved by the following scheme: transmitter 1 transmits its message at a rate that both receiver 1 and receiver 2 can decode (when receiver 2 decodes it, receiver 2 treats the signal from transmitter 2 as interference); transmitter 2 then sends at the rate I_{X_2;Y_2|X_1}, and receiver 2 decodes the message from transmitter 2 after first decoding the message from transmitter 1. Hence, the sum rate capacity is given by

C_sum = max_{p,q} {φ(p, aA_1, λ_1) + (1−p)φ(q, dA_2, λ_2) + pφ(q, dA_2, cA_1 + λ_2)}.

Remark 6.4: The scenario is similar to that of the strong Poisson interference channel discussed in Theorem 3.1, in the sense that the cross-link coefficient is sufficiently large. However, since b = 0, the construction in the proof of Theorem 3.1 does not work here anymore.

Fig. 10. Capacity region bounds when a = 5, b = 1, c = 3, d = 1, λ_1 = λ_2 = 2, A_1 = 1, A_2 = 1.

VII. NUMERICAL EXAMPLES

In this section, we provide several numerical examples to illustrate the results derived in the paper. Figure 8 plots the bounds for the Poisson interference channel with λ_1 = 0.5, λ_2 = 1, A_1 = 1, A_2 = 1, a = 1, b = 0.3, c = 0.2, d = 1, which satisfy the condition a/c ≥ λ_1/λ_2 ≥ b/d. In this case, the outer bound derived in Proposition 5.1 is applicable. Figure 9 plots the bounds when λ_1 = 1, λ_2 = 0.2, A_1 = 1, A_2 = 1, a = 1, b = 1, c = 0.3, d = 0.1, which satisfy b/d ≥ λ_1/λ_2 ≥ a/c. Hence, the outer bound derived in Proposition 5.2 is applicable. Figure 10 plots the capacity region bounds when a = 5, b = 1, c = 3, d = 1, λ_1 = λ_2 = 2, A_1 = 1, A_2 = 1. It is easy to check that for these parameters a/c > max{b/d, λ_1/λ_2}; hence the outer bound derived in Proposition 5.4 is applicable. Figure 11 plots the capacity region bounds with a = 0.5, b = 0.1,
c = 3, d = 1, λ_1 = λ_2 = 1, A_1 = 1, A_2 = 1. One can check that in this case both (4) and (5) are partially satisfied. Hence, in generating the figure, we use the outer bound derived in Proposition 5.5. In all examples, we can see that the gaps between the inner bound and outer bound are small.

Fig. 11. Capacity region bounds when a = 0.5, b = 0.1, c = 3, d = 1, λ_1 = λ_2 = 1, A_1 = 1, A_2 = 1.

Figure 12 illustrates the sum capacity of the Poisson Z-interference channel in the weak interference regime. In the simulation, we set a = 2.1, b = 0, d = 1, λ_1 = λ_2 = 1, A_1 = 1, A_2 = 1. Therefore, as long as c < min{a, aλ_2/λ_1} = 2.1, the channel is in the weak interference regime. From the figure, we can observe that the sum rate decreases as the cross-link coefficient increases.

Figure 13 illustrates the relationship between a and the value c* defined in Theorem 6.2, i.e., the minimum value of c so that the channel is in the strong interference regime. In the simulation, we set b = 0, d = 1, λ_1 = λ_2 = 1, A_1 = 1, A_2 = 1, and let the value of a change. One can observe that as a increases, the value of c* increases. From the figure, we know that when

a = 2.1, the value of c* is 2.8. Hence, as long as c ≥ 2.8, we obtain the sum capacity. From the discussion of Figure 12, we know that if c < 2.1, we also obtain the sum capacity. Hence, for the set of parameters used in the simulation, we characterize the sum capacity for the full range of c except for the small interval 2.1 ≤ c < 2.8.

Figure 14 illustrates the relationship between C_sum and the value of c in the strong interference regime. In the simulation, we set a = 2.1, b = 0, d = 1, λ_1 = λ_2 = 1, A_1 = 1, A_2 = 1. From the figure, we observe that as c increases, the sum capacity decreases and converges to a constant. The convergence behavior can be explained from (27). As c increases, φ(q, dA_2, cA_1 + λ_2) → 0. Hence, (27) converges to max_p {φ(p, aA_1, λ_1) + (1−p)C_2}, in which C_2 is the capacity of the channel X_2 → Y_2 when there is no interference. Clearly, this value is less than the sum of the capacities of the channel X_1 → Y_1 and the interference-free channel X_2 → Y_2.

Fig. 12. C_sum of the Poisson Z-interference channel vs. c in the weak interference regime.

Fig. 13. c* defined in Theorem 6.2 vs. a.

Fig. 14. C_sum of the Poisson Z-interference channel vs. c in the strong interference regime.

VIII. CONCLUSIONS

We have studied the Poisson interference channel, a suitable model for multiuser optical communications in the shot-noise-limited regime. We have derived conditions for the strong interference regime, under which the capacity region remains the same even if we require both decoders to decode both messages. We have also derived several inner and outer bounds on the capacity region. We have further characterized the sum rate capacity of the Poisson Z-interference channel when the cross-link coefficient is either sufficiently small or sufficiently large.

For future work, it will be interesting to obtain finite-gap results similar to those for the Gaussian interference channel [8]. It is also of interest to investigate band-limited channels, which have been studied in the point-to-point setup in [7], in this multiuser setup. Another direction to pursue is to exploit the relationship between mutual information and conditional mean estimation in Poisson channels established in [9] and [28]. The relationship between mutual information and conditional mean estimation for the Gaussian channel [26]–[28] has provided important insights on the Gaussian Z-interference channel [27]. Although Poisson channels and Gaussian channels are different in terms of neutralizing interference, the unifying framework proposed in [29] will be useful for this line of investigation.
It is also of interest to extend the results to different optical communication models, such as those accounting for optical amplifiers, and, in general, to identify regimes of similar asymptotic behavior of different models, such as the Gaussian and Poisson channels.

APPENDIX A
PROOF OF THEOREM 4.1

The proof follows that of [4, Th. 2.] with adaptations. In particular, the proof first follows the construction steps in [4] until (39). However, to set up the notation for the entire proof and for completeness, we repeat them here with slightly different notation. Let {X_{im_i}(·)}_{m_i=1}^{M_i} be the code in the hypothesis of the theorem, and let D_i : S_i(T) → {1, ..., M_i} be the corresponding decoding mapping. Here S_i(T) is the space of step functions Y_i(0, T] observed by receiver i. Let ε > 0 be given. For 1 ≤ m_i ≤ M_i, let B_{im_i} be the set of sample paths mapped by D_i to m_i, i.e., B_{im_i} = {Y_i(0, T] : D_i(Y_i) = m_i}. For any event B_i in S_i(T) and 1 ≤ m_i ≤ M_i, we write Pr{B_i | X_{im_i}(·)} = P_{m_i}(B_i).

Hence,

μ_i = (1/M_i) Σ_{m_i=1}^{M_i} Pr{D_i(Y_i) ≠ m_i | X_{im_i}(·)} = 1 − (1/M_i) Σ_{m_i=1}^{M_i} P_{m_i}(B_{im_i}).

Let A_i(T) be the σ-field of events in S_i(T) generated by Â_i(T), the class of events defined by finite sets of samples of Y_i(0, T], i.e., (Y_i(t_1), ..., Y_i(t_K)), where t_k ∈ [0, T] and 1 ≤ k ≤ K < ∞. From the argument based on measure theory [3, p. 56], we have that for any B_i ∈ A_i(T) and arbitrary δ > 0, there exists a B̃_i ∈ Â_i(T) such that P_{m_i}(B_i △ B̃_i) ≤ δ, in which the symmetric difference is defined as B_i △ B̃_i = (B_i \ B̃_i) ∪ (B̃_i \ B_i). It then follows that for a set of events {B_{im_i}} and arbitrary ε > 0, there exists a corresponding set of events {B̃_{im_i}}, defined by a suitably rich set of samples, such that for all m_i, m_i' ∈ [1, M_i],

P_{m_i'}(B_{im_i} △ B̃_{im_i}) ≤ ε/(2M_i).   (37)

Corresponding to the events {B̃_{im_i}}, we define the disjoint sets {B̂_{im_i}} by

B̂_{i1} = B̃_{i1},  B̂_{im_i} = B̃_{im_i} \ ∪_{m_i' < m_i} B̃_{im_i'},  2 ≤ m_i ≤ M_i.

The disjoint events {B̂_{im_i}} define the decoder D̂_i: D̂_i(Y_i(0, T]) = m_i when Y_i(0, T] ∈ B̂_{im_i}. Furthermore, a finite set {t_k} exists, with 0 ≤ t_1 ≤ ... ≤ t_K ≤ T, such that the B̂_{im_i} are determined by the samples {Y_i(t_k)}_{k=1}^K. We then have

μ̂_i = 1 − (1/M_i) Σ_{m_i=1}^{M_i} P_{m_i}(B̂_{im_i})
    ≤ 1 − (1/M_i) Σ_{m_i=1}^{M_i} P_{m_i}(B_{im_i} ∩ B̂_{im_i})
    = 1 − (1/M_i) Σ_{m_i=1}^{M_i} {P_{m_i}(B_{im_i}) − P_{m_i}(B_{im_i} ∩ B̂_{im_i}^c)}
    = μ_i + (1/M_i) Σ_{m_i=1}^{M_i} P_{m_i}(B_{im_i} ∩ B̂_{im_i}^c).

It is easy to check that

B̂_{im_i} = B̃_{im_i} \ ∪_{m_i' < m_i} B̃_{im_i'} = B̃_{im_i} ∩ ( ∩_{m_i' < m_i} B̃_{im_i'}^c ).   (38)

Thus, using (37), (38), and following [4], we have P_{m_i}(B_{im_i} ∩ B̂_{im_i}^c) ≤ ε.

We now construct a new decoder D̄_i. Let N = KL, where L is a large integer, and let Δ = T/N. The domain of D̂_i is {Y_i(t_k)}_{k=1}^K. For k = 1, ..., K, let n_k be the integer which satisfies (n_k − 1)Δ < t_k ≤ n_kΔ. The domain of D̄_i is {Y_i(nΔ)}_{n=1}^N. Let

D̄_i({Y_i(nΔ)}_{n=1}^N) = D̂_i({Y_i(n_kΔ)}_{k=1}^K),   (39)

in which Y_{inΔ} is defined in (9).
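The grid samples {Y_i(nΔ)} are counts of a point process; for a constant rate the process is ordinary Poisson, and a quick simulation sketch (our variable names, generating arrivals from exponential inter-arrival times) confirms that the count over a window has mean and variance equal to the integrated rate:

```python
import random

random.seed(1)

def poisson_count(rate, T):
    """Count arrivals in (0, T] of a homogeneous Poisson process with the given
    rate, generated from i.i.d. exponential inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > T:
            return n
        n += 1

lam1, T = 2.0, 1.0
samples = [poisson_count(lam1, T) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples) - mean * mean
assert abs(mean - lam1 * T) < 0.1   # E[N(T)] equals the integrated rate lam1*T
assert abs(var - lam1 * T) < 0.2    # Var[N(T)] equals lam1*T as well
```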
For decoder 1 and any m_1 ≤ M_1,

P_{m_1}(D̄_1(Y_1(0,T]) ≠ D̂_1(Y_1(0,T]))
  ≤ P_{m_1}(∪_{k=1}^K {Y_1(t_k) ≠ Y_1(n_kΔ)}) + P_{m_1}(∪_{k=1}^K {Y_1(n_kΔ) − Y_1((n_k−1)Δ) ≥ 2})
  ≤ Σ_{k=1}^K [P_{m_1}(Y_1(t_k) ≠ Y_1(n_kΔ)) + P_{m_1}(Y_1(n_kΔ) − Y_1((n_k−1)Δ) ≥ 2)]
  ≤ Σ_{k=1}^K [1 − exp{−∫_{t_k}^{n_kΔ} (aX_1(t) + bX_2(t) + λ_1) dt}]
    + Σ_{k=1}^K [1 − exp{−m_{n_k}} − m_{n_k} exp{−m_{n_k}}],   (40)

in which

m_{n_k} = ∫_{(n_k−1)Δ}^{n_kΔ} (aX_1(t) + bX_2(t) + λ_1) dt ≤ (aA_1 + bA_2 + λ_1)Δ.   (41)

Continuing from (40), we have

P_{m_1}(D̄_1(Y_1(0,T]) ≠ D̂_1(Y_1(0,T])) ≤ Σ_{k=1}^K [(aA_1 + bA_2 + λ_1)Δ + (aA_1 + bA_2 + λ_1)²Δ² + O(Δ²)]
  = [(aA_1 + bA_2 + λ_1) + (aA_1 + bA_2 + λ_1)²Δ + O(Δ)] T/L,

which can be made arbitrarily small as L → ∞. The same holds for receiver 2. As a result, part a) of the theorem is proved.

Now assume the decoders satisfy part a), which implies that decoder D_i(Y_i(0,T]) depends only on {Y_{inΔ}}, 1 ≤ n ≤ N. We note that the probability law defining {Y_{inΔ}}, 1 ≤ n ≤ N, when X_{im_i}(t), i = 1, 2, are transmitted, is unchanged if the X_{im_i}(t) are replaced by X̄_{im_i}(t) such that

∫_{(n−1)Δ}^{nΔ} X̄_{im_i}(t) dt = ∫_{(n−1)Δ}^{nΔ} X_{im_i}(t) dt, for 1 ≤ n ≤ N.   (42)
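The O(Δ²) bookkeeping in (40) rests on the elementary fact that a Poisson count with mean m has P{N ≥ 2} = 1 − e^{−m} − m e^{−m} ≤ m²/2; a one-line numeric check:

```python
import math

# P{N >= 2} for a Poisson variable with mean m; this is the term appearing in (40),
# and it is O(m^2): 1 - e^{-m}(1 + m) <= m^2 / 2 for m >= 0.
for m in (0.1, 0.01, 0.001, 0.0001):
    p_ge2 = 1.0 - math.exp(-m) - m * math.exp(-m)
    assert 0.0 <= p_ge2 <= m * m / 2
```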

For 1 ≤ m_i ≤ M_i, 1 ≤ n ≤ N, i = 1, 2, we define

t_{m_i n} = (n − 1)Δ + (1/A_i) ∫_{(n−1)Δ}^{nΔ} X_{im_i}(t) dt,

and for 1 ≤ m_i ≤ M_i, let

X̄_{im_i}(t) = { A_i, (n − 1)Δ ≤ t < t_{m_i n};  0, t_{m_i n} ≤ t < nΔ. }

Since the code X̄_{im_i}(t) satisfies condition (42), using the same decoder, the error probabilities at the receivers remain the same. However, the X̄_{im_i}(t) do not make transitions on an evenly spaced grid. Let N' = NL', where L' is a large integer, and let Δ' = T/N'. For 1 ≤ m_i ≤ M_i, 1 ≤ n ≤ N', if X̄_{im_i}(t) is constant on ((n − 1)Δ', nΔ'], then X̂_{im_i}(t) = X̄_{im_i}(t) for t in that interval; otherwise, X̂_{im_i}(t) = 0 for t ∈ ((n − 1)Δ', nΔ']. Note that X̂_{im_i} differs from X̄_{im_i} on at most N intervals. Hence, aX̂_{1m_1} + bX̂_{2m_2} differs from aX̄_{1m_1} + bX̄_{2m_2} on at most 2N intervals. Thus, for any m_1 and m_2,

‖aX̂_{1m_1} + bX̂_{2m_2} − (aX̄_{1m_1} + bX̄_{2m_2})‖ ≤ 2N(aA_1 + bA_2)Δ' = 2(aA_1 + bA_2)T/L'.   (43)

Similarly, we have ‖cX̂_{1m_1} + dX̂_{2m_2} − (cX̄_{1m_1} + dX̄_{2m_2})‖ ≤ 2(cA_1 + dA_2)T/L'. Let μ̂_i be the error probability corresponding to the code {X̂_{im_i}}. Then

|μ̄_1 − μ̂_1| ≤ (1/(M_1M_2)) Σ_{m_1=1}^{M_1} Σ_{m_2=1}^{M_2} |P(B_{1m_1} | X̄_{1m_1}, X̄_{2m_2}) − P(B_{1m_1} | X̂_{1m_1}, X̂_{2m_2})|.

Following from [4, Lemma 2.2], for any ε > 0, we have

|P(B_{1m_1} | X̄_{1m_1}, X̄_{2m_2}) − P(B_{1m_1} | X̂_{1m_1}, X̂_{2m_2})| ≤ ε

for sufficiently large L', as long as (43) is satisfied. Thus, part b) of the theorem is established.

APPENDIX B
PROOF OF LEMMA 5.3

The capacity region of the MAC (X_1, X_2) → (Y_1, Y_1') is given by []:

R_1 ≤ (1/T) I(X_1(0, T]; Y_1(0, T], Y_1'(0, T] | X_2(0, T]),
R_2 ≤ (1/T) I(X_2(0, T]; Y_1(0, T], Y_1'(0, T] | X_1(0, T]),
R_1 + R_2 ≤ (1/T) I(X_1(0, T], X_2(0, T]; Y_1(0, T], Y_1'(0, T]).

The key part of the proof is to compute the above mutual information terms. After that, we can follow the steps in [] to get the desired results. In this proof, we compute I(X_1(0, T], X_2(0, T]; Y_1(0, T], Y_1'(0, T]) in detail; the other two terms can be computed similarly.

Let N(t) be the number of points observed from the doubly stochastic Poisson point process Y_1 before time t, and let T(t) = {T_1, ..., T_{N(t)}} be the corresponding ordered arrival times. Obviously, (N(T), T(T)) is a complete description of Y_1(0, T].
Similarly, let N'(t) be the number of points observed from the doubly stochastic Poisson point process Y_1' before time t, and let T'(t) = {T_1', ..., T_{N'(t)}'} be the corresponding ordered arrival times; (N'(T), T'(T)) is a complete description of Y_1'(0, T]. We also define Y_σ(t) = {N(t), T(t), N'(t), T'(t)}, which is a complete description of Y_1 and Y_1'.

We note that given x_1(0, T] and x_2(0, T], Y_1 and Y_1' are two independent Poisson point processes with rates

λ_{Y_1}(t) = ax_1(t) + bx_2(t) + λ_1 and λ_{Y_1'}(t) = (ad/c − b)x_2(t) + λ_2 − cλ_1/a,

respectively. Hence, given x_1(t) and x_2(t), the sample function density of Y_σ(T) is [2]

f_{Y_σ(T)|X_1,X_2}(y_σ | x_1, x_2) =
  f_1(λ_{Y_1}(t), λ_{Y_1'}(t)), n(T) = 0, n'(T) = 0;
  f_2(λ_{Y_1}(t), λ_{Y_1'}(t)), n(T) ≥ 1, n'(T) = 0;
  f_3(λ_{Y_1}(t), λ_{Y_1'}(t)), n(T) = 0, n'(T) ≥ 1;
  f_4(λ_{Y_1}(t), λ_{Y_1'}(t)), n(T) ≥ 1, n'(T) ≥ 1,

with

f_1(λ_{Y_1}(t), λ_{Y_1'}(t)) = exp(−∫_0^T λ_{Y_1}(t)dt) exp(−∫_0^T λ_{Y_1'}(t)dt),
f_2(λ_{Y_1}(t), λ_{Y_1'}(t)) = exp(−∫_0^T λ_{Y_1}(t)dt) [Π_{i=1}^{n(T)} λ_{Y_1}(t_i)] exp(−∫_0^T λ_{Y_1'}(t)dt),
f_3(λ_{Y_1}(t), λ_{Y_1'}(t)) = exp(−∫_0^T λ_{Y_1}(t)dt) exp(−∫_0^T λ_{Y_1'}(t)dt) Π_{j=1}^{n'(T)} λ_{Y_1'}(t_j'),
f_4(λ_{Y_1}(t), λ_{Y_1'}(t)) = exp(−∫_0^T λ_{Y_1}(t)dt) [Π_{i=1}^{n(T)} λ_{Y_1}(t_i)] exp(−∫_0^T λ_{Y_1'}(t)dt) Π_{j=1}^{n'(T)} λ_{Y_1'}(t_j').

Now define

λ̂_{Y_1}(t) = λ_1 + E{aX_1(t) + bX_2(t) | Y_σ(τ), τ ≤ t},
λ̂_{Y_1'}(t) = λ_2 − cλ_1/a + E{(ad/c − b)X_2(t) | Y_σ(τ), τ ≤ t}.

Then, following from [2, Ch. 6], Y_1 and Y_1' are self-exciting counting processes with intensity processes λ̂_{Y_1}(t), 0 ≤ t ≤ T, and λ̂_{Y_1'}(t), 0 ≤ t ≤ T. Moreover, the sample density function of Y_σ(T) is given by


On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels

Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels 2658 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 49, NO 10, OCTOBER 2003 Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels Sriram Vishwanath, Student Member, IEEE, Nihar

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

On Network Interference Management

On Network Interference Management On Network Interference Management Aleksandar Jovičić, Hua Wang and Pramod Viswanath March 3, 2008 Abstract We study two building-block models of interference-limited wireless networks, motivated by the

More information

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

An Outer Bound for the Gaussian. Interference channel with a relay.

An Outer Bound for the Gaussian. Interference channel with a relay. An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il

More information

Amobile satellite communication system, like Motorola s

Amobile satellite communication system, like Motorola s I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired

More information

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9 Katalin Marton Abbas El Gamal Stanford University Withits 2010 A. El Gamal (Stanford University) Katalin Marton Withits 2010 1 / 9 Brief Bio Born in 1941, Budapest Hungary PhD from Eötvös Loránd University

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International

More information

A Comparison of Superposition Coding Schemes

A Comparison of Superposition Coding Schemes A Comparison of Superposition Coding Schemes Lele Wang, Eren Şaşoğlu, Bernd Bandemer, and Young-Han Kim Department of Electrical and Computer Engineering University of California, San Diego La Jolla, CA

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

II. THE TWO-WAY TWO-RELAY CHANNEL

II. THE TWO-WAY TWO-RELAY CHANNEL An Achievable Rate Region for the Two-Way Two-Relay Channel Jonathan Ponniah Liang-Liang Xie Department of Electrical Computer Engineering, University of Waterloo, Canada Abstract We propose an achievable

More information

Intermittent Communication

Intermittent Communication Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication

More information

Optimal Power Allocation for Parallel Gaussian Broadcast Channels with Independent and Common Information

Optimal Power Allocation for Parallel Gaussian Broadcast Channels with Independent and Common Information SUBMIED O IEEE INERNAIONAL SYMPOSIUM ON INFORMAION HEORY, DE. 23 1 Optimal Power Allocation for Parallel Gaussian Broadcast hannels with Independent and ommon Information Nihar Jindal and Andrea Goldsmith

More information

Joint Source-Channel Coding for the Multiple-Access Relay Channel

Joint Source-Channel Coding for the Multiple-Access Relay Channel Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information

The Poisson Optical Communication Channels: Capacity and Optimal Power Allocation

The Poisson Optical Communication Channels: Capacity and Optimal Power Allocation The oisson Optical Communication : Capacity and Optimal ower Allocation Samah A M Ghanem, Member, IAENG, Munnujahan Ara, Member, IAENG Abstract In this paper, the channel capacity for different models

More information

Lecture 15: Conditional and Joint Typicaility

Lecture 15: Conditional and Joint Typicaility EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Bounds and Capacity Results for the Cognitive Z-interference Channel

Bounds and Capacity Results for the Cognitive Z-interference Channel Bounds and Capacity Results for the Cognitive Z-interference Channel Nan Liu nanliu@stanford.edu Ivana Marić ivanam@wsl.stanford.edu Andrea J. Goldsmith andrea@wsl.stanford.edu Shlomo Shamai (Shitz) Technion

More information

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu

More information

Capacity Bounds for the Gaussian IM-DD Optical Multiple-Access Channel

Capacity Bounds for the Gaussian IM-DD Optical Multiple-Access Channel This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: OI.9/TWC.7.687, IEEE

More information

List of Figures. Acknowledgements. Abstract 1. 1 Introduction 2. 2 Preliminaries Superposition Coding Block Markov Encoding...

List of Figures. Acknowledgements. Abstract 1. 1 Introduction 2. 2 Preliminaries Superposition Coding Block Markov Encoding... Contents Contents i List of Figures iv Acknowledgements vi Abstract 1 1 Introduction 2 2 Preliminaries 7 2.1 Superposition Coding........................... 7 2.2 Block Markov Encoding.........................

More information

Information Theory - Entropy. Figure 3

Information Theory - Entropy. Figure 3 Concept of Information Information Theory - Entropy Figure 3 A typical binary coded digital communication system is shown in Figure 3. What is involved in the transmission of information? - The system

More information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student

More information

Optimum Power Allocation in Fading MIMO Multiple Access Channels with Partial CSI at the Transmitters

Optimum Power Allocation in Fading MIMO Multiple Access Channels with Partial CSI at the Transmitters Optimum Power Allocation in Fading MIMO Multiple Access Channels with Partial CSI at the Transmitters Alkan Soysal Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland,

More information

Modulation & Coding for the Gaussian Channel

Modulation & Coding for the Gaussian Channel Modulation & Coding for the Gaussian Channel Trivandrum School on Communication, Coding & Networking January 27 30, 2017 Lakshmi Prasad Natarajan Dept. of Electrical Engineering Indian Institute of Technology

More information

Capacity Region of the Permutation Channel

Capacity Region of the Permutation Channel Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes

More information

The Gallager Converse

The Gallager Converse The Gallager Converse Abbas El Gamal Director, Information Systems Laboratory Department of Electrical Engineering Stanford University Gallager s 75th Birthday 1 Information Theoretic Limits Establishing

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

The Fading Number of a Multiple-Access Rician Fading Channel

The Fading Number of a Multiple-Access Rician Fading Channel The Fading Number of a Multiple-Access Rician Fading Channel Intermediate Report of NSC Project Capacity Analysis of Various Multiple-Antenna Multiple-Users Communication Channels with Joint Estimation

More information

A Simple Memoryless Proof of the Capacity of the Exponential Server Timing Channel

A Simple Memoryless Proof of the Capacity of the Exponential Server Timing Channel A Simple Memoryless Proof of the Capacity of the Exponential Server iming Channel odd P. Coleman ECE Department Coordinated Science Laboratory University of Illinois colemant@illinois.edu Abstract his

More information

The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component

The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component 1 The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component Nan Liu, Ivana Marić, Andrea J. Goldsmith, Shlomo Shamai (Shitz) arxiv:0812.0617v1 [cs.it] 2 Dec 2008 Dept. of

More information

Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels

Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels Bike Xie, Student Member, IEEE and Richard D. Wesel, Senior Member, IEEE Abstract arxiv:0811.4162v4 [cs.it] 8 May 2009

More information

UCSD ECE 255C Handout #12 Prof. Young-Han Kim Tuesday, February 28, Solutions to Take-Home Midterm (Prepared by Pinar Sen)

UCSD ECE 255C Handout #12 Prof. Young-Han Kim Tuesday, February 28, Solutions to Take-Home Midterm (Prepared by Pinar Sen) UCSD ECE 255C Handout #12 Prof. Young-Han Kim Tuesday, February 28, 2017 Solutions to Take-Home Midterm (Prepared by Pinar Sen) 1. (30 points) Erasure broadcast channel. Let p(y 1,y 2 x) be a discrete

More information

On Capacity of. the Writing onto Fast Fading Dirt Channel

On Capacity of. the Writing onto Fast Fading Dirt Channel On Capacity of the Writing onto Fast Fading Dirt Channel Stefano Rini and Shlomo Shamai (Shitz) National Chiao-Tung University, Hsinchu, Taiwan E-mail: stefano@nctu.edu.tw arxiv:606.06039v [cs.it] 0 Jun

More information

Interference Channels with Source Cooperation

Interference Channels with Source Cooperation Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL

More information

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

Source-Channel Coding Theorems for the Multiple-Access Relay Channel Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access

More information

Capacity bounds for multiple access-cognitive interference channel

Capacity bounds for multiple access-cognitive interference channel Mirmohseni et al. EURASIP Journal on Wireless Communications and Networking, :5 http://jwcn.eurasipjournals.com/content///5 RESEARCH Open Access Capacity bounds for multiple access-cognitive interference

More information

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 11, NOVEMBER On the Performance of Sparse Recovery

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 11, NOVEMBER On the Performance of Sparse Recovery IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 11, NOVEMBER 2011 7255 On the Performance of Sparse Recovery Via `p-minimization (0 p 1) Meng Wang, Student Member, IEEE, Weiyu Xu, and Ao Tang, Senior

More information

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels Representation of Correlated s into Graphs for Transmission over Broadcast s Suhan Choi Department of Electrical Eng. and Computer Science University of Michigan, Ann Arbor, MI 80, USA Email: suhanc@eecs.umich.edu

More information

Fading Wiretap Channel with No CSI Anywhere

Fading Wiretap Channel with No CSI Anywhere Fading Wiretap Channel with No CSI Anywhere Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 7 pritamm@umd.edu ulukus@umd.edu Abstract

More information

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources Wei Kang Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College

More information

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs On Gaussian nterference Channels with Mixed Gaussian and Discrete nputs Alex Dytso Natasha Devroye and Daniela Tuninetti University of llinois at Chicago Chicago L 60607 USA Email: odytso devroye danielat

More information

Lecture 4: Proof of Shannon s theorem and an explicit code

Lecture 4: Proof of Shannon s theorem and an explicit code CSE 533: Error-Correcting Codes (Autumn 006 Lecture 4: Proof of Shannon s theorem and an explicit code October 11, 006 Lecturer: Venkatesan Guruswami Scribe: Atri Rudra 1 Overview Last lecture we stated

More information

K User Interference Channel with Backhaul

K User Interference Channel with Backhaul 1 K User Interference Channel with Backhaul Cooperation: DoF vs. Backhaul Load Trade Off Borna Kananian,, Mohammad A. Maddah-Ali,, Babak H. Khalaj, Department of Electrical Engineering, Sharif University

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

INFORMATION PROCESSING ABILITY OF BINARY DETECTORS AND BLOCK DECODERS. Michael A. Lexa and Don H. Johnson

INFORMATION PROCESSING ABILITY OF BINARY DETECTORS AND BLOCK DECODERS. Michael A. Lexa and Don H. Johnson INFORMATION PROCESSING ABILITY OF BINARY DETECTORS AND BLOCK DECODERS Michael A. Lexa and Don H. Johnson Rice University Department of Electrical and Computer Engineering Houston, TX 775-892 amlexa@rice.edu,

More information

Characterization of Convex and Concave Resource Allocation Problems in Interference Coupled Wireless Systems

Characterization of Convex and Concave Resource Allocation Problems in Interference Coupled Wireless Systems 2382 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL 59, NO 5, MAY 2011 Characterization of Convex and Concave Resource Allocation Problems in Interference Coupled Wireless Systems Holger Boche, Fellow, IEEE,

More information

Cognitive Multiple Access Networks

Cognitive Multiple Access Networks Cognitive Multiple Access Networks Natasha Devroye Email: ndevroye@deas.harvard.edu Patrick Mitran Email: mitran@deas.harvard.edu Vahid Tarokh Email: vahid@deas.harvard.edu Abstract A cognitive radio can

More information

Source and Channel Coding for Correlated Sources Over Multiuser Channels

Source and Channel Coding for Correlated Sources Over Multiuser Channels Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which

More information

Dispersion of the Gilbert-Elliott Channel

Dispersion of the Gilbert-Elliott Channel Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays

More information

On the Throughput, Capacity and Stability Regions of Random Multiple Access over Standard Multi-Packet Reception Channels

On the Throughput, Capacity and Stability Regions of Random Multiple Access over Standard Multi-Packet Reception Channels On the Throughput, Capacity and Stability Regions of Random Multiple Access over Standard Multi-Packet Reception Channels Jie Luo, Anthony Ephremides ECE Dept. Univ. of Maryland College Park, MD 20742

More information