On Multiple User Channels with State Information at the Transmitters

Styrmir Sigurjónsson and Young-Han Kim*
Information Systems Laboratory, Stanford University, Stanford, CA 94305, USA

Abstract—We extend Shannon's result on the capacity of channels with state information to multiple user channels. More specifically, we characterize the capacity (region) of degraded broadcast channels and physically degraded relay channels where the channel state information is causally available at the transmitters. We also obtain inner and outer bounds on the capacity region for multiple access channels with causal state information at the transmitters.

I. INTRODUCTION

In his 1958 paper [1], Shannon considered a communication system where additional side information about the channel is causally available at the transmitter; see Figure 1.

Fig. 1. Communication channel with state information causally known at the transmitter: W -> Encoder -> X_i(W, S^i) -> p(y|x, s) -> Y_i -> Decoder -> \hat{W}, with states S_i drawn i.i.d. according to p(s).

He showed that the capacity of this channel can be achieved by adding a physical device in front of the channel, which depends only on the current state (and the message to be sent). In modern language, we can write the capacity of this channel as

    C = \max_{p(u) p(x|u,s)} I(U; Y) = \max_{p(u),\; x = f(u,s)} I(U; Y),    (1)

where U is an auxiliary random variable with finite cardinality, independent of S. The achievability of the rate in (1) is clear from Shannon's original argument of attaching a physical device in front of the channel. The optimality of this rate is deceptively easy to show. Indeed, by recognizing that U_i := (W, Y^{i-1}, S^{i-1}) is independent of S_i and by observing that X_i can be written as a function of (U_i, S_i), we have

    I(W; Y^n) = H(Y^n) - H(Y^n | W)
              = \sum_{i=1}^n [ H(Y_i | Y^{i-1}) - H(Y_i | W, Y^{i-1}) ]
              \le \sum_{i=1}^n [ H(Y_i) - H(Y_i | W, Y^{i-1}, S^{i-1}) ]
              = \sum_{i=1}^n I(U_i; Y_i)
              \le nC.    (2)

*This work was partly supported by NSF Grant CCR.
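Shannon's reduction can be made concrete numerically: each value of U indexes a strategy f : S -> X, which turns the causal-state channel into an ordinary DMC p(y|u), and the maximization in (1) becomes a standard capacity computation. The sketch below (our illustration, not from the paper; all function names are ours) builds this strategy channel and evaluates its capacity with the Blahut–Arimoto algorithm, for the toy channel Y = X xor S with S ~ Bern(1/2), where the two state-cancelling strategies make the channel noiseless, so C = 1 bit.

```python
import itertools
import numpy as np

def strategy_channel(p_s, p_y_xs):
    """Collapse p(y|x,s) with causal state at the transmitter into a DMC
    p(y|u), where each u indexes a deterministic Shannon strategy f: S -> X."""
    num_x, num_s, num_y = p_y_xs.shape
    maps = list(itertools.product(range(num_x), repeat=num_s))
    p_y_u = np.zeros((len(maps), num_y))
    for u, f in enumerate(maps):
        for s in range(num_s):
            p_y_u[u] += p_s[s] * p_y_xs[f[s], s]
    return p_y_u

def capacity(p_y_u, iters=200):
    """max_{p(u)} I(U;Y) in bits for the DMC p(y|u), via Blahut-Arimoto."""
    num_u, _ = p_y_u.shape
    p_u = np.full(num_u, 1.0 / num_u)
    for _ in range(iters):
        q_y = p_u @ p_y_u                     # current output distribution
        d = np.zeros(num_u)                   # D(p(y|u) || q(y)) per input
        for u in range(num_u):
            mask = p_y_u[u] > 0
            d[u] = np.sum(p_y_u[u, mask] * np.log2(p_y_u[u, mask] / q_y[mask]))
        w = p_u * np.exp2(d)                  # multiplicative BA update
        p_u = w / w.sum()
    q_y = p_u @ p_y_u
    total = 0.0
    for u in range(num_u):                    # I(U;Y) at the final p(u)
        mask = p_y_u[u] > 0
        total += p_u[u] * np.sum(p_y_u[u, mask] * np.log2(p_y_u[u, mask] / q_y[mask]))
    return total

# Toy example: binary X, S, Y with Y = X xor S and S ~ Bern(1/2).
p_s = np.array([0.5, 0.5])
p_y_xs = np.zeros((2, 2, 2))
for x in range(2):
    for s in range(2):
        p_y_xs[x, s, x ^ s] = 1.0
C = capacity(strategy_channel(p_s, p_y_xs))
print(round(C, 4))  # the strategies f(s)=s and f(s)=1-s make the channel noiseless
```

Without the state at the transmitter this channel would be useless (Y is pure noise), so the example also illustrates the value of causal state knowledge.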
It is also worth pointing out the similarity of (1) to the capacity C* when the state information is noncausally available at the transmitter, as shown by Gel'fand and Pinsker [2] and Heegard and El Gamal [3]:

    C* = \max_{p(u|s),\; x = f(u,s)} [ I(U; Y) - I(U; S) ].

Since the capacity C for the causal case can be written as \max_{p(u),\, x = f(u,s)} [ I(U; Y) - I(U; S) ], with the second term vanishing, the loss from causality merely lies in the independence between U and S.

In this paper, we try to extend Shannon's result to multiple user channels. Research in this direction is not new. Most notably, Steinberg [4] obtained bounds on the capacity region of degraded broadcast channels when the state information is noncausally available at the transmitter, although the tightness of these bounds is still open. Causality of the state information makes the problem much easier. Recently, Steinberg [5] reported the capacity region of the degraded broadcast channel when the state information is causally available at the transmitter. In Section II, we give an independent treatment of his result by scaling the converse technique we just developed for the single-user case. In Section III, we run the same program for the physically degraded relay channel and obtain its capacity. Unfortunately, the optimality of the similar coding scheme for multiple access channels is yet to be established, and we present inner and outer bounds on the capacity region in Section IV.

II. DEGRADED BROADCAST CHANNELS

Definition 1: The discrete memoryless broadcast channel with state information consists of input alphabet X, state alphabet S, output alphabets Y_1 and Y_2, and a probability transition function p(y_1, y_2 | x, s), as in Figure 2.

ISIT 2005

We assume the state information S_i is available causally to the transmitter. We also assume that the channel is physically degraded, i.e., p(y_1, y_2 | x, s) = p(y_1 | x, s) p(y_2 | y_1).

Fig. 2. Broadcast channel with state causally available at the transmitter: (W_1, W_2) -> X_i(W_1, W_2, S^i) -> p(y_1, y_2 | x, s) -> (\hat{W}_1, \hat{W}_2).

Theorem 1: The capacity region of the degraded broadcast channel with state information available causally at the transmitter is the set of all rate pairs (R_1, R_2) satisfying

    R_1 \le I(U_1; Y_1 | U_2),
    R_2 \le I(U_2; Y_2)

for some joint distribution p(u_1, u_2) p(s), x = f(u_1, u_2, s), where the auxiliary random variables U_1 and U_2 have finite alphabets.

Proof: Although the proof of the achievability can be found in [5], we repeat it here for the sake of completeness. Fix R_1, R_2 and p(u_2) p(u_1 | u_2), x = f(u_1, u_2, s).

Codebook generation. Generate 2^{nR_2} independent codewords u_2^n(w_2) according to \prod_{i=1}^n p(u_{2i}). For each of these codewords, generate 2^{nR_1} independent codewords u_1^n(w_1, w_2) according to \prod_{i=1}^n p(u_{1i} | u_{2i}). Codebook assignments are fixed and revealed to the transmitter and the receivers.

Encoding. To send w_1 \in {1, 2, ..., 2^{nR_1}} to receiver 1 and w_2 \in {1, 2, ..., 2^{nR_2}} to receiver 2, we select the corresponding codewords u_2^n(w_2) and u_1^n(w_1, w_2). At time i, upon observing the channel state s_i, the transmitter sends x_i = f(u_{1i}, u_{2i}, s_i).

Decoding. Receiver 2 declares that a message \hat{w}_2 was sent if there is a unique \hat{w}_2 such that u_2^n(\hat{w}_2) and y_2^n are jointly typical; otherwise an error is declared. Receiver 1 declares that a message pair (\hat{w}_1, \hat{w}_2) was sent if there is a unique (\hat{w}_1, \hat{w}_2) such that u_1^n(\hat{w}_1, \hat{w}_2), u_2^n(\hat{w}_2), and y_1^n are jointly typical; otherwise an error is declared.

With the error events E_{2,i}, E_{1,i}, and E_{1,i,j} defined below, the error probability at receiver 2 satisfies

    P( E_{2,1}^c \cup \bigcup_{i \ne 1} E_{2,i} ) \le P(E_{2,1}^c) + \sum_{i \ne 1} P(E_{2,i}) \le \epsilon + 2^{nR_2} 2^{-n(I(U_2; Y_2) - \epsilon)}.

Hence P_{e,2} \to 0 if R_2 < I(U_2; Y_2) - \epsilon. Similarly, at receiver 1,

    P( E_{1,1,1}^c \cup \bigcup_{i \ne 1} E_{1,i} \cup \bigcup_{j \ne 1} E_{1,1,j} ) \le P(E_{1,1,1}^c) + \sum_{i \ne 1} P(E_{1,i}) + \sum_{j \ne 1} P(E_{1,1,j}).
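The codebook generation and encoding steps above can be sketched in code (our illustration; the parameters and the binary choice of f are hypothetical, with no claim of optimality): cloud centers u_2^n carry w_2, satellite codewords u_1^n carry w_1, and the encoder applies the Shannon strategy x_i = f(u_{1i}, u_{2i}, s_i) symbol by symbol.

```python
import numpy as np

rng = np.random.default_rng(0)

def superposition_codebook(n, R1, R2, p_u2, p_u1_given_u2):
    """Generate 2^{nR2} cloud centers u2^n(w2) i.i.d. ~ p(u2), and for each
    center 2^{nR1} satellite codewords u1^n(w1, w2) ~ prod p(u1|u2)."""
    M1, M2 = int(2 ** (n * R1)), int(2 ** (n * R2))
    u2 = rng.choice(len(p_u2), size=(M2, n), p=p_u2)
    u1 = np.empty((M1, M2, n), dtype=int)
    for w2 in range(M2):
        for i in range(n):
            u1[:, w2, i] = rng.choice(p_u1_given_u2.shape[0], size=M1,
                                      p=p_u1_given_u2[:, u2[w2, i]])
    return u1, u2

def encode(w1, w2, s_seq, u1, u2, f):
    """Causal encoder: at time i, send x_i = f(u1_i, u2_i, s_i)."""
    return np.array([f(u1[w1, w2, i], u2[w2, i], s)
                     for i, s in enumerate(s_seq)])

# Toy instance: all alphabets binary; x = u1 xor u2 xor s is a hypothetical map.
n, R1, R2 = 8, 0.25, 0.25
p_u2 = np.array([0.5, 0.5])
p_u1_given_u2 = np.array([[0.9, 0.1],     # p(u1 | u2); columns indexed by u2
                          [0.1, 0.9]])
u1, u2 = superposition_codebook(n, R1, R2, p_u2, p_u1_given_u2)
s_seq = rng.integers(0, 2, size=n)
x = encode(0, 1, s_seq, u1, u2, lambda a, b, s: a ^ b ^ s)
print(x.shape)  # (8,)
```

Block lengths this short are of course far from the typicality regime; the point is only the codebook structure and the per-symbol use of the state.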
Define the following events:

    E_{2,i} = {(U_2^n(i), Y_2^n) \in A_\epsilon^{(n)}},
    E_{1,i} = {(U_2^n(i), Y_1^n) \in A_\epsilon^{(n)}},
    E_{1,i,j} = {(U_2^n(i), U_1^n(j, i), Y_1^n) \in A_\epsilon^{(n)}}.

Without loss of generality, assume the sent messages were w_1 = 1 and w_2 = 1, and denote by P_{e,1} and P_{e,2} the probabilities that receivers 1 and 2 declare an error. By the asymptotic equipartition property (AEP), P(E_{1,1,1}^c) \to 0 and \sum_{i \ne 1} P(E_{1,i}) \le 2^{nR_2} 2^{-n(I(U_2; Y_1) - \epsilon)} \to 0 if R_2 < I(U_2; Y_1). But we already have R_2 < I(U_2; Y_2), and the degradedness implies I(U_2; Y_2) \le I(U_2; Y_1). Similarly, we can show that P(E_{1,1,j}) \le 2^{-n(I(U_1; Y_1 | U_2) - 3\epsilon)}. Thus \sum_{j \ne 1} P(E_{1,1,j}) \le 2^{nR_1} 2^{-n(I(U_1; Y_1 | U_2) - 3\epsilon)} \to 0 if R_1 < I(U_1; Y_1 | U_2) - 3\epsilon, which completes the proof of achievability.

Let us now prove the converse. We wish to show that given any sequence of ((2^{nR_1}, 2^{nR_2}), n) codes (X_i(W_1, W_2, S^i), \hat{W}_1(Y_1^n), \hat{W}_2(Y_2^n)) such that P_e^{(n)} := P(W_1 \ne \hat{W}_1 or W_2 \ne \hat{W}_2) \to 0 for uniform and independent message indices W_1 and W_2, the rate pair (R_1, R_2) must satisfy the conditions in the theorem. By Fano's inequality,

    H(W_1 | Y_1^n) \le nR_1 P_e^{(n)} + 1 =: n\epsilon_{1n},
    H(W_2 | Y_2^n) \le nR_2 P_e^{(n)} + 1 =: n\epsilon_{2n},

since P_e^{(n)} \ge \max{P_{e,1}, P_{e,2}}. Define the auxiliary random variables U_{1i} := (W_1, Y_1^{i-1}, S^{i-1}) and U_{2i} := (W_2, Y_1^{i-1}, S^{i-1}). We now have

    nR_2 = H(W_2) = I(W_2; Y_2^n) + H(W_2 | Y_2^n)
         \le I(W_2; Y_2^n) + n\epsilon_{2n}
         = \sum_{i=1}^n I(W_2; Y_{2i} | Y_2^{i-1}) + n\epsilon_{2n}
         \le \sum_{i=1}^n I(W_2, Y_2^{i-1}; Y_{2i}) + n\epsilon_{2n}
         \le \sum_{i=1}^n I(W_2, Y_1^{i-1}, S^{i-1}; Y_{2i}) + n\epsilon_{2n}
         = \sum_{i=1}^n I(U_{2i}; Y_{2i}) + n\epsilon_{2n}.

Similarly, nR_1 = H(W_1)

         \le I(W_1; Y_1^n) + H(W_1 | Y_1^n)
         \le I(W_1; Y_1^n) + n\epsilon_{1n}
         \le I(W_1; Y_1^n | W_2) + n\epsilon_{1n}
         = \sum_{i=1}^n I(W_1; Y_{1i} | W_2, Y_1^{i-1}) + n\epsilon_{1n}
         \le \sum_{i=1}^n I(W_1, Y_1^{i-1}, S^{i-1}; Y_{1i} | W_2, Y_1^{i-1}, S^{i-1}) + n\epsilon_{1n}
         = \sum_{i=1}^n I(U_{1i}; Y_{1i} | U_{2i}) + n\epsilon_{1n},

where the first arguments group into U_{1i} and the conditioning variables into U_{2i}. We define a random variable Q independent of everything else, uniformly distributed over {1, 2, ..., n}, and define U_1 = (Q, U_{1Q}), U_2 = (Q, U_{2Q}), X = X_Q, S = S_Q, Y_1 = Y_{1Q}, and Y_2 = Y_{2Q}. Then we have

    R_1 \le (1/n) \sum_{i=1}^n I(U_{1i}; Y_{1i} | U_{2i}) + \epsilon_{1n}
        = (1/n) \sum_{i=1}^n I(U_{1i}; Y_{1i} | U_{2i}, Q = i) + \epsilon_{1n}
        = I(U_{1Q}; Y_{1Q} | U_{2Q}, Q) + \epsilon_{1n}
        \le I(U_{1Q}, Q; Y_{1Q} | U_{2Q}, Q) + \epsilon_{1n}
        = I(U_1; Y_1 | U_2) + \epsilon_{1n},

and

    R_2 \le (1/n) \sum_{i=1}^n I(U_{2i}; Y_{2i}) + \epsilon_{2n}
        = I(U_{2Q}; Y_{2Q} | Q) + \epsilon_{2n}
        \le I(U_{2Q}, Q; Y_{2Q}) + \epsilon_{2n}
        = I(U_2; Y_2) + \epsilon_{2n}.

But it is easy to see that (U_1, U_2) is independent of S, that X is a deterministic function of (U_1, U_2, S), and that the joint distribution of (X, S, Y_1, Y_2) is consistent with the channel p(y_1, y_2 | x, s). Thus, we have established the converse of the theorem.

III. PHYSICALLY DEGRADED RELAY CHANNELS

Definition 2: The discrete memoryless relay channel with state information consists of input alphabet X, relay input alphabet X_1, state alphabet S, relay output alphabet Y_1, output alphabet Y, and a probability transition function p(y, y_1 | x, x_1, s). In the following, we will assume that the state variable S_i is causally available to the transmitter and the relay. We also assume that the channel is physically degraded, i.e., p(y, y_1 | x, x_1, s) = p(y_1 | x, x_1, s) p(y | y_1, x_1, s).

Fig. 3. Relay channel with state information causally available at the transmitter and the relay: W -> X_i(W, S^i) -> p(y, y_1 | x, x_1, s) -> \hat{W}, with the relay mapping (Y_1^{i-1}, S^i) to X_{1i}.

The main result of this section is the following:

Theorem 2: The capacity of the degraded relay channel with state information causally available at the transmitter and the relay is given by

    C = \max_{p(u, u_1),\; x = f(u,s),\; x_1 = f_1(u_1,s)} \min{ I(U, U_1; Y), I(U; Y_1 | U_1, S) }.

Proof: We first prove the achievability of the rate.
The approach follows Shannon's method of attaching physical devices [1] and the relay coding theorem by Cover and El Gamal [6], which transforms the original relay channel into one with auxiliary inputs U and U_1. More specifically, we use block Markov encoding. A sequence of B - 1 messages w_b \in W, each selected independently and uniformly over W, is to be sent over the channel in nB transmissions. Within each block of length n, the sender and the relay use a doubly indexed set of codewords

    C = {(u^n(w | t), u_1^n(t)) : w \in {1, ..., 2^{nR}}, t \in {1, ..., 2^{nR_0}}}.

Fix a probability distribution p(u, u_1), x = f(u, s), x_1 = f_1(u_1, s).

Codebook generation: Generate at random 2^{nR_0} independent n-sequences u_1^n(t), t \in {1, ..., 2^{nR_0}}, each drawn according to \prod_{i=1}^n p(u_{1i}). For each u_1^n(t) sequence, generate 2^{nR} conditionally independent u^n(w | t) sequences drawn according to \prod_{i=1}^n p(u_i | u_{1i}). This defines the random codebook C. For each message w \in {1, ..., 2^{nR}}, assign an index t(w) at random from {1, ..., 2^{nR_0}}. The set of messages with the same index forms a bin T_t \subseteq W. The codebook and bin assignments are revealed to all parties.

Encoding: Let w(b) \in {1, ..., 2^{nR}} be the new message to be sent in block b, and assume that w(b-1) \in T_{t(b)}. The encoder then selects u^n(w(b) | t(b)). At time i of block b, upon observing s_i(b), the sender transmits x_i(b) = f(u_i(w(b) | t(b)), s_i(b)). The relay will have an estimate \hat{w}(b-1) of the previous message w(b-1). Assume that \hat{w}(b-1) \in T_{\hat{t}(b)}. Then, upon observing s_i(b), the relay encoder sends x_{1i}(b) = f_1(u_{1i}(\hat{t}(b)), s_i(b)).

Decoding: We assume that at the end of block b-1, the receiver knows (w(1), ..., w(b-2)) and (t(1), ..., t(b-1)) and the relay knows (w(1), ..., w(b-1)) and consequently (t(1), ..., t(b)). The decoding procedures at the end of block b are as follows:

1) With (t(b), y_1^n(b), s^n(b)), the relay estimates the message of the transmitter as \hat{w}(b) if there exists a unique \hat{w}(b) such that (u^n(\hat{w}(b) | t(b)), u_1^n(t(b)), y_1^n(b), s^n(b)) are jointly typical.

It can be shown that \hat{w}(b) = w(b) with arbitrarily small probability of error if

    R < I(U; Y_1 | U_1, S)

and n is sufficiently large.

2) The receiver declares that \hat{t}(b) was sent if there exists exactly one \hat{t}(b) such that (u_1^n(\hat{t}(b)), y^n(b)) is jointly typical. It can be shown that \hat{t}(b) = t(b) with arbitrarily small probability of error if

    R_0 < I(U_1; Y)    (3)

and n is sufficiently large.

3) Assuming that t(b) is decoded successfully at the receiver, \hat{w}(b-1) is declared to be the message sent in block b-1 if there is a unique \hat{w}(b-1) \in T_{t(b)} whose codeword is jointly typical with y^n(b-1). It can be shown that if n is sufficiently large and

    R < I(U; Y | U_1) + R_0,    (4)

then \hat{w}(b-1) = w(b-1) with arbitrarily small probability of error. Combining (3) and (4) yields the condition R < I(U, U_1; Y).

Let us now prove the converse for any rate-R code with encoding functions X_i(W, S^i) and X_{1i}(S^i, Y_1^{i-1}). We define the auxiliary random variables U_i = (W, Y_1^{i-1}, S^{i-1}) and U_{1i} = (Y_1^{i-1}, S^{i-1}). It is easy to see that (U_i, U_{1i}) is independent of S_i and that X_i and X_{1i} are deterministic functions of (U_i, S_i) and (U_{1i}, S_i), respectively. We have

    I(W; Y^n) = \sum_{i=1}^n I(W; Y_i | Y^{i-1})
              = \sum_{i=1}^n [ H(Y_i | Y^{i-1}) - H(Y_i | Y^{i-1}, W) ]
              \le \sum_{i=1}^n [ H(Y_i) - H(Y_i | Y^{i-1}, Y_1^{i-1}, W, S^{i-1}) ]
              = \sum_{i=1}^n I(U_i, U_{1i}; Y_i)

and

    I(W; Y^n) \le I(W; Y^n, Y_1^n | S^n)
              = \sum_{i=1}^n [ H(Y_i, Y_{1i} | Y^{i-1}, Y_1^{i-1}, S^n) - H(Y_i, Y_{1i} | Y^{i-1}, Y_1^{i-1}, S^n, W) ]
              = \sum_{i=1}^n [ H(Y_i, Y_{1i} | Y^{i-1}, Y_1^{i-1}, S^n) - H(Y_i, Y_{1i} | Y^{i-1}, Y_1^{i-1}, S^n, W, X_i, X_{1i}) ]
              \le \sum_{i=1}^n [ H(Y_i, Y_{1i} | Y_1^{i-1}, S^{i-1}, S_i) - H(Y_i, Y_{1i} | X_i, X_{1i}, S_i) ]
              = \sum_{i=1}^n [ H(Y_i, Y_{1i} | U_{1i}, S_i) - H(Y_i, Y_{1i} | U_i, U_{1i}, S_i) ]
              = \sum_{i=1}^n I(U_i; Y_i, Y_{1i} | U_{1i}, S_i).

Fano's inequality and the use of the usual time-sharing random variable will then show

    R \le \min{ I(U, U_1; Y), I(U; Y, Y_1 | U_1, S) } + \epsilon_n.

Combining the Markov relationship U - (S, U_1, Y_1) - Y with the degradedness of the channel and the fact that X and X_1 are functions of (U, S) and (U_1, S), respectively, we can easily show that I(U; Y, Y_1 | U_1, S) = I(U; Y_1 | U_1, S).
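The index bookkeeping of the block Markov scheme in the achievability proof — random binning t(w), the sender superimposing the fresh message w(b) on the bin index t(b) of the previous one, the relay forwarding that bin index — can be sketched as follows (our illustration; the channel and typicality decoding are omitted, and we assume the relay decodes each w(b) correctly):

```python
import numpy as np

rng = np.random.default_rng(1)

n, R, R0 = 8, 0.5, 0.25
M, M0 = int(2 ** (n * R)), int(2 ** (n * R0))   # message and bin-index counts

# Random binning: each message w is assigned a bin index t(w) in {0,...,M0-1}.
t_of = rng.integers(0, M0, size=M)
bins = {t: np.flatnonzero(t_of == t) for t in range(M0)}

B = 5                                  # number of blocks
w = rng.integers(0, M, size=B)         # w[1], ..., w[B-1] carry fresh messages
w[0] = 0                               # dummy message in the first block

# Index flow: in block b the sender transmits u^n(w[b] | t[b]) with
# t[b] = t(w[b-1]); the relay, having decoded w[b-1], sends u1^n(t[b]).
sender_pairs, relay_idx = [], []
for b in range(B):
    t_b = 0 if b == 0 else int(t_of[w[b - 1]])
    sender_pairs.append((int(w[b]), t_b))
    relay_idx.append(t_b)              # relay forwards the same bin index

# Receiver check: after decoding t[b], it searches bin T_{t[b]} for w[b-1].
for b in range(1, B):
    assert w[b - 1] in bins[relay_idx[b]]
print(len(sender_pairs))  # 5
```

The resolution step succeeds with high probability when the bin is small enough, which is exactly the rate condition R < I(U; Y | U_1) + R_0 in (4).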
This completes the proof of the converse.

IV. MULTIPLE ACCESS CHANNELS

Definition 3: The discrete memoryless multiple access channel with state information consists of input alphabets X_1 and X_2, state alphabet S, output alphabet Y, and a probability transition function p(y | x_1, x_2, s); see Figure 4. Again we consider the case where the state variable S_i is causally available to both transmitters.

Fig. 4. Multiple access channel with state causally available to the transmitters: W_1 -> X_{1i}(W_1, S^i) and W_2 -> X_{2i}(W_2, S^i) -> p(y | x_1, x_2, s) -> (\hat{W}_1, \hat{W}_2).

Define the region R^{mac}_{p(u_1, u_2)} to be the convex hull of all rate pairs (R_1, R_2) satisfying

    R_1 \le I(U_1; Y | U_2),
    R_2 \le I(U_2; Y | U_1),
    R_1 + R_2 \le I(U_1, U_2; Y)

over all p(u_1, u_2), x_1 = f_1(u_1, s), x_2 = f_2(u_2, s). Similarly, define R^{mac}_{p(u_1) p(u_2)} to be the convex hull of all rate pairs (R_1, R_2) satisfying the same set of inequalities over all product distributions p(u_1) p(u_2), x_1 = f_1(u_1, s), and x_2 = f_2(u_2, s).

For the multiple access channel, the capacity theorem is yet to be established, mostly because of the coupling of the two auxiliary random variables in the proof of the converse. Here we give bounds on the capacity region instead.

Theorem 3: Let C^{mac} denote the capacity region of the multiple access channel with state causally available at both transmitters. Then

    R^{mac}_{p(u_1) p(u_2)} \subseteq C^{mac} \subseteq R^{mac}_{p(u_1, u_2)}.

Proof: Following the same arguments as before, it is not hard to show that C^{mac} \subseteq R^{mac}_{p(u_1, u_2)}; hence, we skip the details.
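For any fixed product distribution p(u_1)p(u_2) and strategy maps, the three mutual informations defining the inner bound can be computed by direct enumeration when the alphabets are small. The sketch below (our numeric illustration; the binary channel, strategies, and crossover probability are hypothetical choices) evaluates them for the channel whose clean output is x_1 xor x_2 xor s, passed through a BSC(0.1), with transmitter 1 pre-cancelling the state:

```python
import itertools
import numpy as np

def H(p):
    """Entropy in bits of an arbitrary-shape pmf."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical binary example: S ~ Bern(1/2), strategies x_k = f_k(u_k, s),
# channel output x1 xor x2 xor s, flipped with probability eps.
p_u1 = p_u2 = p_s = [0.5, 0.5]
eps = 0.1
f1 = lambda u, s: u ^ s            # transmitter 1 pre-cancels the state
f2 = lambda u, s: u                # transmitter 2 ignores the state

joint = np.zeros((2, 2, 2))        # joint pmf p(u1, u2, y), S marginalized out
for u1, u2, s, y in itertools.product(range(2), repeat=4):
    clean = f1(u1, s) ^ f2(u2, s) ^ s
    joint[u1, u2, y] += (p_u1[u1] * p_u2[u2] * p_s[s]
                         * ((1 - eps) if y == clean else eps))

# The three bounds of the inner bound, for this fixed p(u1)p(u2):
H_u1u2 = H(joint.sum(axis=2))
I_12 = H_u1u2 + H(joint.sum(axis=(0, 1))) - H(joint)                        # I(U1,U2;Y)
I_1 = H_u1u2 + H(joint.sum(axis=0)) - H(joint) - H(joint.sum(axis=(0, 2)))  # I(U1;Y|U2)
I_2 = H_u1u2 + H(joint.sum(axis=1)) - H(joint) - H(joint.sum(axis=(1, 2)))  # I(U2;Y|U1)
print(round(I_12, 3), round(I_1, 3), round(I_2, 3))
```

Here the strategies reduce the channel to Y = U_1 xor U_2 xor N with N ~ Bern(0.1), so all three bounds equal 1 - h(0.1), and the sum-rate constraint is the active one.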

For the lower bound, we first fix p(u_1) p(u_2), f_1(u_1, s), and f_2(u_2, s).

Codebook generation. Generate 2^{nR_1} independent codewords u_1^n(w_1), w_1 \in {1, 2, ..., 2^{nR_1}}, with each element drawn i.i.d. according to \prod_{i=1}^n p(u_{1i}), and 2^{nR_2} independent codewords u_2^n(w_2), w_2 \in {1, 2, ..., 2^{nR_2}}, with each element drawn i.i.d. according to \prod_{i=1}^n p(u_{2i}). These codewords form the codebook, which is revealed to the senders and the receiver.

Encoding. To send message indices w_1 \in {1, 2, ..., 2^{nR_1}} and w_2 \in {1, 2, ..., 2^{nR_2}}, select the corresponding codewords u_1^n(w_1) and u_2^n(w_2). At time i, upon observing s_i, transmitter 1 sends x_{1i} = f_1(u_{1i}(w_1), s_i), and transmitter 2 acts similarly.

Decoding. Let A_\epsilon^{(n)} denote the set of typical (U_1^n, U_2^n, Y^n) sequences. If there exists a unique pair (\hat{W}_1, \hat{W}_2) such that (U_1^n(\hat{W}_1), U_2^n(\hat{W}_2), Y^n) \in A_\epsilon^{(n)}, declare that (\hat{W}_1, \hat{W}_2) was sent; otherwise declare an error.

We now analyze the probability of error. Define the events E_{i,j} = {(U_1^n(i), U_2^n(j), Y^n) \in A_\epsilon^{(n)}}. Without loss of generality, assume the messages sent were W_1 = 1 and W_2 = 1. The probability of error becomes

    P_e^{(n)} \le P(E_{1,1}^c) + \sum_{i \ne 1} P(E_{i,1}) + \sum_{j \ne 1} P(E_{1,j}) + \sum_{i \ne 1, j \ne 1} P(E_{i,j}).

From the AEP, P(E_{1,1}^c) \to 0. For the second term we get, for i \ne 1,

    P(E_{i,1}) = P{(U_1^n(i), U_2^n(1), Y^n) \in A_\epsilon^{(n)}}
               = \sum_{(u_1^n, u_2^n, y^n) \in A_\epsilon^{(n)}} p(u_1^n) p(u_2^n, y^n)
               \le \sum_{(u_1^n, u_2^n, y^n) \in A_\epsilon^{(n)}} 2^{-n(H(U_1) - \epsilon)} 2^{-n(H(U_2, Y) - \epsilon)}
               \le 2^{n(H(U_1, U_2, Y) + \epsilon)} 2^{-n(H(U_1) - \epsilon)} 2^{-n(H(U_2, Y) - \epsilon)}
               = 2^{-n(H(U_1) + H(U_2, Y) - H(U_1, U_2, Y) - 3\epsilon)}
               = 2^{-n(I(U_1; U_2, Y) - 3\epsilon)}
               = 2^{-n(I(U_1; Y | U_2) - 3\epsilon)},

where the last equality follows since U_1 and U_2 are independent. Similarly, for j \ne 1 and for (i, j) with i \ne 1 and j \ne 1,

    P(E_{1,j}) \le 2^{-n(I(U_2; Y | U_1) - 3\epsilon)},
    P(E_{i,j}) \le 2^{-n(I(U_1, U_2; Y) - 4\epsilon)}.

We can now bound the probability of error as

    P_e^{(n)} \le P(E_{1,1}^c) + 2^{nR_1} 2^{-n(I(U_1; Y | U_2) - 3\epsilon)} + 2^{nR_2} 2^{-n(I(U_2; Y | U_1) - 3\epsilon)} + 2^{n(R_1 + R_2)} 2^{-n(I(U_1, U_2; Y) - 4\epsilon)},

which tends to zero if the conditions of the theorem are met.

V. CONCLUDING REMARKS

We characterized the capacity region of a few simple multiple user channels with state information in single-letter formulas.
The value of this result may lie in expanding the set of toy examples in network information theory and thus giving a glimpse of the structure of the theory. Questions still remain on other multiple user channels and on the cost of causality of state information.

REFERENCES

[1] C. E. Shannon, "Channels with side information at the transmitter," IBM Journal of Research and Development, vol. 2, pp. 289–293, 1958.
[2] S. I. Gel'fand and M. S. Pinsker, "Coding for channel with random parameters," Problems of Control and Information Theory, vol. 9, no. 1, pp. 19–31, 1980.
[3] C. Heegard and A. A. El Gamal, "On the capacity of computer memories with defects," IEEE Trans. Inform. Theory, vol. 29, pp. 731–739, September 1983.
[4] Y. Steinberg, "On the broadcast channel with random parameters," in Proc. IEEE ISIT 2002, Lausanne, Switzerland, June 2002.
[5] Y. Steinberg, "Coding for the degraded broadcast channel with random parameters, with causal and non-causal side information," accepted for publication in IEEE Trans. Inform. Theory.
[6] T. M. Cover and A. A. El Gamal, "Capacity theorems for the relay channel," IEEE Trans. Inform. Theory, vol. IT-25, pp. 572–584, September 1979.


More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

Joint Source-Channel Coding for the Multiple-Access Relay Channel

Joint Source-Channel Coding for the Multiple-Access Relay Channel Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il

More information

Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel

Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Jun Chen Dept. of Electrical and Computer Engr. McMaster University Hamilton, Ontario, Canada Chao Tian AT&T Labs-Research 80 Park

More information

An Achievable Rate for the Multiple Level Relay Channel

An Achievable Rate for the Multiple Level Relay Channel An Achievable Rate for the Multiple Level Relay Channel Liang-Liang Xie and P. R. Kumar Department of Electrical and Computer Engineering, and Coordinated Science Laboratory University of Illinois, Urbana-Champaign

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Lecture 22: Final Review

Lecture 22: Final Review Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information

More information

Secret Key Agreement Using Asymmetry in Channel State Knowledge

Secret Key Agreement Using Asymmetry in Channel State Knowledge Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

Interference Channels with Source Cooperation

Interference Channels with Source Cooperation Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL

More information

Can Feedback Increase the Capacity of the Energy Harvesting Channel?

Can Feedback Increase the Capacity of the Energy Harvesting Channel? Can Feedback Increase the Capacity of the Energy Harvesting Channel? Dor Shaviv EE Dept., Stanford University shaviv@stanford.edu Ayfer Özgür EE Dept., Stanford University aozgur@stanford.edu Haim Permuter

More information

Capacity of channel with energy harvesting transmitter

Capacity of channel with energy harvesting transmitter IET Communications Research Article Capacity of channel with energy harvesting transmitter ISSN 75-868 Received on nd May 04 Accepted on 7th October 04 doi: 0.049/iet-com.04.0445 www.ietdl.org Hamid Ghanizade

More information

STRONG CONVERSE FOR GEL FAND-PINSKER CHANNEL. Pierre Moulin

STRONG CONVERSE FOR GEL FAND-PINSKER CHANNEL. Pierre Moulin STROG COVERSE FOR GEL FAD-PISKER CHAEL Pierre Moulin Beckman Inst., Coord. Sci. Lab and ECE Department University of Illinois at Urbana-Champaign, USA ABSTRACT A strong converse for the Gel fand-pinsker

More information

Source and Channel Coding for Correlated Sources Over Multiuser Channels

Source and Channel Coding for Correlated Sources Over Multiuser Channels Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology

More information

1174 IET Commun., 2010, Vol. 4, Iss. 10, pp

1174 IET Commun., 2010, Vol. 4, Iss. 10, pp Published in IET Communications Received on 26th June 2009 Revised on 12th November 2009 ISSN 1751-8628 Compress-and-forward strategy for relay channel with causal and non-causal channel state information

More information

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu

More information

An Outer Bound for the Gaussian. Interference channel with a relay.

An Outer Bound for the Gaussian. Interference channel with a relay. An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il

More information

LECTURE 13. Last time: Lecture outline

LECTURE 13. Last time: Lecture outline LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to

More information

On Compound Channels With Side Information at the Transmitter

On Compound Channels With Side Information at the Transmitter IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 52, NO 4, APRIL 2006 1745 On Compound Channels With Side Information at the Transmitter Patrick Mitran, Student Member, IEEE, Natasha Devroye, Student Member,

More information

A Formula for the Capacity of the General Gel fand-pinsker Channel

A Formula for the Capacity of the General Gel fand-pinsker Channel A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore

More information

Cognitive Multiple Access Networks

Cognitive Multiple Access Networks Cognitive Multiple Access Networks Natasha Devroye Email: ndevroye@deas.harvard.edu Patrick Mitran Email: mitran@deas.harvard.edu Vahid Tarokh Email: vahid@deas.harvard.edu Abstract A cognitive radio can

More information

Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory

Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Vincent Y. F. Tan (Joint work with Silas L. Fong) National University of Singapore (NUS) 2016 International Zurich Seminar on

More information

Achievable Rates and Outer Bound for the Half-Duplex MAC with Generalized Feedback

Achievable Rates and Outer Bound for the Half-Duplex MAC with Generalized Feedback 1 Achievable Rates and Outer Bound for the Half-Duplex MAC with Generalized Feedback arxiv:1108.004v1 [cs.it] 9 Jul 011 Ahmad Abu Al Haija and Mai Vu, Department of Electrical and Computer Engineering

More information

Random Access: An Information-Theoretic Perspective

Random Access: An Information-Theoretic Perspective Random Access: An Information-Theoretic Perspective Paolo Minero, Massimo Franceschetti, and David N. C. Tse Abstract This paper considers a random access system where each sender can be in two modes of

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets

Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets Capacity of a Class of Cognitive Radio Channels: Interference Channels with Degraded Message Sets Wei Wu, Sriram Vishwanath and Ari Arapostathis Abstract This paper is motivated by two different scenarios.

More information

Interactive Hypothesis Testing with Communication Constraints

Interactive Hypothesis Testing with Communication Constraints Fiftieth Annual Allerton Conference Allerton House, UIUC, Illinois, USA October - 5, 22 Interactive Hypothesis Testing with Communication Constraints Yu Xiang and Young-Han Kim Department of Electrical

More information

820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY Stefano Rini, Daniela Tuninetti, and Natasha Devroye

820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY Stefano Rini, Daniela Tuninetti, and Natasha Devroye 820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY 2012 Inner and Outer Bounds for the Gaussian Cognitive Interference Channel and New Capacity Results Stefano Rini, Daniela Tuninetti,

More information

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege

More information

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser

More information

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

Source-Channel Coding Theorems for the Multiple-Access Relay Channel Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access

More information

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12 Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user

More information

On Dependence Balance Bounds for Two Way Channels

On Dependence Balance Bounds for Two Way Channels On Dependence Balance Bounds for Two Way Channels Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ravit@umd.edu ulukus@umd.edu

More information

Sum Capacity of Gaussian Vector Broadcast Channels

Sum Capacity of Gaussian Vector Broadcast Channels Sum Capacity of Gaussian Vector Broadcast Channels Wei Yu, Member IEEE and John M. Cioffi, Fellow IEEE Abstract This paper characterizes the sum capacity of a class of potentially non-degraded Gaussian

More information

Capacity Theorems for Relay Channels

Capacity Theorems for Relay Channels Capacity Theorems for Relay Channels Abbas El Gamal Department of Electrical Engineering Stanford University April, 2006 MSRI-06 Relay Channel Discrete-memoryless relay channel [vm 7] Relay Encoder Y n

More information

EE 4TM4: Digital Communications II. Channel Capacity

EE 4TM4: Digital Communications II. Channel Capacity EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.

More information

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem Information Theory for Wireless Communications. Lecture 0 Discrete Memoryless Multiple Access Channel (DM-MAC: The Converse Theorem Instructor: Dr. Saif Khan Mohammed Scribe: Antonios Pitarokoilis I. THE

More information

Multiuser Successive Refinement and Multiple Description Coding

Multiuser Successive Refinement and Multiple Description Coding Multiuser Successive Refinement and Multiple Description Coding Chao Tian Laboratory for Information and Communication Systems (LICOS) School of Computer and Communication Sciences EPFL Lausanne Switzerland

More information

Classical codes for quantum broadcast channels. arxiv: Ivan Savov and Mark M. Wilde School of Computer Science, McGill University

Classical codes for quantum broadcast channels. arxiv: Ivan Savov and Mark M. Wilde School of Computer Science, McGill University Classical codes for quantum broadcast channels arxiv:1111.3645 Ivan Savov and Mark M. Wilde School of Computer Science, McGill University International Symposium on Information Theory, Boston, USA July

More information

Keyless authentication in the presence of a simultaneously transmitting adversary

Keyless authentication in the presence of a simultaneously transmitting adversary Keyless authentication in the presence of a simultaneously transmitting adversary Eric Graves Army Research Lab Adelphi MD 20783 U.S.A. ericsgra@ufl.edu Paul Yu Army Research Lab Adelphi MD 20783 U.S.A.

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

The Capacity Region for Multi-source Multi-sink Network Coding

The Capacity Region for Multi-source Multi-sink Network Coding The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.

More information

The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel

The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel Stefano Rini, Daniela Tuninetti, and Natasha Devroye Department

More information

Capacity of a Class of Semi-Deterministic Primitive Relay Channels

Capacity of a Class of Semi-Deterministic Primitive Relay Channels Capacity of a Class of Semi-Deterministic Primitive Relay Channels Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 2742 ravit@umd.edu

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

Chapter 9. Gaussian Channel

Chapter 9. Gaussian Channel Chapter 9 Gaussian Channel Peng-Hua Wang Graduate Inst. of Comm. Engineering National Taipei University Chapter Outline Chap. 9 Gaussian Channel 9.1 Gaussian Channel: Definitions 9.2 Converse to the Coding

More information

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,

More information

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada

More information

Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels

Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels Optimal Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels Bike Xie, Student Member, IEEE and Richard D. Wesel, Senior Member, IEEE Abstract arxiv:0811.4162v4 [cs.it] 8 May 2009

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Cut-Set Bound and Dependence Balance Bound

Cut-Set Bound and Dependence Balance Bound Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems

More information

SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003

SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003 SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003 SLEPIAN-WOLF RESULT { X i} RATE R x ENCODER 1 DECODER X i V i {, } { V i} ENCODER 0 RATE R v Problem: Determine R, the

More information