Capacity Theorems for Relay Channels

1 Capacity Theorems for Relay Channels
Abbas El Gamal
Department of Electrical Engineering, Stanford University
MSRI, April 2006

2 Relay Channel
- Discrete-memoryless relay channel [vM 71]: sender X, relay observes Y_1 and transmits X_1, receiver observes Y; channel p(y, y_1 | x, x_1). [Figure: W -> Encoder -> X^n -> p(y, y_1 | x, x_1) -> Y^n -> Decoder -> Ŵ, with a relay encoder mapping Y_1^n to X_1^n]
- Goal: reliably communicate a message W ∈ [1, 2^{nR}] from X to Y with the help of the relay node (X_1, Y_1)
- Relay transmission:
  - Classical: x_1i = f_i(y_11, y_12, ..., y_1(i-1))
  - Without-delay: x_1i = f_i(y_11, y_12, ..., y_1i)

3 Capacity
- Capacity C is the supremum over achievable rates R
- An infinite-letter characterization of the capacity is given by
  C = lim_{k→∞} sup_{p(x^k), {f_i}_{i=1}^k} (1/k) I(X^k; Y^k)
- Not considered a satisfactory answer (computationally intractable?)
- A computable description of capacity is not known in general
- What we know:
  - Single-letter characterizations of capacity for some special classes
  - Single-letter upper and lower bounds on C

4 My Encounter with the Relay Channel
- Visited the University of Hawaii in Spring 1976 (ARPA Aloha packet radio project)
- David Slepian introduced me to the problem: send data directly and via a satellite
- Worked on it as part of my PhD thesis and with my first student M. Aref
- Little interest from the information theory and communication communities for years
- Renewed interest motivated by wireless networks; work with S. Zahedi, M. Mohseni, N. Hassanpour, and J. Mammen

5 Early Work
- [CE 79] Cover, El Gamal, "Capacity theorems for the relay channel," IEEE Trans. IT, 1979:
  - Cutset upper bound on C
  - Block Markov coding (decode-and-forward)
  - Capacity of degraded relay channels
  - Capacity of relay channels with feedback
  - Side-information coding (compress-and-forward)
  - General lower bound on C combining partial decode-and-forward and compress-and-forward
- [EA 82] El Gamal, Aref, "The capacity of the semideterministic relay channel," IEEE Trans. IT, 1982 (partial decode-and-forward)
- El Gamal, "On information flow in relay networks," IEEE NTC, 1980 (cutset bound for relay networks)
- Aref, "Information flow in relay networks," PhD thesis, Stanford, 1980 (capacity of degraded relay networks)

6 Recent Work: 2002 to Present
- [EZ 05] El Gamal, Zahedi, "Capacity of a class of relay channels with orthogonal components," IEEE Trans. IT, 2005 (partial decode-and-forward)
- [EMZ 06] El Gamal, Mohseni, Zahedi, "Bounds on capacity and minimum energy-per-bit for AWGN relay channels," to appear in IEEE Trans. IT:
  - Compress-and-forward with time sharing
  - Capacity of FD-AWGN relay channels with linear relaying functions
  - Bounds on minimum energy-per-bit that differ by a factor < 1.7
- [EH 05] El Gamal, Hassanpour, "Relay-without-delay," ISIT, 2005; El Gamal, Hassanpour, "Capacity theorems for the relay-without-delay channel," Allerton, 2005
- [EM 06] El Gamal, Mammen, "Relay networks with delays," UCSD, 2006

7 Outline
- Classical relay:
  - Cutset upper bound
  - Partial decode-and-forward
  - Compress-and-forward
  - Upper and lower bounds on the capacity of AWGN relay channels
- Relay-without-delay:
  - New cutset bound
  - Instantaneous relaying
  - Lower bound and capacity results (see poster by N. Hassanpour)

8 Cutset Upper Bound
- Upper bound on the capacity of the classical relay channel [CE 79]:
  C ≤ max_{p(x, x_1)} min{ I(X, X_1; Y), I(X; Y, Y_1 | X_1) }
- Max-flow min-cut interpretation: the first term is the multiple-access bound (cut around the receiver), the second is the broadcast bound (cut around the sender)
- The bound is tight for all classes where capacity is known
- Not known whether it is tight in general; no tighter bound is known (a brute-force numerical sketch follows)
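The cutset bound can be evaluated numerically for small discrete-memoryless relay channels. The following is a minimal sketch (not from the talk) that estimates max_{p(x,x_1)} min{I(X,X_1;Y), I(X;Y,Y_1|X_1)} by random search over joint input pmfs; the binary channel matrix P used here is an arbitrary illustrative example, not one from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DM relay channel: arbitrary illustrative transitions p(y, y1 | x, x1),
# stored as P[x, x1, y, y1]; all alphabets are binary here.
P = rng.dirichlet(np.ones(4), size=(2, 2)).reshape(2, 2, 2, 2)

def mutual_info(pab):
    """I(A;B) in bits for a joint pmf pab[a, b]."""
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    mask = pab > 0
    return float((pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])).sum())

def cutset_value(q):
    """min{I(X,X1;Y), I(X;Y,Y1|X1)} for a joint input pmf q[x, x1]."""
    joint = q[:, :, None, None] * P            # p(x, x1, y, y1)
    # multiple-access cut: I(X, X1; Y)
    mac = mutual_info(joint.sum(axis=3).reshape(4, 2))
    # broadcast cut: I(X; Y, Y1 | X1) = sum_x1 p(x1) I(X; Y, Y1 | X1 = x1)
    bc = 0.0
    for x1 in range(2):
        px1 = q[:, x1].sum()
        if px1 > 0:
            bc += px1 * mutual_info(joint[:, x1].reshape(2, 4) / px1)
    return min(mac, bc)

# Random search over input distributions p(x, x1); more trials tighten the estimate.
best = max(cutset_value(rng.dirichlet(np.ones(4)).reshape(2, 2)) for _ in range(5000))
print(f"Estimated cutset upper bound: {best:.4f} bits/transmission")
```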

9 Partial Decode-and-Forward
- Lower bound [CE 79]:
  C ≥ max_{p(u, x, x_1)} min{ I(X, X_1; Y), I(X; Y | X_1, U) + I(U; Y_1 | X_1) }
- Transmission in B blocks. [Block diagram: messages W_1, W_2, W_3, ..., W_{B-1} sent over blocks 1, ..., B]
- In block b, the relay and the sender coherently send information (X_1) to Y
- The sender also superposes a new message W_b (through X)
- The relay decodes part of W_b (represented by U)
- The receiver decodes X_1, which helps it decode W_b (U, then X)
- The bound is tight for all cases where capacity is known:
  - Degraded (X → (Y_1, X_1) → Y): set U = X [CE 79]
  - Semideterministic (Y_1 = g(X, X_1)): set U = Y_1 [EA 82]
  - Relay channel with orthogonal components (X = (X_D, X_R), p(y | x_D, x_1) p(y_1 | x_R, x_1)): set U = X_R [EZ 05]

10 Compress-and-Forward
- The relay can help even without decoding any part of the message
- Compress-and-forward scheme [CE 79]:
  - Again transmission in B blocks
  - The relay quantizes the sequence received in the previous block into Ŷ_1 and sends it to the receiver (a la Wyner-Ziv)
  - The relay and the sender do not cooperate (they interfere with each other)
- Achievable rate:
  C ≥ max_{p(x) p(x_1) p(ŷ_1 | y_1, x_1)} min{ I(X; Y, Ŷ_1 | X_1), I(X, X_1; Y) - I(Y_1; Ŷ_1 | X, X_1) }
- Except for some asymptotic results, not optimal for any known class
- Partial decode-and-forward and compress-and-forward can be combined (Theorem 7 of [CE 79])

11 Compress-and-Forward with Time-Sharing [EMZ 06]
- The compress-and-forward achievable rate is not convex
- It can be improved by time-sharing, which gives the bound
  C ≥ max min{ I(X; Y, Ŷ_1 | X_1, Q), I(X, X_1; Y | Q) - I(Y_1; Ŷ_1 | X, X_1, Q) },
  where the maximization is over p(q) p(x | q) p(x_1 | q) p(ŷ_1 | y_1, x_1, q)
- Example: the FD-AWGN relay channel [EMZ 06]

12 Summary
- We don't know the capacity of the DM relay channel in general
- We have upper and lower bounds that coincide in some cases:
  - Cutset upper bound: tight for all cases where we know capacity. Is it tight in general?
  - Partial decode-and-forward lower bound: tight for all classes where we know capacity
  - Compress-and-forward with time-sharing: is it optimal for any class?
  - Partial decode-and-forward and compress-and-forward with time-sharing can be combined: is the combination optimal for any class?
- Need new coding strategies and tighter upper bounds

13 General AWGN Relay Channel
- Consider the following AWGN relay channel: the relay observes Y_1 = aX + Z_1 and the receiver observes Y = X + bX_1 + Z
- a, b > 0 are relative channel gains
- Z_1 ~ N(0, 1) is independent of Z ~ N(0, 1)
- Average power constraints: E(X²) ≤ P and E(X_1²) ≤ P
- Capacity with average power constraints is not known for any 0 < a, b < ∞

14 Cutset and Decode-and-Forward [EMZ 06]
- For the general AWGN relay channel (Y_1 = aX + Z_1, Y = X + bX_1 + Z):
  - Decode-and-forward: C ≥ C( (b√(a² - 1) + √(a² - b²))² P / a² ) if a² ≥ 1 + b²; C ≥ C( max{1, a²} P ) otherwise
  - Cutset upper bound: C ≤ C( (ab + √(1 + a² - b²))² P / (1 + a²) ) if a² ≥ b²; C ≤ C( (1 + a²) P ) otherwise
- Here C(x) = (1/2) log₂(1 + x)
- Partial decode-and-forward reduces to decode-and-forward
- The bounds never coincide (a numerical sketch follows)
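A minimal numerical sketch (not from the talk) of the two expressions above as reconstructed here, assuming unit-variance noise and the same power constraint P at the sender and the relay.

```python
import numpy as np

def C(x):
    """C(x) = (1/2) log2(1 + x)."""
    return 0.5 * np.log2(1.0 + x)

def decode_forward(a, b, P):
    """Decode-and-forward rate for the general AWGN relay channel (reconstructed form)."""
    if a**2 >= 1 + b**2:
        return C((b * np.sqrt(a**2 - 1) + np.sqrt(a**2 - b**2))**2 * P / a**2)
    return C(max(1.0, a**2) * P)

def cutset(a, b, P):
    """Cutset upper bound for the general AWGN relay channel (reconstructed form)."""
    if a**2 >= b**2:
        return C((a * b + np.sqrt(1 + a**2 - b**2))**2 * P / (1 + a**2))
    return C((1 + a**2) * P)

for a, b, P in [(2.0, 1.0, 1.0), (1.0, 2.0, 1.0), (3.0, 1.0, 10.0)]:
    print(f"a={a}, b={b}, P={P}: DF={decode_forward(a, b, P):.3f}, cutset={cutset(a, b, P):.3f}")
```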

15 Weak Channel to Relay (a ≤ 1)
- When a ≤ 1, the decode-and-forward lower bound reduces to C(P), i.e., the direct-transmission lower bound
- Decoding at the relay is not beneficial, since everything the relay can decode is already decoded by the receiver
- But the relay can still help by forwarding some information about its received sequence
- We consider two strategies: compress-and-forward, and linear relaying functions

16 Compress-and-Forward [EMZ 06]
- Achievable rate using compress-and-forward:
  R = max_{p(x) p(x_1) p(ŷ_1 | y_1, x_1)} min{ I(X, X_1; Y) - I(Y_1; Ŷ_1 | X, X_1), I(X; Y, Ŷ_1 | X_1) }
- The optimal choice of p(x) p(x_1) p(ŷ_1 | y_1, x_1) is not known
- Assuming X, X_1 and Ŷ_1 are Gaussian (Ŷ_1 = Y_1 plus compression noise):
  C ≥ C( P ( 1 + a²b²P / (P(1 + a² + b²) + 1) ) )
- As b → ∞, this bound becomes tight (it coincides with the broadcast bound); see the sketch below
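A short sketch of the Gaussian compress-and-forward expression above, under the same assumptions as the previous sketch (unit-variance noise, equal powers); it also checks the b → ∞ behavior against the broadcast bound C((1 + a²)P).

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def compress_forward(a, b, P):
    """Gaussian compress-and-forward rate for the general AWGN relay channel."""
    return C(P * (1 + a**2 * b**2 * P / (P * (1 + a**2 + b**2) + 1)))

# As b grows, the rate approaches the broadcast bound C((1 + a^2) P).
a, P = 1.0, 5.0
for b in [1.0, 10.0, 100.0]:
    print(f"b={b}: CF={compress_forward(a, b, P):.4f}, broadcast bound={C((1 + a**2) * P):.4f}")
```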

17 Comparison of Bounds
[Plot: rate (bits/transmission) vs. SNR for fixed channel gains a and b, comparing the cutset upper bound, decode-and-forward, and compress-and-forward]

18 Frequency-Division AWGN Relay Channel [EMZ 06]
- Model motivated by wireless communication: the relay observes Y_1 = aX + Z_1; the receiver observes Y_S = X + Z_S on the sender's band and Y_R = bX_1 + Z_R on the relay's band
- Bounds on the capacity of the FD-AWGN relay channel:
  - Decode-and-forward: C ≥ C( P (1 + b² + b²P) ) if a² ≥ 1 + b² + b²P; C ≥ C( max{1, a²} P ) otherwise
  - Cutset upper bound: C ≤ C( P (1 + b² + b²P) ) if a² ≥ b² + b²P; C ≤ C( (1 + a²) P ) otherwise
- If a² ≥ 1 + b² + b²P, the capacity is C( P (1 + b² + b²P) ), achieved by decode-and-forward (see the sketch below)
- The model is the same with or without relay delay
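A small sketch (not from the talk) of the FD-AWGN decode-and-forward rate and cutset bound as reconstructed above; when a² ≥ 1 + b² + b²P the two functions should return the same value.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def fd_decode_forward(a, b, P):
    """Decode-and-forward rate for the FD-AWGN relay channel (reconstructed form)."""
    if a**2 >= 1 + b**2 + b**2 * P:
        return C(P * (1 + b**2 + b**2 * P))   # = C(P) + C(b^2 P), the multiple-access cut
    return C(max(1.0, a**2) * P)

def fd_cutset(a, b, P):
    """Cutset upper bound for the FD-AWGN relay channel (reconstructed form)."""
    if a**2 >= b**2 + b**2 * P:
        return C(P * (1 + b**2 + b**2 * P))
    return C((1 + a**2) * P)

a, b, P = 3.0, 1.0, 2.0   # here a^2 = 9 >= 1 + b^2 (1 + P) = 4, so DF meets the cutset bound
print(fd_decode_forward(a, b, P), fd_cutset(a, b, P))
```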

19 FD-AWGN Weak Channel to Relay (a ≤ 1) [EMZ 06]
- Again, if a ≤ 1, decode-and-forward reduces to direct transmission
- The rate can be improved using compress-and-forward; using Gaussian signals, we obtain
  C ≥ C( P ( 1 + a²b²P(1 + P) / (a²P + (1 + P)(1 + b²P)) ) )
- As b → ∞ or P → ∞, compress-and-forward becomes optimal (numerical sketch below)
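The FD-AWGN compress-and-forward expression above can be checked numerically; the sketch below (assuming unit-variance noise) also illustrates the approach to the broadcast bound as P grows.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def fd_compress_forward(a, b, P):
    """Gaussian compress-and-forward rate for the FD-AWGN relay channel (reconstructed form)."""
    return C(P * (1 + a**2 * b**2 * P * (1 + P) / (a**2 * P + (1 + P) * (1 + b**2 * P))))

# As P grows, the rate approaches the broadcast bound C((1 + a^2) P).
a, b = 1.0, 1.0
for P in [1.0, 10.0, 1000.0]:
    print(f"P={P}: CF={fd_compress_forward(a, b, P):.4f}, broadcast bound={C((1 + a**2) * P):.4f}")
```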

20 FD-AWGN Compress-and-Forward with Time-Sharing
- For small P, compress-and-forward is ineffective due to the low SNR of Y
- [Plot: achievable rate vs. P, with and without time-sharing]
- The rate can be improved by time-sharing on the broadcast side (X sends at power P/α for a fraction 0 < α ≤ 1 of the time and is silent otherwise):
  C ≥ max_{0 < α ≤ 1} α C( (P/α) ( 1 + a²b²P(P + α) / (a²P + (P + α)(1 + b²P)) ) )
- (Evaluated numerically in the sketch below)
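A sketch of the time-sharing optimization above; the formula implemented is the reconstruction given on this slide (sender-side time-sharing only), so treat it as illustrative rather than as the exact expression of [EMZ 06].

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def fd_cf_time_sharing(a, b, P, grid=2000):
    """Compress-and-forward with sender-side time-sharing for the FD-AWGN relay channel
    (formula as reconstructed above); returns the best rate over a grid of alpha."""
    alpha = np.linspace(1e-4, 1.0, grid)
    boost = 1 + a**2 * b**2 * P * (P + alpha) / (a**2 * P + (P + alpha) * (1 + b**2 * P))
    rates = alpha * C((P / alpha) * boost)
    return rates.max(), alpha[rates.argmax()]

a, b, P = 1.0, 2.0, 0.1   # low SNR, where time-sharing helps
best_rate, best_alpha = fd_cf_time_sharing(a, b, P)
no_ts = C(P * (1 + a**2 * b**2 * P * (1 + P) / (a**2 * P + (1 + P) * (1 + b**2 * P))))
print(f"best rate {best_rate:.4f} at alpha = {best_alpha:.3f}; without time-sharing: {no_ts:.4f}")
```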

21 Comparison of Bounds for FD-AWGN
[Plot: rate (bits/transmission) vs. SNR for fixed channel gains a and b, comparing the cutset upper bound, compress-and-forward, and decode-and-forward]

22 Linear Relaying
- Assume the relay can only transmit a linear combination of its past received symbols, i.e., x_1(i+1) = Σ_{j=1}^{i} d_{ij} y_{1j}
- In vector notation X_1 = D Y_1, where D = [d_ij] is a k × k strictly lower triangular matrix, X_1 = (X_11, X_12, ..., X_1k)^T and Y_1 = (Y_11, Y_12, ..., Y_1k)^T
- [Figure: W -> X^n, relay applies X_1^n = D Y_1^n, receiver decodes Ŵ from Y^n]
- Let C^(l) be the capacity with linear relaying subject to the power constraints

23 Capacity with Linear Relaying
- Capacity with linear relaying can be expressed as
  C^(l) = sup_k (1/k) C_k^(l) = lim_{k→∞} (1/k) C_k^(l),
  where C_k^(l) = sup_{P_{X^k}, D} I(X^k; Y^k)
- Subject to:
  - Sender power constraint: Σ_{i=1}^k E(X_i²) ≤ kP
  - Relay power constraint: Σ_{i=1}^k E(X_1i²) ≤ kP
  - Causality constraint: D strictly lower triangular

24 Example [EMZ 04]
- Two-letter scheme (k = 2), where X_1, X_2 denote the sender's symbols in the two slots and Y_1, Y_2 the receiver's outputs: let X_1 ~ N(0, 2αP), X_2 = √((1 - α)/α) X_1, and let the relay send X_12 = d Y_11 in the second slot, where d is chosen to satisfy the relay power constraint
- The achievable rate is
  R = (1/2) I(X_1, X_2; Y_1, Y_2) = max_{0 ≤ α ≤ 1} (1/2) C( 2αP ( 1 + (√((1 - α)/α) + abd)² / (1 + b²d²) ) ) ≤ (1/2) C_2^(l) ≤ C^(l)
- (Numerical sketch below)
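A numerical sketch of the two-letter scheme above. The relay power allocation (all of the 2P block energy spent in the second slot, so d² = 2P/(2αPa² + 1)) is an assumption of this sketch consistent with the stated block constraint, not a detail given on the slide.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def two_letter_linear_relaying(a, b, P, grid=2000):
    """Two-letter scheme: X_1 ~ N(0, 2*alpha*P), X_2 a scaled repetition of X_1, and the
    relay amplifies its first observation into the second slot (reconstructed sketch)."""
    alpha = np.linspace(1e-4, 1.0 - 1e-4, grid)
    # assumed relay gain: spend the whole block energy 2P in the second slot
    d = np.sqrt(2.0 * P / (a**2 * 2.0 * alpha * P + 1.0))
    gain = (np.sqrt((1.0 - alpha) / alpha) + a * b * d)**2 / (1.0 + b**2 * d**2)
    rates = 0.5 * C(2.0 * alpha * P * (1.0 + gain))
    return rates.max(), alpha[rates.argmax()]

print(two_letter_linear_relaying(a=1.0, b=2.0, P=1.0))
```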

25 Comparison of Bounds
[Plot: rate (bits/transmission) vs. SNR for fixed channel gains a and b, comparing the cutset upper bound, linear relaying, decode-and-forward, and compress-and-forward]

26 Capacity with Linear Relaying for the General AWGN Channel
- A Gaussian distribution P_{X^k} maximizes C_k^(l) = sup_{P_{X^k}, D} I(X^k; Y^k)
- The problem reduces to:
  Maximize lim_{k→∞} (1/2k) log [ det( (I + abD) Σ_x (I + abD)^T + (I + b²DD^T) ) / det( I + b²DD^T ) ]
  subject to Σ_x ⪰ 0, tr(Σ_x) ≤ kP, tr(a²Σ_x D^T D + D^T D) ≤ kP, and d_ij = 0 for j ≥ i
- Σ_x = E(XX^T) and D are the variables of the problem
- A sequence of non-convex problems (open problem); a sketch that evaluates the objective follows
- A single-letter characterization can be found for the FD-AWGN relay channel
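The k-letter objective above is easy to evaluate for a candidate pair (Σ_x, D); the sketch below does so and checks the constraints, using a two-letter amplify-and-forward-style D as an arbitrary feasible example (unit-variance noise assumed).

```python
import numpy as np

def linear_relaying_objective(a, b, Sigma_x, D):
    """(1/2k) log2 of the determinant ratio above for a given input covariance Sigma_x
    and strictly lower-triangular relay matrix D (a direct transcription of the objective)."""
    k = D.shape[0]
    noise = np.eye(k) + b**2 * D @ D.T                       # covariance of b*D*Z_1 + Z
    signal = (np.eye(k) + a * b * D) @ Sigma_x @ (np.eye(k) + a * b * D).T
    _, num = np.linalg.slogdet(signal + noise)
    _, den = np.linalg.slogdet(noise)
    return (num - den) / (2 * k * np.log(2))

def constraints_ok(a, Sigma_x, D, P, k):
    sender = np.trace(Sigma_x) <= k * P + 1e-9
    relay = np.trace(a**2 * Sigma_x @ D.T @ D + D.T @ D) <= k * P + 1e-9
    causal = np.allclose(D, np.tril(D, k=-1))
    return sender and relay and causal

# Example: k = 2, one amplify-and-forward tap from the first to the second symbol.
a, b, P, k = 1.0, 2.0, 1.0, 2
Sigma_x = P * np.eye(k)
D = np.array([[0.0, 0.0], [np.sqrt(P / (a**2 * P + 1)), 0.0]])
print(constraints_ok(a, Sigma_x, D, P, k), linear_relaying_objective(a, b, Sigma_x, D))
```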

27 FD-AWGN Relay Channel with Linear Relaying
- Example: consider the following amplify-and-forward scheme: X ~ N(0, P) and X_1 = d Y_1, where d is chosen to satisfy the relay power constraint
- The achievable rate for this scheme is
  I(X; Y_S, Y_R) = C( P ( 1 + a²b²P / (1 + (a² + b²)P) ) ) ≤ C^(l)

28 Example (continued)
- This rate is convex in P for small P and concave for large P
- The rate can be improved with time-sharing on the broadcast side: direct transmission at power (1 - θ)P/(1 - α) for a fraction 1 - α of the time, and amplify-and-forward at power θP/α for the remaining fraction α:
  C_FD^(l)(P) ≥ max_{0 < α, θ ≤ 1} [ (1 - α) C( (1 - θ)P / (1 - α) ) + α C( (θP/α) ( 1 + a²b²P / (a²θP + b²P + α) ) ) ]
- (Grid-searched in the sketch below)
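A sketch that grid-searches the (α, θ) time-sharing expression above (as reconstructed here); the pure amplify-and-forward rate of the previous slide is recovered near α = θ = 1.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def fd_af_time_sharing(a, b, P, grid=300):
    """Direct transmission for a fraction (1 - alpha) of the time plus amplify-and-forward
    for a fraction alpha, splitting the sender power as (1 - theta)P / theta*P."""
    alpha = np.linspace(1e-3, 1.0 - 1e-3, grid)[:, None]
    theta = np.linspace(1e-3, 1.0 - 1e-3, grid)[None, :]
    direct = (1 - alpha) * C((1 - theta) * P / (1 - alpha))
    af = alpha * C((theta * P / alpha) * (1 + a**2 * b**2 * P / (a**2 * theta * P + b**2 * P + alpha)))
    rates = direct + af
    i, j = np.unravel_index(rates.argmax(), rates.shape)
    return rates[i, j], float(alpha[i, 0]), float(theta[0, j])

best, al, th = fd_af_time_sharing(a=1.0, b=1.0, P=0.25)
print(f"rate {best:.4f} with alpha={al:.2f}, theta={th:.2f}")
```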

29 Capacity of the FD-AWGN Relay Channel with Linear Relaying
- Capacity with linear relaying for the FD-AWGN model can be expressed as
  C^(l) = sup_k (1/k) C_k^(l) = lim_{k→∞} (1/k) C_k^(l),
  where C_k^(l) = sup_{P_{X^k}, D} I(X^k; Y_S^k, Y_R^k)
- Subject to:
  - Sender power constraint: Σ_{i=1}^k E(X_i²) ≤ kP
  - Relay power constraint: Σ_{i=1}^k E(X_1i²) ≤ kP
  - Causality constraint: D lower triangular

30 Optimization Problem
- A Gaussian input distribution P_{X^k} maximizes C_k^(l)
- Finding C_k^(l) = sup_{P_{X^k}, D} I(X^k; Y_S^k, Y_R^k) reduces to:
  Maximize (1/2k) log [ det( [ I + Σ_x, ab Σ_x D^T ; ab D Σ_x, a²b² D Σ_x D^T + (I + b²DD^T) ] ) / det( [ I, 0 ; 0, I + b²DD^T ] ) ]
  subject to Σ_x ⪰ 0, tr(Σ_x) ≤ kP, tr(a²Σ_x D^T D + D^T D) ≤ kP, and d_ij = 0 for j > i
- Σ_x = E(XX^T) and the lower triangular D are the optimization variables
- This is a non-convex problem in (Σ_x, D) with k² + k variables
- For a fixed D, the problem is convex in Σ_x (water-filling is optimal); however, finding D for a fixed Σ_x is a non-convex problem

31 Key Steps of the Proof
- Can show that diagonal Σ_x and D suffice
- The objective function reduces to:
  Maximize (1/2k) Σ_{i=1}^k log₂( 1 + σ_i ( 1 + a²b²d_i² / (1 + b²d_i²) ) )
  subject to σ_i ≥ 0 for i = 1, 2, ..., k, Σ_{i=1}^k σ_i ≤ kP, and Σ_{i=1}^k d_i²(1 + a²σ_i) ≤ kP
- This is still a non-convex optimization problem: time-sharing between k amplify-forward schemes
- At the optimum point:
  - If σ_i = 0, it is easy to show that d_i = 0
  - If d_i = d_j = 0 (direct transmission), then σ_i = σ_j (by concavity of the log function)

32 Proof (continued)
- The problem reduces to:
  Maximize (1/2k) [ k_0 log₂(1 + kθ_0 P / k_0) + Σ_{i=k_0+1}^k log₂( 1 + σ_i (1 + a²b²d_i² / (1 + b²d_i²)) ) ]
  subject to d_i > 0 for i = k_0+1, ..., k, Σ_{i=k_0+1}^k d_i²(1 + a²σ_i) ≤ kP, and Σ_{i=k_0+1}^k σ_i ≤ k(1 - θ_0)P
- By the KKT conditions, at the optimum there are at most 4 distinct non-zero pairs (σ_j, d_j):
  Maximize (1/2k) [ k_0 log₂(1 + kθ_0 P / k_0) + Σ_{j=1}^4 k_j log₂( 1 + σ_j (1 + a²b²d_j² / (1 + b²d_j²)) ) ]
  subject to d_j > 0 for j = 1, ..., 4, Σ_{j=1}^4 k_j d_j²(1 + a²σ_j) ≤ kP, and Σ_{j=1}^4 k_j σ_j ≤ k(1 - θ_0)P

33 Capacity of the FD-AWGN Relay Channel with Linear Relaying [EMZ 06]
- Taking the limit as k → ∞, we obtain
  C^(l) = max [ α_0 C(θ_0 P / α_0) + Σ_{j=1}^4 α_j C( (θ_j P / α_j) ( 1 + a²b²η_j / (1 + b²η_j) ) ) ],
  subject to α_j, θ_j ≥ 0, η_j > 0, Σ_{j=0}^4 α_j = Σ_{j=0}^4 θ_j = 1, and Σ_{j=1}^4 η_j (a²θ_j P + α_j) = P
- Time-sharing between direct transmission and 4 amplify-forward regimes
- A non-convex optimization problem, but with only 14 variables and 3 constraints (a numerical sketch follows)
- We haven't been able to find any example where the optimum requires more than 2 amplify-forward regimes (in addition to direct transmission)
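The finite-dimensional expression above can be explored by random search; the sketch below samples feasible (α_j, θ_j, η_j) and keeps the best value. It is only a heuristic lower estimate of the non-convex maximum, under the reconstruction of the formula given here.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def fd_linear_relaying_capacity(a, b, P, trials=100_000, seed=0):
    """Random-search evaluation of the linear-relaying capacity expression:
    time-sharing between direct transmission (j = 0) and up to 4 amplify-and-forward
    regimes (j = 1..4). A sketch, not a certified global optimum."""
    rng = np.random.default_rng(seed)
    alpha = rng.dirichlet(np.ones(5), size=trials)          # alpha_0..alpha_4, sum to 1
    theta = rng.dirichlet(np.ones(5), size=trials)          # theta_0..theta_4, sum to 1
    u = rng.dirichlet(np.ones(4), size=trials)              # splits the eta constraint
    # enforces sum_j eta_j (a^2 theta_j P + alpha_j) = P
    eta = u * P / (a**2 * theta[:, 1:] * P + alpha[:, 1:])
    direct = alpha[:, 0] * C(theta[:, 0] * P / alpha[:, 0])
    af = (alpha[:, 1:] * C((theta[:, 1:] * P / alpha[:, 1:])
                           * (1 + a**2 * b**2 * eta / (1 + b**2 * eta)))).sum(axis=1)
    return float((direct + af).max())

print(fd_linear_relaying_capacity(a=1.0, b=2.0, P=1.0))
```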

34 Comparison with Other Schemes
[Plot: rate (bits/transmission) vs. SNR for fixed channel gains a and b, comparing the cutset upper bound, linear relaying, compress-and-forward, and decode-and-forward]
- As b → ∞, C^(l) approaches the cutset upper bound

35 Summary
- For the general AWGN relay channel:
  - The bounds are never tight for any 0 < a, b < ∞
  - Compress-and-forward becomes optimal as b → ∞
  - Linear relaying can beat compress-and-forward; we don't have a computable expression for the capacity with linear relaying
- For the FD-AWGN relay channel:
  - Decode-and-forward is optimal (the lower bound coincides with the cutset bound) when a² ≥ 1 + b² + b²P
  - Compress-and-forward becomes optimal as b → ∞ or P → ∞; time-sharing improves the rate for small P
  - We have a computable form of the capacity with linear relaying: time-sharing between direct transmission and at most 4 amplify-and-forward regimes

36 Relay-Without-Delay (RWD)
- Suppose the delay from the sender X to the receiver Y is longer than the delay to the relay (X_1, Y_1), so that x_1i can depend on its current received symbol in addition to the past ones, i.e.,
  Relay-without-delay: x_1i = f_i(y_11, y_12, ..., y_1(i-1), y_1i)
- [Figure: W -> Encoder -> X^n -> p(y_1 | x) p(y | x, x_1, y_1) -> Y^n -> Decoder -> Ŵ, with a relay encoder mapping Y_1^n to X_1^n]
- Again, we wish to reliably communicate W ∈ [1, 2^{nR}] from X to Y

37 Generalized Cutset Bound
- Upper bound on the capacity of the RWD channel:
  C ≤ max_{p(v, x), f(v, y_1)} min{ I(V, X; Y), I(X; Y, Y_1 | V) },
  where x_1 = f(v, y_1) and |V| ≤ min{ |Y_1|, |X| |X_1| } + 1
- The proof follows similar lines to the cutset bound:
  - Define V_i = Y_1^{i-1}, so that X_1i = f_i(V_i, Y_1i)
  - Key observation: (W, Y^{i-1}) → (X_i, V_i) → (Y_i, Y_1i) forms a Markov chain
- This bound can be strictly larger than the cutset bound
- The bound applies to the classical case, where x_1 is a function of v only, and it then reduces to the classical cutset bound

38 Proof
- Use the standard Fano's inequality argument: nR ≤ I(W; Y^n) + nε_n, where ε_n → 0
- Consider
  I(W; Y^n) = Σ_{i=1}^n I(W; Y_i | Y^{i-1})
  = Σ_{i=1}^n ( H(Y_i | Y^{i-1}) - H(Y_i | W, Y^{i-1}) )
  ≤ Σ_{i=1}^n ( H(Y_i) - H(Y_i | W, Y^{i-1}, Y_1^{i-1}, X_i) )
  = Σ_{i=1}^n ( H(Y_i) - H(Y_i | Y_1^{i-1}, X_i) )
  = Σ_{i=1}^n I(X_i, V_i; Y_i) = n I(X_Q, V_Q; Y_Q | Q) ≤ n I(X, V; Y)

39 Proof (continued)
- Next consider
  I(W; Y^n) ≤ I(W; Y^n, Y_1^n)
  = Σ_{i=1}^n I(W; Y_i, Y_1i | Y^{i-1}, Y_1^{i-1})
  = Σ_{i=1}^n ( H(Y_i, Y_1i | Y^{i-1}, Y_1^{i-1}) - H(Y_i, Y_1i | Y^{i-1}, Y_1^{i-1}, W) )
  ≤ Σ_{i=1}^n ( H(Y_i, Y_1i | V_i) - H(Y_i, Y_1i | Y^{i-1}, Y_1^{i-1}, W, X_i) )
  = Σ_{i=1}^n ( H(Y_i, Y_1i | V_i) - H(Y_i, Y_1i | V_i, X_i) )
  = Σ_{i=1}^n I(X_i; Y_i, Y_1i | V_i) ≤ n I(X; Y, Y_1 | V)

40 Instantaneous Relaying
- Any lower bound on the capacity of the classical relay channel, e.g., decode-and-forward, is also a lower bound for the RWD channel
- Instantaneous relaying: ignore the past and set X_1 = f(Y_1)
- This scheme can achieve R = max_{p(x), f(y_1)} I(X; Y)
- This strategy can be optimal, and it can achieve a higher rate than the classical cutset bound!

41 Sato Example
- Consider the following DM relay channel [Sato 76]: Y_1 = X, with X ∈ {0, 1, 2} and Y ∈ {0, 1, 2}. [Table: Y as a function of X for X_1 = 0 and for X_1 = 1]
- The capacity of the classical case is 1.161878 bits/transmission [CE 79], and it coincides with the cutset bound
- The upper bound on the capacity without delay gives C ≤ 2(log 3 - 1) ≈ 1.1699 bits/transmission
- Instantaneous relaying with input distribution (3/9, 2/9, 4/9) and the mapping from X to X_1 given by 0 → 0, 1 → 1, 2 → 1 achieves this bound!
- The capacity of this RWD channel is therefore greater than the classical capacity, which equals the cutset bound

42 AWGN RWD Channel
- Same model as before (but without the relay coding delay): Y_1 = aX + Z_1, Y = X + bX_1 + Z
- a, b > 0 are relative channel gains; Z_1 ~ N(0, 1) is independent of Z ~ N(0, 1)
- x_1i = f_i(y_11, y_12, ..., y_1i)
- Average power constraints: E(X²) ≤ P and E(X_1²) ≤ P
- Recall that the capacity of the classical case is not known for any 0 < a, b < ∞

43 Amplify-and-Forward
- Consider the following special case of instantaneous relaying: X_1 = hY_1
- If a ≤ b, the upper bound gives C ≤ C( (1 + a²)P )
- Consider the case where a² ≤ b² min{1, P/(a²P + 1)}; amplify-and-forward then also gives
  C ≥ max_{p(x), E(X²) ≤ P} I(X; Y) = C( (1 + a²)P )
- Capacity in this case coincides with the classical cutset bound (see the sketch below)
- Can the capacity of the AWGN RWD channel be greater than the cutset bound?
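A quick check (unit-variance noise assumed) that instantaneous amplify-and-forward attains C((1 + a²)P) when a² ≤ b² min{1, P/(a²P + 1)}: the sketch maximizes the end-to-end SNR over the feasible relay gain h.

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def rwd_amplify_forward(a, b, P, grid=20000):
    """Instantaneous amplify-and-forward X_1 = h*Y_1 for the AWGN relay-without-delay channel.
    Returns the best rate over the relay gains h allowed by the power constraint."""
    h_max = np.sqrt(P / (a**2 * P + 1.0))    # from E(X_1^2) = h^2 (a^2 P + 1) <= P
    h = np.linspace(-h_max, h_max, grid)
    snr = (1.0 + a * b * h)**2 * P / (1.0 + b**2 * h**2)
    return C(snr).max()

a, b, P = 1.0, 4.0, 2.0
# Here a^2 <= b^2 * min(1, P / (a^2 P + 1)), so AF should reach C((1 + a^2) P).
print(rwd_amplify_forward(a, b, P), C((1.0 + a**2) * P))
```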

44 New Lower Bound on the Capacity of the RWD Channel
- Idea: use a superposition of partial decode-and-forward and instantaneous relaying
- This achieves the lower bound
  C ≥ max_{p(u, v, x), f(v, y_1)} min{ I(V, X; Y), I(U; Y_1 | V) + I(X; Y | V, U) }
- U represents the information decoded by the relay in partial decode-and-forward, and V (replacing X_1) represents the information sent coherently by the sender and the relay to help the receiver decode the previous U
- Each x_1^n codeword in partial decode-and-forward is replaced by a v^n codeword, and at time i the relay sends x_1i = f(v_i, y_1i)
- This bound coincides with the generalized cutset bound for degraded and semideterministic RWD channels
- The same superposition idea can be used to obtain a new compress-and-forward lower bound

45 Achievable Rate for the AWGN RWD Channel
- Restrict the previous scheme to a superposition of decode-and-forward and amplify-and-forward:
  - Let U = X = V + X', where V ~ N(0, (1 - α)P) and X' ~ N(0, αP) are independent, for 0 ≤ α ≤ 1
  - Let X_1 be a normalized convex combination of Y_1 and V: X_1 = h(βY_1 + (1 - β)V), 0 ≤ β ≤ 1
- Using the power constraint on the relay sender, we obtain
  h² = P / [ (1 - α)P ((1 - β) + aβ)² + β²(a²αP + N) ]
- Substituting the above choice of (U, V, X_1, X), we obtain
  C_RWD-AWGN ≥ max_{α, β} min{ C(a²αP), C( [ (1 - α)P (1 + bh(aβ + (1 - β)))² + αP (1 + abhβ)² ] / (1 + (βbh)²) ) }
- (See the numerical sketch below)
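A sketch that grid-searches the (α, β) expression above, with both noise variances set to N = 1 as an assumption (the slide leaves N explicit only inside h²).

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)

def rwd_df_plus_af(a, b, P, N=1.0, grid=400):
    """Superposition of decode-and-forward and instantaneous amplify-and-forward for the
    AWGN RWD channel, using the reconstructed rate expression above."""
    alpha = np.linspace(1e-4, 1.0, grid)[:, None]
    beta = np.linspace(1e-6, 1.0, grid)[None, :]
    abar, bbar = 1.0 - alpha, 1.0 - beta
    h = np.sqrt(P / (abar * P * (bbar + a * beta)**2 + beta**2 * (a**2 * alpha * P + N)))
    relay_decoding = C(a**2 * alpha * P)
    coherent = C((abar * P * (1 + b * h * (a * beta + bbar))**2
                  + alpha * P * (1 + a * b * h * beta)**2) / (1.0 + (beta * b * h)**2))
    return float(np.minimum(relay_decoding, coherent).max())

a, b, P = 2.0, 1.0, 4.0
print(rwd_df_plus_af(a, b, P))
```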

46 Achievable Rate vs. Cutset Bound vs. Upper Bound
- Assume a = 2 and b = 1
[Plot: rate (bits/transmission) vs. P, comparing the achievable rate for the AWGN RWD channel, the cutset bound, and the upper bound on the capacity of the AWGN RWD channel]

47 Classical vs. RWD Relay Channels
- Capacity: Classical: not known in general. RWD: not known in general; can be greater than the classical capacity
- Upper bound: Classical: max_{p(x, x_1)} min{ I(X, X_1; Y), I(X; Y, Y_1 | X_1) } (the cutset bound). RWD: max_{p(v, x), f(v, y_1)} min{ I(V, X; Y), I(X; Y, Y_1 | V) }; can be greater than the cutset bound
- Degraded capacity (X → (X_1, Y_1) → Y): Classical: max_{p(x, x_1)} min{ I(X, X_1; Y), I(X; Y_1 | X_1) } (decode-forward). RWD: max_{p(v, x), f(v, y_1)} min{ I(V, X; Y), I(X; Y_1 | V) } (decode-forward + instantaneous coding)
- Semideterministic capacity (Y_1 = g(X, X_1)): Classical: max_{p(x, x_1)} min{ I(X, X_1; Y), I(X; Y, Y_1 | X_1) }. RWD: max_{p(v, x), f(v, y_1)} min{ I(V, X; Y), I(X; Y, Y_1 | V) } (partial decode-forward + instantaneous coding)
- AWGN: Classical: capacity not known for any a, b > 0. RWD: capacity known for a² ≤ b² min{1, P/(a²P + N)} (amplify-forward); in general the capacity can be greater than the classical capacity

48 Conclusion
- Overview of known upper and lower bounds for relay and RWD channels
- The bounds are tight only in special cases
- We know as much about the RWD channel as about the classical relay channel; the main difference is instantaneous relaying, but it alone is not sufficient
- A generalized cutset bound is needed for the RWD channel
- Generalization to relay networks with delays (see poster with J. Mammen)
