Optimal Natural Encoding Scheme for Discrete Multiplicative Degraded Broadcast Channels


Optimal Natural Encoding Scheme for Discrete Multiplicative Degraded Broadcast Channels

Bike Xie, Student Member, IEEE, and Richard D. Wesel, Senior Member, IEEE

Abstract: Certain degraded broadcast channels (DBCs) have the property that the boundary of the capacity region can be achieved by an encoder that combines independent codebooks (one for each receiver) using the same single-letter function that adds distortion to the channel. We call this the natural encoder for the DBC. Natural encoders are known to achieve the capacity region boundary of the broadcast Gaussian channel and the broadcast binary-symmetric channel. Recently, they have also been shown to achieve the capacity region of the broadcast Z channel. This paper shows that natural encoding achieves the capacity region boundary for discrete multiplicative DBCs. The optimality of the natural encoder also leads to a relatively simple expression for the capacity region of discrete multiplicative DBCs.

Index Terms: Conditional entropy bound, degraded broadcast channel, discrete multiplicative degraded broadcast channel, natural encoding

I. Introduction

In the 1970s, Cover [1], Bergmans [2] and Gallager [3] established the capacity region for degraded broadcast channels (DBCs). The general optimal transmission strategy for achieving the boundary of the capacity region of a DBC is a joint encoding scheme. Certain DBCs have the property that the boundary of the capacity region can be achieved by an encoder that combines independent codebooks (one for each receiver) using the same single-letter function that adds distortion to the channel. We call this the natural encoder for the DBC. Natural encoders are known to achieve the capacity region boundary of the broadcast Gaussian channel [4], the broadcast binary-symmetric channel [1], [5], and discrete additive DBCs [6]. Recently, natural encoding has also been shown to achieve the capacity region of the two-user broadcast Z channel [7].
The discrete multiplicative degraded broadcast channel (DM-DBC) is a discrete DBC whose channel outputs are discrete multiplications (multiplications in a finite field) of the channel input and noise. This paper decomposes the DM-DBC into a group-additive DBC with an extra erasure (zero) symbol. Based on this decomposition, this paper studies a conditional entropy bound for the DM-DBC and proves that the natural encoding approach achieves the boundary of the capacity region for DM-DBCs.

This work was supported by the Defense Advanced Research Projects Agency, SPAWAR Systems Center, San Diego, California, under Grant N, and by Rockwell Collins through contract #. The authors are with the Electrical Engineering Department, University of California, Los Angeles, CA, USA (e-mail: xbk@ee.ucla.edu; wesel@ee.ucla.edu). This conference paper presents part of the arXiv submission [8], which will be submitted to the IEEE Transactions on Information Theory. For more details, please refer to [8].

This paper is organized as follows. Section II provides definitions and states some results from [8] that will be useful. Section III defines the discrete multiplicative DBC and decomposes it into a group-additive DBC with an extra erasure (zero) symbol. Section IV computes the optimal input distribution achieving the boundary of the capacity region for DM-DBCs. Section V introduces the natural encoding approach and proves that it achieves the capacity region for DM-DBCs. Section VI provides the conclusion.

II. Definitions and Preliminaries

Let X → Y → Z be a discrete memoryless DBC with X ∈ {1, ..., k}, Y ∈ {1, ..., n} and Z ∈ {1, ..., m}. Let T_YX be an n × k stochastic matrix with entries T_YX(j, i) = Pr(Y = j | X = i) and let T_ZX be an m × k stochastic matrix with entries T_ZX(j, i) = Pr(Z = j | X = i). Thus, T_YX and T_ZX are the marginal transition probability matrices of the DBC.

A. Conditional entropy bound F*, DBC capacity regions

Our results depend heavily on the function F*, which we will now define.
Let the vector q in the simplex Δ_k of probability k-vectors be the distribution of the channel input X. For any H(Y|X) ≤ s ≤ H(Y), define the function F*_{T_YX, T_ZX}(q, s) as the infimum of H(Z|U) with respect to all discrete random variables U such that a) H(Y|U) = s; b) U and (Y, Z) are conditionally independent given X, i.e., the sequence U, X, Y, Z forms a Markov chain U → X → Y → Z. The function F*(·) is an extension of the function F(·) introduced in [5]. We will use F*_{T_YX, T_ZX}(q, s), F*(q, s) and F*(s) interchangeably.

Theorem 1: F*_{T_YX, T_ZX}(q, s) is jointly convex in (q, s) and nondecreasing in s. The infimum in its definition is attainable. (See [8] for proof.)

Theorem 2: The capacity region for the discrete memoryless DBC X → Y → Z is the closure of the convex hull of all rate pairs (R₁, R₂) satisfying

0 ≤ R₁ ≤ I(X; Y), (1)
R₂ ≤ H(Z) − F*_{T_YX, T_ZX}(q, R₁ + H(Y|X)), (2)

for some q ∈ Δ_k. (See [8] for proof.)

B. Definitions of C, C*_q, and the (ξ, η)-plane for DBCs

For any choice of the integer l, w = [w₁, ..., w_l]ᵀ ∈ Δ_l and p_j ∈ Δ_k, j = 1, ..., l, let U be an l-ary random variable with distribution w, and let T_XU = [p₁, ..., p_l] be the transition probability matrix from U to X. We can
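The degradedness structure underlying these definitions is easy to check numerically. The sketch below is an illustrative aid, not from the paper; the channel matrices are hypothetical examples. It builds the marginal transition matrices of a physically degraded DBC and verifies that T_ZX is the matrix product T_ZY T_YX, as required by the Markov chain X → Y → Z.

```python
import numpy as np

# Hypothetical degraded broadcast channel X -> Y -> Z.
# Columns are conditional distributions and must sum to 1.
T_YX = np.array([[0.9, 0.1],
                 [0.1, 0.9]])          # Pr(Y=j | X=i), a BSC(0.1)
T_ZY = np.array([[0.8, 0.2],
                 [0.2, 0.8]])          # Pr(Z=j | Y=i), a BSC(0.2)

# Marginal transition matrix of the degraded receiver.
T_ZX = T_ZY @ T_YX                     # Pr(Z=j | X=i)

assert np.allclose(T_ZX.sum(axis=0), 1.0)   # still a stochastic matrix
print(T_ZX)        # BSC with crossover 0.1*0.8 + 0.9*0.2 = 0.26
```

The same check applies for arbitrary alphabet sizes: any column-stochastic T_ZY composed with T_YX yields a valid marginal T_ZX for the degraded receiver.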

compute

p_X = T_XU w = Σ_{j=1}^l w_j p_j, (3)
ξ = H(Y|U) = Σ_{j=1}^l w_j h_n(T_YX p_j), (4)
η = H(Z|U) = Σ_{j=1}^l w_j h_m(T_ZX p_j), (5)

where h_n : Δ_n → R is the entropy function, i.e., h_n(p₁, ..., p_n) = −Σ_i p_i ln p_i. Let C be the set of all (p_X, ξ, η) satisfying (3), (4) and (5) for some choice of l, w and p_j. C is compact, connected, and convex. Let C̄ = {(ξ, η) | (q, ξ, η) ∈ C} be the projection of the set C onto the (ξ, η)-plane. Define C*_q = {(ξ, η) | (q, ξ, η) ∈ C} as the projection onto the (ξ, η)-plane of the subset of C where p_X = q. C̄ and C*_q are also compact and convex. By definition, F*_{T_YX, T_ZX}(q, s) is the infimum of all η for which C*_q contains the point (s, η). Fig. 1(a) shows how F*_{T_YX, T_ZX}(q, s) forms a lower boundary of the region C*_q.

C. U's that achieve F* and optimal U's for DBCs

In Fig. 1(a), the line with slope λ in the (ξ, η)-plane supporting C*_q has the equation η = λξ + ψ(q, λ), where ψ(q, λ) is the η-intercept of the tangent line with slope λ for F*_{T_YX, T_ZX}(q, s). Let φ(q, λ) = h_m(T_ZX q) − λ h_n(T_YX q). As illustrated in Fig. 1(b), ψ(q, λ) is the lower convex envelope of φ(q, λ) on Δ_k, as shown in [8]. For any λ ≥ 0, the associated point on the boundary of the capacity region may be found (from its unique value of λR₁ + R₂) as follows:

max_q max {λR₁ + R₂ | p_X = q}
= max_q max_s {H(Z) − F*(q, s) + λ(s − H(Y|X))}
= max_q (H(Z) − λH(Y|X) − min_s {F*(q, s) − λs})
= max_q (H(Z) − λH(Y|X) − ψ(q, λ)). (6)

Theorem 3: This theorem has two parts.
i) For any fixed λ ≥ 0, if a point of the graph of ψ(·, λ) is the convex combination of l points of the graph of φ(·, λ) with arguments p_j and weights w_j for j = 1, ..., l, then

F*(Σ_j w_j p_j, Σ_j w_j h_n(T_YX p_j)) = Σ_j w_j h_m(T_ZX p_j).

Furthermore, for the fixed input distribution q = Σ_j w_j p_j, the optimal transmission strategy achieving the maximum of λR₁ + R₂ is determined by l, w and the p_j. In particular, the optimal transmission strategy is |U| = l, Pr(U = j) = w_j and p_{X|U=j} = p_j, where p_{X|U=j} denotes the conditional distribution of X given U = j.
ii) For a predetermined channel input distribution q, if the transmission strategy |U| = l, Pr(U = j) = w_j and p_{X|U=j} = p_j achieves max{λR₁ + R₂ | Σ_j w_j p_j = q}, then the point (q, ψ(q, λ)) is the convex combination of l points of the graph of φ(·, λ) with arguments p_j and weights w_j for j = 1, ..., l. (See [8] for proof.)

Fig. 1. (a) Illustration of the curve F*(q, s), shown in bold, the region C*_q, and the point (0, ψ(q, λ)); (b) illustration of φ(·, λ) and ψ(·, λ) for the broadcast Z channel.

D. Example: the two-user broadcast Z channel

An important example, because of the decomposition we will apply to the discrete multiplicative DBC, is the broadcast Z channel with marginal transition probability matrices

T_YX = [1 α₁; 0 β₁] and T_ZX = [1 α₂; 0 β₂], (7)

where 0 < α₁ ≤ α₂ < 1 and β₁ + α₁ = β₂ + α₂ = 1. Let β = β₂/β₁. For λ ≥ β, φ''_p(p, λ) ≥ 0 for all 0 ≤ p ≤ 1; therefore φ(p, λ) is convex in p and thus φ(p, λ) = ψ(p, λ) for all 0 ≤ p ≤ 1. For 0 < λ < β, φ(p, λ) is concave in p for p ∈ [0, (β₂ − λβ₁)/((1 − λ)β₁β₂)] and convex in p for p ∈ [(β₂ − λβ₁)/((1 − λ)β₁β₂), 1]. The graph of φ(·, λ) in this case is shown in Fig. 1(b). Since φ(0, λ) = 0, ψ(·, λ), the lower convex envelope of φ(·, λ), is constructed by drawing the tangent through the origin. Let (p∗, φ(p∗, λ)) be the point of contact. The value of p∗ is determined by the equation φ'_p(p∗, λ) = φ(p∗, λ)/p∗, i.e.,

ln(1 − β₂ p∗) = λ ln(1 − β₁ p∗). (8)

E. Input-symmetric degraded broadcast channels

Let Φ_n denote the set of all n × n permutation matrices. An n-input m-output channel with m × n transition probability matrix T is input-symmetric if the set

G_T = {G ∈ Φ_n | ∃ Π ∈ Φ_m s.t. T G = Π T}, (9)

where Π and G are permutation matrices, is transitive, which means each element of {1, ..., n} can be mapped to every other element of {1, ..., n} by some permutation matrix in G_T [5] [9]. An important property of input-symmetric channels is that the uniform input distribution achieves capacity.
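Equation (8) is a one-dimensional root-finding problem, so the tangent point p∗ is easy to compute. The sketch below is an illustrative aid, not from the paper; the parameters α₁, α₂ and λ are hypothetical choices in the regime 0 < λ < β where the tangent construction applies. It locates p∗ by bisection on g(p) = ln(1 − β₂ p) − λ ln(1 − β₁ p), which is negative near p = 0 (since λβ₁ < β₂) and positive near p = 1 for these parameters.

```python
import math

# Hypothetical broadcast Z channel parameters (alpha1 <= alpha2).
alpha1, alpha2 = 0.1, 0.5
beta1, beta2 = 1 - alpha1, 1 - alpha2   # beta1 = 0.9, beta2 = 0.5
lam = 0.4                                # slope lambda < beta2/beta1 ~= 0.556

# Tangent-through-the-origin condition, eq. (8):
#   ln(1 - beta2*p) = lam * ln(1 - beta1*p)
def g(p):
    return math.log(1 - beta2 * p) - lam * math.log(1 - beta1 * p)

# Bisection: g < 0 near p = 0 and g > 0 near p = 1 for these parameters.
lo, hi = 1e-9, 1 - 1e-9
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if g(mid) < 0:
        lo = mid
    else:
        hi = mid
p_star = 0.5 * (lo + hi)
assert abs(g(p_star)) < 1e-9
print(p_star)   # contact point of the tangent through the origin
```

For λ ≥ β the loop is unnecessary, since φ is then convex everywhere and coincides with its own lower envelope.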
Definition 1: Input-Symmetric Degraded Broadcast Channel: A discrete memoryless DBC X → Y → Z with |X| = k, |Y| = n and |Z| = m is input-symmetric if the set

G_{T_YX, T_ZX} = G_{T_YX} ∩ G_{T_ZX} (10)
= {G ∈ Φ_k | ∃ Π_Y ∈ Φ_n, Π_Z ∈ Φ_m s.t. T_YX G = Π_Y T_YX, T_ZX G = Π_Z T_ZX} (11)

Fig. 2. The group-additive degraded broadcast channel.

Fig. 3. The group-addition encoding approach.

is transitive.

Lemma 1: For any permutation matrix G ∈ G_{T_YX, T_ZX} and any (p, ξ, η) ∈ C, one has (Gp, ξ, η) ∈ C. (See [8] for proof.)

Corollary 1: For all p ∈ Δ_k and G ∈ G_{T_YX, T_ZX}, one has C*_{Gp} = C*_p, and so F*(Gp, s) = F*(p, s) for any H(Y|X) ≤ s ≤ H(Y). (See [8] for proof.)

Corollary 2: A uniform distribution on the alphabet of X achieves the capacity region of any input-symmetric DBC. (See [8] for proof.)

III. Group-additive and Multiplicative DBCs

A. Group-additive DBC

Definition 2: Group-additive (GA) Degraded Broadcast Channel: A DBC X → Y → Z with X, Y, Z ∈ {1, ..., n} is a group-additive DBC if there exist two n-ary random variables N₁ and N₂ such that Y ≡ X ⊕ N₁ and Z ≡ Y ⊕ N₂ as shown in Fig. 2, where ≡ denotes identical distribution and ⊕ denotes group addition.

The group-additive DBC is input-symmetric. It includes the broadcast binary-symmetric channel and the discrete additive degraded broadcast channel [6] as special cases. Fig. 3 shows the group-addition encoding approach, which is to independently encode the message for each of the two users and broadcast the group addition of the two resulting codewords. The group-addition encoding approach achieves the boundary of the capacity region for GA-DBCs [8].

B. Discrete multiplicative DBC

Definition 3: Discrete Multiplicative Degraded Broadcast Channel: A discrete memoryless DBC X → Y → Z with X, Y, Z ∈ {0, 1, ..., n} is a discrete multiplicative DBC if there exist two (n + 1)-ary random variables N₁ and N₂ such that Y ≡ X ⊗ N₁ and Z ≡ Y ⊗ N₂ as shown in Fig. 4, where ⊗ denotes discrete multiplication.

Fig. 5. The channel structure of a discrete multiplicative degraded broadcast channel.

By the definition of field and group, the discrete multiplication of zero and any element of {0, 1, ..., n} is always zero. Also, {1, ..., n} forms a group under discrete multiplication. Hence, the DM-DBC X → Y → Z has the channel structure shown in Fig. 5.
The sub-channel X̃ → Ỹ → Z̃ is a GA-DBC with marginal transition probability matrices T_ỸX̃ and T_Z̃X̃ = T_Z̃Ỹ T_ỸX̃, where X̃, Ỹ, Z̃ ∈ {1, ..., n}. For the DM-DBC X → Y → Z, if the channel input is zero, then the channel outputs Y and Z are zero. If the channel input is a non-zero symbol, the channel output Y is zero with probability α₁ and Z is zero with probability α₂, where α₂ = α₁ + (1 − α₁)α̃₂ and α̃₂ is the probability that a non-zero Y yields Z = 0. Therefore, the marginal transition probability matrices for X → Y → Z are

T_YX = [1 α₁1ᵀ; 0 (1 − α₁)T_ỸX̃], (12)

T_ZY = [1 α̃₂1ᵀ; 0 (1 − α̃₂)T_Z̃Ỹ], (13)

and

T_ZX = T_ZY T_YX = [1 α₂1ᵀ; 0 (1 − α₂)T_Z̃X̃], (14)

where 1 is an all-one vector and 0 is an all-zero vector.

Fig. 4. The discrete multiplicative degraded broadcast channel.

IV. Optimal Input Distribution for DM-DBCs

The sub-channel X̃ → Ỹ → Z̃ is a group-additive DBC, which is input-symmetric, and hence G_{T_ỸX̃, T_Z̃X̃} is transitive. For any n × n permutation matrix G̃ ∈ G_{T_ỸX̃, T_Z̃X̃} with T_ỸX̃ G̃ = Π_Ỹ T_ỸX̃ and T_Z̃X̃ G̃ = Π_Z̃ T_Z̃X̃, the (n + 1) × (n + 1) permutation matrix

G = [1 0ᵀ; 0 G̃] (15)

satisfies

T_YX G = [1 α₁1ᵀ; 0 (1 − α₁)T_ỸX̃ G̃] = [1 0ᵀ; 0 Π_Ỹ] T_YX, (16)

and so G ∈ G_{T_YX}. Similarly, G ∈ G_{T_ZX}, and hence G ∈ G_{T_YX, T_ZX}. Therefore, any non-zero element of {0, 1, ..., n} can be mapped to any other non-zero element of {0, 1, ..., n} by some permutation matrix in G_{T_YX, T_ZX}; however, no matrix in G_{T_YX, T_ZX} maps zero to a non-zero element or a non-zero element to zero. Hence, any permutation matrix G ∈ G_{T_YX, T_ZX} has the structure (15) for
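The block structure of (12)-(14) can be verified mechanically. The following sketch is an illustrative check, not from the paper; the sub-channel noise distributions and erasure probabilities are hypothetical. It builds T_YX and T_ZY from cyclic (group-additive) sub-channels on Z_3 and confirms that the product T_ZY T_YX has exactly the block form of (14) with α₂ = α₁ + (1 − α₁)α̃₂.

```python
import numpy as np

n = 3                                   # non-zero alphabet {1,...,n}
alpha1, alpha2_sub = 0.2, 0.25          # hypothetical erasure-to-zero probabilities

# Group-additive sub-channels on Z_3: circulant noise matrices.
noise1 = np.array([0.7, 0.2, 0.1])      # Pr(N1 = g) over the group
noise2 = np.array([0.8, 0.1, 0.1])
T_sub_YX = np.array([[noise1[(j - i) % n] for i in range(n)] for j in range(n)])
T_sub_ZY = np.array([[noise2[(j - i) % n] for i in range(n)] for j in range(n)])

ones = np.ones((1, n))
def stack(alpha, T_sub):
    # Block structure of eqs. (12)-(13): a zero input stays zero; a non-zero
    # input erases to 0 w.p. alpha, else passes through the sub-channel.
    top = np.hstack([[[1.0]], alpha * ones])
    bot = np.hstack([np.zeros((n, 1)), (1 - alpha) * T_sub])
    return np.vstack([top, bot])

T_YX = stack(alpha1, T_sub_YX)
T_ZY = stack(alpha2_sub, T_sub_ZY)
T_ZX = T_ZY @ T_YX                      # eq. (14)

alpha2 = alpha1 + (1 - alpha1) * alpha2_sub
expected = stack(alpha2, T_sub_ZY @ T_sub_YX)
assert np.allclose(T_ZX, expected)      # block form of eq. (14) holds
print(alpha2)                           # 0.2 + 0.8*0.25 = 0.4
```

The key algebraic facts used are that the columns of T_ỸX̃ each sum to one (so the top-right block collapses to α₂1ᵀ) and that (1 − α₁)(1 − α̃₂) = 1 − α₂.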

some G̃ ∈ G_{T_ỸX̃, T_Z̃X̃}. These results are summarized in the following lemma.

Lemma 2: Let G_{T_ỸX̃, T_Z̃X̃} = {G̃₁, ..., G̃_l}. Then G_{T_YX, T_ZX} = {G₁, ..., G_l}, where

G_j = [1 0ᵀ; 0 G̃_j], for j = 1, ..., l. (17)

We now state that the uniform distribution on the non-zero symbols is optimal for the DM-DBC.

Lemma 3: Let p_X = (1 − q, q p̃ᵀ)ᵀ ∈ Δ_{n+1} be the distribution of the channel input X, where p̃ is the distribution of X̃. For any DM-DBC, C*_{p_X} ⊆ C*_{(1−q, quᵀ)ᵀ} and C̄ = ∪_{q∈[0,1]} C*_{(1−q, quᵀ)ᵀ}, where u ∈ Δ_n denotes the uniform distribution. (See [8] for proof.)

Theorem 4: The capacity region of the DM-DBC can be achieved by transmission strategies for which X̃ is uniformly distributed, i.e., the distribution of X is p_X = (1 − q, q uᵀ)ᵀ for some q ∈ [0, 1]. As a consequence, the capacity region is the convex hull of the closure of all (R₁, R₂) satisfying

R₁ ≤ s − q h_{n+1}(T_YX e₁), (18)
R₂ ≤ h((1 − α₂)q) + (1 − α₂)q ln(n) − F*_{T_YX, T_ZX}((1 − q, q uᵀ)ᵀ, s), (19)

for some 0 ≤ q ≤ 1 and q h_{n+1}(T_YX e₁) ≤ s ≤ h((1 − α₂)q) + (1 − α₂)q ln(n), where e₁ selects the column of any non-zero input symbol.

Proof: Let p_X = (1 − q, q p̃ᵀ)ᵀ be the distribution of the channel input, where p̃ = (p̃₁, ..., p̃_n)ᵀ. Since G_{T_ỸX̃} is transitive, the columns of T_ỸX̃ are permutations of each other, and hence

H(Y|X) = Σ_{i=0}^n Pr(X = i) H(Y|X = i) (20)
= (1 − q)H(Y|X = 0) + Σ_{i=1}^n q p̃_i h_{n+1}(T_YX e_i) (21)
= Σ_{i=1}^n q p̃_i h_{n+1}(T_YX e₁) (22)
= q h_{n+1}(T_YX e₁), (23)

which is independent of p̃. Let G_{T_YX, T_ZX} = {G₁, ..., G_l}. Then

H(Z) = h_{n+1}(T_ZX p_X) (24)
= (1/l) Σ_{i=1}^l h_{n+1}(T_ZX G_i p_X) (25)
≤ h_{n+1}(T_ZX (1/l) Σ_{i=1}^l G_i p_X) (26)
= h_{n+1}(T_ZX (1 − q, q uᵀ)ᵀ) (27)
= h((1 − α₂)q) + (1 − α₂)q ln(n), (28)

where (26) follows from Jensen's inequality. Since C*_{p_X} ⊆ C*_{(1−q, quᵀ)ᵀ} from Lemma 3,

F*(p_X, s) ≥ F*((1 − q, q uᵀ)ᵀ, s). (29)

Fig. 6. The natural encoding approach for DM-DBCs.
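The two facts established in (20)-(23) and (24)-(28), that H(Y|X) does not depend on p̃ and that H(Z) is maximized over p̃ by the uniform distribution, can be checked numerically. The sketch below is an illustrative aid, not from the paper; the sub-channel and its noise distributions are hypothetical, and the random input distributions are drawn for testing only.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))       # natural log, as in the paper

n = 3
alpha1, alpha2_sub = 0.2, 0.25          # hypothetical erasure probabilities
noise1 = np.array([0.7, 0.2, 0.1])      # hypothetical group-additive noise
noise2 = np.array([0.8, 0.1, 0.1])
circ = lambda w: np.array([[w[(j - i) % n] for i in range(n)] for j in range(n)])
stack = lambda a, T: np.block([[np.ones((1, 1)), a * np.ones((1, n))],
                               [np.zeros((n, 1)), (1 - a) * T]])
T_YX = stack(alpha1, circ(noise1))
T_ZX = stack(alpha2_sub, circ(noise2)) @ T_YX

q = 0.6
rng = np.random.default_rng(0)
H_cond, H_Z = [], []
for _ in range(5):
    p_sub = rng.dirichlet(np.ones(n))   # random distribution on {1,...,n}
    p_X = np.concatenate([[1 - q], q * p_sub])
    # H(Y|X) = sum_i Pr(X=i) H(column i); column 0 has zero entropy.
    H_cond.append(sum(p_X[i] * entropy(T_YX[:, i]) for i in range(n + 1)))
    H_Z.append(entropy(T_ZX @ p_X))

uniform = np.concatenate([[1 - q], q * np.ones(n) / n])
assert np.allclose(H_cond, H_cond[0])                 # eq. (23): no p_sub dependence
assert max(H_Z) <= entropy(T_ZX @ uniform) + 1e-9     # eqs. (24)-(28)
print(H_cond[0], entropy(T_ZX @ uniform))
```

The invariance of H(Y|X) comes from the circulant structure: every non-zero column of T_YX is a permutation of the same vector, so each contributes the same entropy.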
Plugging (23), (28) and (29) into Theorem 2, the capacity region for DM-DBCs is

co[∪_q {(R₁, R₂) : R₁ ≤ s − H(Y|X), R₂ ≤ H(Z) − F*_{T_YX, T_ZX}(q, s)}] (30)
⊆ co[∪_{q∈[0,1]} {(R₁, R₂) : R₁ ≤ s − q h_{n+1}(T_YX e₁), R₂ ≤ h((1 − α₂)q) + (1 − α₂)q ln(n) − F*_{T_YX, T_ZX}((1 − q, q uᵀ)ᵀ, s)}] (31)
= co[∪_{p_X=(1−q, quᵀ)ᵀ} {(R₁, R₂) : R₁ ≤ s − q h_{n+1}(T_YX e₁), R₂ ≤ h((1 − α₂)q) + (1 − α₂)q ln(n) − F*_{T_YX, T_ZX}((1 − q, q uᵀ)ᵀ, s)}] (32)
= co[∪_{p_X=(1−q, quᵀ)ᵀ} {(R₁, R₂) : R₁ ≤ s − H(Y|X), R₂ ≤ H(Z) − F*_{T_YX, T_ZX}(q, s)}] (33)
⊆ co[∪_q {(R₁, R₂) : R₁ ≤ s − H(Y|X), R₂ ≤ H(Z) − F*_{T_YX, T_ZX}(q, s)}], (34)

where co[·] denotes the convex hull of the closure. Note that (30) and (34) are identical; hence (30)-(34) are all equal. Therefore, (31) expresses the capacity region for the DM-DBC, which also means that the capacity region can be achieved by transmission strategies for which the broadcast signal X has distribution p_X = (1 − q, q uᵀ)ᵀ for some q ∈ [0, 1]. Q.E.D.

V. Natural Encoding and Capacity Region for DM-DBCs

The natural encoding approach for the discrete multiplicative DBC is shown in Fig. 6. W₁ is the message for User 1, who sees the better channel Y, and W₂ is the message for User 2, who sees the worse channel Z. The natural encoding approach first independently encodes these two messages into two codewords U and V respectively, and then broadcasts X, which is obtained by applying the single-letter multiplication function X = U ⊗ V to the symbols of the codewords U and V. The distribution of V is constrained to be p_V = (1 − q_V, q_V uᵀ)ᵀ for some q_V ∈ [0, 1], and hence the distribution of the broadcast signal X also has p_X = (1 − q, q uᵀ)ᵀ for some q ∈ [0, 1], which the previous section proved to be the optimal input distribution for the DM-DBC.
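The single-letter map X = U ⊗ V can be simulated directly. The sketch below is illustrative only; GF(5) and the two symbol distributions are hypothetical choices. It checks the claim just made: when the non-zero part of one factor is uniform over the multiplicative group, the product X has distribution (1 − q, q uᵀ)ᵀ with q = Pr(U ≠ 0) Pr(V ≠ 0), regardless of how the other factor distributes its non-zero mass.

```python
import numpy as np

# Natural encoding over GF(5): X = U * V (mod 5).  The non-zero elements
# {1,2,3,4} form a multiplicative group, so multiplying by a uniform
# non-zero symbol leaves the non-zero part of the product uniform.
rng = np.random.default_rng(1)
m = 5                                   # field size (hypothetical example)
trials = 200_000

# User 2's symbol U: zero w.p. 0.3, otherwise an arbitrary non-uniform choice.
U = rng.choice([0, 1, 2, 3, 4], size=trials, p=[0.3, 0.4, 0.1, 0.1, 0.1])
# User 1's symbol V: zero w.p. 0.2, otherwise uniform over {1,...,4}.
V = rng.choice([0, 1, 2, 3, 4], size=trials, p=[0.2, 0.2, 0.2, 0.2, 0.2])

X = (U * V) % m                         # single-letter natural encoder
freq = np.bincount(X, minlength=m) / trials

q = 0.7 * 0.8                           # Pr(X != 0) = Pr(U != 0) Pr(V != 0)
assert abs(1 - freq[0] - q) < 0.01      # empirical match to (1 - q, q u)
assert np.allclose(freq[1:], q / 4, atol=0.01)   # non-zero part is uniform
print(freq)
```

This is exactly why constraining only V's distribution suffices: the group structure of the non-zero symbols randomizes the product independently of U.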

User 2 receives Z and decodes its desired message directly. User 1 receives Y and successively decodes the message for User 2 and then the message for User 1. Let p_X = (1 − q, q p̃ᵀ)ᵀ be the distribution of the channel input X, where p̃ is the distribution of the sub-channel input X̃. For the DM-DBC X → Y → Z,

φ(p_X, λ) = h_{n+1}(T_ZX p_X) − λ h_{n+1}(T_YX p_X) (35)
= h_{n+1}([1 − q + qα₂ ; q(1 − α₂) T_Z̃X̃ p̃]) − λ h_{n+1}([1 − q + qα₁ ; q(1 − α₁) T_ỸX̃ p̃]) (36)
= h(q(1 − α₂)) + q(1 − α₂) h_n(T_Z̃X̃ p̃) − λ (h(q(1 − α₁)) + q(1 − α₁) h_n(T_ỸX̃ p̃)) (37)
= h(qβ₂) − λ h(qβ₁) + qβ₂ (h_n(T_Z̃X̃ p̃) − (λβ₁/β₂) h_n(T_ỸX̃ p̃)), (38)

where β₁ = 1 − α₁ and β₂ = 1 − α₂. For the sub-channel X̃ → Ỹ → Z̃, define φ̃(p̃, λβ₁/β₂) = h_n(T_Z̃X̃ p̃) − (λβ₁/β₂) h_n(T_ỸX̃ p̃). Define ϕ(q, p̃, λ) as follows:

ϕ(q, p̃, λ) = h(qβ₂) − λ h(qβ₁) + qβ₂ ψ̃(p̃, λβ₁/β₂), (39)

where ψ̃ is the lower convex envelope of φ̃(p̃, λβ₁/β₂) in p̃. With this definition, note that ψ(p_X, λ), the lower convex envelope of φ(p_X, λ), is also the lower envelope of ϕ(q, p̃, λ).

Lemma 4: ψ((1 − q, q uᵀ)ᵀ, λ), the value of the lower envelope of φ(p_X, λ) in p_X at p_X = (1 − q, q uᵀ)ᵀ, lies on the lower envelope of ϕ(q, u, λ) in q. (See [8] for proof.)

Lemma 4 indicates that the lower envelope of φ(·, λ) at p_X = (1 − q, q uᵀ)ᵀ can be obtained in two steps. First, for any fixed q, the lower envelope of φ(p_X, λ) in p̃ is ϕ(q, p̃, λ). Second, for p̃ = u, the lower envelope of ϕ(q, u, λ) in q coincides with ψ(p_X, λ), the lower envelope of φ(p_X, λ) in p_X.

Theorem 5: The natural encoding approach with time sharing achieves the boundary of the capacity region for the discrete multiplicative DBC.

Proof: Theorem 4 shows that the boundary of the capacity region for the DM-DBC can be achieved by transmission strategies with X̃ uniformly distributed, i.e., with input distribution p_X = (1 − q, q uᵀ)ᵀ. For p_X = (1 − q, q uᵀ)ᵀ, ψ((1 − q, q uᵀ)ᵀ, λ) can be attained by a convex combination of points on the graph of ϕ(q, u, λ). Note that ϕ(q, u, λ) = h(qβ₂) − λ h(qβ₁) + qβ₂ ψ̃(u, λβ₁/β₂), which is the sum of φ(q, λ) for a broadcast Z channel and q times the constant β₂ ψ̃(u, λβ₁/β₂). Hence, by a discussion analogous to Section II-D, ψ((1 − q, q uᵀ)ᵀ, λ) can be attained by the convex combination of two points on the graph of ϕ(q, u, λ).
One point is at q = 0, with ϕ(0, u, λ) = 0. The other point is at q = p∗, where p∗ is determined by (8). Note that the point (0, 0) on the graph of ϕ(q, u, λ) is also on the graph of φ(p_X, λ). By Theorem 3, the point (p∗, ϕ(p∗, u, λ)) is the convex combination of n points on the graph of φ̃(·, λβ₁/β₂), which corresponds to the group-addition encoding approach for the sub-channel X̃ → Ỹ → Z̃, because the group-addition encoding approach is optimal for the group-additive DBC X̃ → Ỹ → Z̃. Therefore, by Theorem 3, an optimal transmission strategy for the DM-DBC X → Y → Z has the structure shown in Fig. 7. If the auxiliary random variable U = 0, then the channel input X = 0. If U is a non-zero symbol, then X = 0 with probability 1 − p∗. In the case where U and X are both non-zero, X can be obtained as X̃ = Ũ ⊕ Ṽ, where ⊕ is the group addition of the group-additive degraded broadcast sub-channel X̃ → Ỹ → Z̃, Ũ is uniformly distributed and Ṽ is an n-ary random variable. By the definition of group addition and discrete multiplication, the transmission strategy with the structure in Fig. 7 is the natural encoding approach. Q.E.D.

Fig. 7. The optimal transmission strategy for the discrete multiplicative degraded broadcast channel.

VI. Conclusion

This paper shows that the natural encoding of Fig. 6 is optimal for two-user DM-DBCs. Its achievable rate region is also a single-letter characterization of the capacity region for DM-DBCs. Hence, the capacity region for the DM-DBC of Fig. 4 is

co[∪_{p_U, p_V} {(R₁, R₂) : R₁ ≤ H(U ⊗ V ⊗ N₁ | U) − H(U ⊗ V ⊗ N₁ | U ⊗ V), R₂ ≤ H(U ⊗ V ⊗ N₁ ⊗ N₂) − H(U ⊗ V ⊗ N₁ ⊗ N₂ | U)}]. (40)

References

[1] T. M. Cover. Broadcast channels. IEEE Trans. Inform. Theory, IT-18:2-14, January 1972.
[2] P. P. Bergmans. Random coding theorem for broadcast channels with degraded components. IEEE Trans. Inform. Theory, IT-19:197-207, March 1973.
[3] R. G. Gallager. Capacity and coding for degraded broadcast channels. Probl. Pered. Inform., 10(3):3-14, July-Sept. 1974.
[4] P. P. Bergmans. A simple converse for broadcast channels with additive white Gaussian noise. IEEE Trans. Inform. Theory, IT-20:279-280, March 1974.
[5] H. Witsenhausen and A. Wyner. A conditional entropy bound for a pair of discrete random variables. IEEE Trans. Inform. Theory, IT-21(5):493-501, Sep. 1975.
[6] R. Benzel. The capacity region of a class of discrete additive degraded interference channels. IEEE Trans. Inform. Theory, IT-25:228-231, Mar. 1979.
[7] B. Xie, M. Griot, A. I. Vila Casado and R. D. Wesel. Optimal transmission strategy and capacity region for broadcast Z channels. In IEEE Information Theory Workshop 2007, Lake Tahoe, USA, Sep. 2007.
[8] B. Xie and R. D. Wesel. Optimal Independent-Encoding Schemes for Several Classes of Discrete Degraded Broadcast Channels. arXiv:0811.4162, Jan. 2009.
[9] B. Xie and R. D. Wesel. A mutual information invariance approach to symmetry in discrete memoryless channels. In Information Theory and Applications Workshop 2008, UCSD, San Diego, USA, Jan. 27-Feb. 1, 2008.


More information

An Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel

An Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 4, APRIL 2012 2427 An Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel Ersen Ekrem, Student Member, IEEE,

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract

More information

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016

Midterm Exam Information Theory Fall Midterm Exam. Time: 09:10 12:10 11/23, 2016 Midterm Exam Time: 09:10 12:10 11/23, 2016 Name: Student ID: Policy: (Read before You Start to Work) The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheet (single-sheet, two-sided).

More information

Appendix B Information theory from first principles

Appendix B Information theory from first principles Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes

More information

The Capacity Region for Multi-source Multi-sink Network Coding

The Capacity Region for Multi-source Multi-sink Network Coding The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.

More information

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem Information Theory for Wireless Communications. Lecture 0 Discrete Memoryless Multiple Access Channel (DM-MAC: The Converse Theorem Instructor: Dr. Saif Khan Mohammed Scribe: Antonios Pitarokoilis I. THE

More information

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers Albrecht Wolf, Diana Cristina González, Meik Dörpinghaus, José Cândido Silveira Santos Filho, and Gerhard Fettweis Vodafone

More information

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

Energy Efficient Estimation of Gaussian Sources Over Inhomogeneous Gaussian MAC Channels

Energy Efficient Estimation of Gaussian Sources Over Inhomogeneous Gaussian MAC Channels Energy Efficient Estimation of Gaussian Sources Over Inhomogeneous Gaussian MAC Channels Shuangqing Wei, Ragopal Kannan, Sitharama Iyengar and Nageswara S. Rao Abstract In this paper, we first provide

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

Random Access: An Information-Theoretic Perspective

Random Access: An Information-Theoretic Perspective Random Access: An Information-Theoretic Perspective Paolo Minero, Massimo Franceschetti, and David N. C. Tse Abstract This paper considers a random access system where each sender can be in two modes of

More information

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,

More information

A Simple Memoryless Proof of the Capacity of the Exponential Server Timing Channel

A Simple Memoryless Proof of the Capacity of the Exponential Server Timing Channel A Simple Memoryless Proof of the Capacity of the Exponential Server iming Channel odd P. Coleman ECE Department Coordinated Science Laboratory University of Illinois colemant@illinois.edu Abstract his

More information

On the Gaussian Z-Interference channel

On the Gaussian Z-Interference channel On the Gaussian Z-Interference channel Max Costa, Chandra Nair, and David Ng University of Campinas Unicamp, The Chinese University of Hong Kong CUHK Abstract The optimality of the Han and Kobayashi achievable

More information

Simultaneous Nonunique Decoding Is Rate-Optimal

Simultaneous Nonunique Decoding Is Rate-Optimal Fiftieth Annual Allerton Conference Allerton House, UIUC, Illinois, USA October 1-5, 2012 Simultaneous Nonunique Decoding Is Rate-Optimal Bernd Bandemer University of California, San Diego La Jolla, CA

More information

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel

Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Primary Rate-Splitting Achieves Capacity for the Gaussian Cognitive Interference Channel Stefano Rini, Ernest Kurniawan and Andrea Goldsmith Technische Universität München, Munich, Germany, Stanford University,

More information

Cognitive Multiple Access Networks

Cognitive Multiple Access Networks Cognitive Multiple Access Networks Natasha Devroye Email: ndevroye@deas.harvard.edu Patrick Mitran Email: mitran@deas.harvard.edu Vahid Tarokh Email: vahid@deas.harvard.edu Abstract A cognitive radio can

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets

An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets Jing Guo University of Cambridge jg582@cam.ac.uk Jossy Sayir University of Cambridge j.sayir@ieee.org Minghai Qin

More information

The Adaptive Classical Capacity of a Quantum Channel,

The Adaptive Classical Capacity of a Quantum Channel, The Adaptive Classical Capacity of a Quantum Channel, or Information Capacities of Three Symmetric Pure States in Three Dimensions Peter W. Shor 1 AT&T Labs Research Florham Park, NJ 07932 02139 1 Current

More information

ELEMENTS O F INFORMATION THEORY

ELEMENTS O F INFORMATION THEORY ELEMENTS O F INFORMATION THEORY THOMAS M. COVER JOY A. THOMAS Preface to the Second Edition Preface to the First Edition Acknowledgments for the Second Edition Acknowledgments for the First Edition x

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

1 Introduction to information theory

1 Introduction to information theory 1 Introduction to information theory 1.1 Introduction In this chapter we present some of the basic concepts of information theory. The situations we have in mind involve the exchange of information through

More information

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7 Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

On Gaussian MIMO Broadcast Channels with Common and Private Messages

On Gaussian MIMO Broadcast Channels with Common and Private Messages On Gaussian MIMO Broadcast Channels with Common and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ersen@umd.edu

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

WITH advances in communications media and technologies

WITH advances in communications media and technologies IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 6, SEPTEMBER 1999 1887 Distortion-Rate Bounds for Fixed- Variable-Rate Multiresolution Source Codes Michelle Effros, Member, IEEE Abstract The source

More information

On Scalable Source Coding for Multiple Decoders with Side Information

On Scalable Source Coding for Multiple Decoders with Side Information On Scalable Source Coding for Multiple Decoders with Side Information Chao Tian School of Computer and Communication Sciences Laboratory for Information and Communication Systems (LICOS), EPFL, Lausanne,

More information

Multiuser Successive Refinement and Multiple Description Coding

Multiuser Successive Refinement and Multiple Description Coding Multiuser Successive Refinement and Multiple Description Coding Chao Tian Laboratory for Information and Communication Systems (LICOS) School of Computer and Communication Sciences EPFL Lausanne Switzerland

More information

5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006

5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 Source Coding With Limited-Look-Ahead Side Information at the Decoder Tsachy Weissman, Member, IEEE, Abbas El Gamal, Fellow,

More information

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016 ECE598: Information-theoretic methods in high-dimensional statistics Spring 06 Lecture : Mutual Information Method Lecturer: Yihong Wu Scribe: Jaeho Lee, Mar, 06 Ed. Mar 9 Quick review: Assouad s lemma

More information

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel

An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and

More information

Capacity Bounds for. the Gaussian Interference Channel

Capacity Bounds for. the Gaussian Interference Channel Capacity Bounds for the Gaussian Interference Channel Abolfazl S. Motahari, Student Member, IEEE, and Amir K. Khandani, Member, IEEE Coding & Signal Transmission Laboratory www.cst.uwaterloo.ca {abolfazl,khandani}@cst.uwaterloo.ca

More information

Distributed Functional Compression through Graph Coloring

Distributed Functional Compression through Graph Coloring Distributed Functional Compression through Graph Coloring Vishal Doshi, Devavrat Shah, Muriel Médard, and Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology

More information

An Achievable Error Exponent for the Mismatched Multiple-Access Channel

An Achievable Error Exponent for the Mismatched Multiple-Access Channel An Achievable Error Exponent for the Mismatched Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@camacuk Albert Guillén i Fàbregas ICREA & Universitat Pompeu Fabra University of

More information

Noisy-Channel Coding

Noisy-Channel Coding Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/05264298 Part II Noisy-Channel Coding Copyright Cambridge University Press 2003.

More information

PART III. Outline. Codes and Cryptography. Sources. Optimal Codes (I) Jorge L. Villar. MAMME, Fall 2015

PART III. Outline. Codes and Cryptography. Sources. Optimal Codes (I) Jorge L. Villar. MAMME, Fall 2015 Outline Codes and Cryptography 1 Information Sources and Optimal Codes 2 Building Optimal Codes: Huffman Codes MAMME, Fall 2015 3 Shannon Entropy and Mutual Information PART III Sources Information source:

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case 1 arxiv:0901.3580v1 [cs.it] 23 Jan 2009 Feedback Capacity of the Gaussian Interference Channel to Within 1.7075 Bits: the Symmetric Case Changho Suh and David Tse Wireless Foundations in the Department

More information

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege

More information

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1159 The Duality Between Information Embedding and Source Coding With Side Information and Some Applications Richard J. Barron, Member,

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Approximately achieving the feedback interference channel capacity with point-to-point codes

Approximately achieving the feedback interference channel capacity with point-to-point codes Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used

More information

Joint Write-Once-Memory and Error-Control Codes

Joint Write-Once-Memory and Error-Control Codes 1 Joint Write-Once-Memory and Error-Control Codes Xudong Ma Pattern Technology Lab LLC, U.S.A. Email: xma@ieee.org arxiv:1411.4617v1 [cs.it] 17 ov 2014 Abstract Write-Once-Memory (WOM) is a model for many

More information

EE376A: Homeworks #4 Solutions Due on Thursday, February 22, 2018 Please submit on Gradescope. Start every question on a new page.

EE376A: Homeworks #4 Solutions Due on Thursday, February 22, 2018 Please submit on Gradescope. Start every question on a new page. EE376A: Homeworks #4 Solutions Due on Thursday, February 22, 28 Please submit on Gradescope. Start every question on a new page.. Maximum Differential Entropy (a) Show that among all distributions supported

More information

On the Secrecy Capacity of the Z-Interference Channel

On the Secrecy Capacity of the Z-Interference Channel On the Secrecy Capacity of the Z-Interference Channel Ronit Bustin Tel Aviv University Email: ronitbustin@post.tau.ac.il Mojtaba Vaezi Princeton University Email: mvaezi@princeton.edu Rafael F. Schaefer

More information

Channel Polarization and Blackwell Measures

Channel Polarization and Blackwell Measures Channel Polarization Blackwell Measures Maxim Raginsky Abstract The Blackwell measure of a binary-input channel (BIC is the distribution of the posterior probability of 0 under the uniform input distribution

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti DIMACS Workshop on Network Information Theory - March 2003 Daniela Tuninetti 1 On Two-user Fading Gaussian Broadcast Channels with Perfect Channel State Information at the Receivers Daniela Tuninetti Mobile

More information

On Multiple User Channels with State Information at the Transmitters

On Multiple User Channels with State Information at the Transmitters On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

The Capacity of Finite Abelian Group Codes Over Symmetric Memoryless Channels Giacomo Como and Fabio Fagnani

The Capacity of Finite Abelian Group Codes Over Symmetric Memoryless Channels Giacomo Como and Fabio Fagnani IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 5, MAY 2009 2037 The Capacity of Finite Abelian Group Codes Over Symmetric Memoryless Channels Giacomo Como and Fabio Fagnani Abstract The capacity

More information

Chain Independence and Common Information

Chain Independence and Common Information 1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than

More information

Multicoding Schemes for Interference Channels

Multicoding Schemes for Interference Channels Multicoding Schemes for Interference Channels 1 Ritesh Kolte, Ayfer Özgür, Haim Permuter Abstract arxiv:1502.04273v1 [cs.it] 15 Feb 2015 The best known inner bound for the 2-user discrete memoryless interference

More information

Bounds on Mutual Information for Simple Codes Using Information Combining

Bounds on Mutual Information for Simple Codes Using Information Combining ACCEPTED FOR PUBLICATION IN ANNALS OF TELECOMM., SPECIAL ISSUE 3RD INT. SYMP. TURBO CODES, 003. FINAL VERSION, AUGUST 004. Bounds on Mutual Information for Simple Codes Using Information Combining Ingmar

More information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student

More information

820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY Stefano Rini, Daniela Tuninetti, and Natasha Devroye

820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY Stefano Rini, Daniela Tuninetti, and Natasha Devroye 820 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 2, FEBRUARY 2012 Inner and Outer Bounds for the Gaussian Cognitive Interference Channel and New Capacity Results Stefano Rini, Daniela Tuninetti,

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information