Lecture 10: Broadcast Channel and Superposition Coding

Scribed by: Zhe Yao

1 Broadcast channel

[Figure: the sender encodes messages $(M_0, M_1, M_2)$ into $X$, which passes through the channel $p(y_1, y_2 \mid x)$; receiver 1 observes $Y_1$ and decodes $(\hat M_0, \hat M_1)$, receiver 2 observes $Y_2$ and decodes $(\hat M_0, \hat M_2)$.]

The capacity of the broadcast channel depends only on the marginal conditional probabilities $p(y_1 \mid x)$ and $p(y_2 \mid x)$.

1.1 Common message only

Here only $M_0$ exists, $M_1 = M_2 = \emptyset$, and the transmitted message vector is $(M_0, \emptyset, \emptyset)$. The common rate must satisfy
$$R_0 \le \min\big(I(X; Y_1),\, I(X; Y_2)\big),$$
so the common-message capacity is
$$C_0 = \max_{p(x)} R_0 = \max_{p(x)} \min\big(I(X; Y_1),\, I(X; Y_2)\big).$$
Since the maximum of a minimum is at most the minimum of the maxima,
$$C_0 \le \min\Big\{\max_{p(x)} I(X; Y_1),\ \max_{p(x)} I(X; Y_2)\Big\} = \min\{C_1, C_2\},$$
which means that the common-message-only capacity is at most the capacity of the worse individual channel.

1.2 Private messages only

Here $M_0 = \emptyset$ and the transmitted message vector is $(\emptyset, M_1, M_2)$.

[Figure: the capacity region in the $(R_1, R_2)$ plane, with corner points $C_1$ on the $R_1$-axis and $C_2$ on the $R_2$-axis; an upper bound and a lower bound on the region are sketched.]

How do we find the capacity region?
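As a numeric illustration of the bound $C_0 \le \min\{C_1, C_2\}$, here is a minimal sketch assuming, for concreteness, that each branch of the BC is a BSC (a hypothetical choice; the crossover probabilities below are illustrative, not from the notes). For BSCs the uniform input maximizes both mutual informations simultaneously, so the bound holds with equality.

```python
from math import log2

def h(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a BSC with crossover probability p: 1 - H(p)."""
    return 1.0 - h(p)

# Illustrative crossover probabilities for the two branches.
p1, p2 = 0.1, 0.2
C1, C2 = bsc_capacity(p1), bsc_capacity(p2)

# The same uniform p(x) is optimal for both branches, so
# C0 = min(C1, C2) here (in general only C0 <= min(C1, C2)).
C0 = min(C1, C2)
print(C1, C2, C0)
```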
In general, if a rate pair $(R_1, R_2)$ is achievable for private messages, then the rate triple $(R_0, R_1 - R_0, R_2 - R_0)$ is also achievable for the BC with common and private messages: the common message can be carried by giving up rate $R_0$ from each private message. The capacity of the BC is not known in general, but it is known for a special class called degraded broadcast channels.

2 Degraded broadcast channels

2.1 Definitions

Definition. A broadcast channel is physically degraded if $X \to Y_1 \to Y_2$ forms a Markov chain, that is, $Y_2$ is a degraded version of $Y_1$:
$$p(y_1, y_2 \mid x) = p(y_1 \mid x)\, p(y_2 \mid y_1).$$

Definition. A broadcast channel is a stochastically degraded broadcast channel if there exists another probability transition $p'(y_2 \mid \tilde y_1)$ such that
$$p(y_2 \mid x) = \sum_{\tilde y_1} p(\tilde y_1 \mid x)\, p'(y_2 \mid \tilde y_1).$$

A stochastically degraded BC means that although $X \to Y_1 \to Y_2$ does not form a Markov chain, we can find an equivalent physically degraded BC with the same capacity such that $X \to \tilde Y_1 \to Y_2$ forms a Markov chain, where $\tilde Y_1$ has the same conditional marginal as $Y_1$. Or, equivalently, $X \to Y_1 \to \tilde Y_2$ for some $\tilde Y_2$ with conditional marginal pmf $p(\tilde y_2 \mid x)$ the same as $p(y_2 \mid x)$.

Example. The Gaussian BC is a stochastically degraded broadcast channel.

[Figure: $Y_1 = X + Z_1$ with $Z_1 \sim N(0, 1)$ and $Y_2 = X + Z_2$ with $Z_2 \sim N(0, 2)$; in the equivalent channel, $Y_3 = Y_1 + Z_3$ with $Z_3 \sim N(0, 1)$ independent of $Z_1$.]

In the figure, we see that a new physically degraded channel, equivalent to the stochastically degraded one, can be constructed by introducing a new independent noise $Z_3$ and a new received signal $Y_3 = Y_1 + Z_3$. These two channels have the same marginal distributions and hence the same capacity:
$$p(y_3 \mid x) = \int p(y_1 \mid x)\, p(y_3 \mid y_1)\, dy_1 = p(y_2 \mid x).$$
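The Gaussian degradation argument can be checked numerically: with $Z_1, Z_3 \sim N(0,1)$ independent, $Z_1 + Z_3$ has the same $N(0,2)$ distribution as $Z_2$, so $Y_3 = X + Z_1 + Z_3$ has the same conditional law as $Y_2$ for every input. A minimal Monte Carlo sketch (sample size and seed are arbitrary choices):

```python
import random

random.seed(0)
n = 200_000
# Noise samples: Z1 ~ N(0,1), Z3 ~ N(0,1) independent, Z2 ~ N(0,2).
z1 = [random.gauss(0, 1) for _ in range(n)]
z3 = [random.gauss(0, 1) for _ in range(n)]
z2 = [random.gauss(0, 2 ** 0.5) for _ in range(n)]

def var(xs):
    """Sample variance about the sample mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Both effective noises have variance 2, so the two channels share the
# same conditional marginal p(y|x) and hence the same capacity.
print(var([a + b for a, b in zip(z1, z3)]), var(z2))
```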
2.2 Binary symmetric broadcast channel

[Figure: $Y_1 = X \oplus Z_1$ with $Z_1 \sim \mathrm{Bern}(p_1)$ and $Y_2 = X \oplus Z_2$ with $Z_2 \sim \mathrm{Bern}(p_2)$; in the equivalent channel, $Y_3 = Y_1 \oplus Z_3$ with $Z_3 \sim \mathrm{Bern}(\alpha)$ independent of $Z_1$.]

We assume that $\tfrac{1}{2} > p_2 > p_1$ and $Z_2 = Z_1 \oplus Z_3$, in which
$$\alpha = \frac{p_2 - p_1}{1 - 2p_1}.$$
In this case $Y_3$ and $Y_2$ are statistically the same.

2.3 A random coding scheme

1. For each $M_2$, generate $U^n(M_2) \sim \mathrm{Bern}(\tfrac{1}{2})$ with elements i.i.d.

2. For each $M_1$, generate $V^n(M_1)$ with i.i.d. elements according to $\mathrm{Bern}(\beta)$, in which $0 \le \beta \le 1$.

3. To send $(M_1, M_2)$, send $X^n(M_1, M_2) = U^n(M_2) \oplus V^n(M_1)$.

In this particular example, we have
$$Y_1 = X \oplus Z_1 = U \oplus V \oplus Z_1,$$
$$Y_2 = X \oplus Z_2 = U \oplus V \oplus Z_1 \oplus Z_3.$$
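The claim that $Y_3$ and $Y_2$ are statistically the same reduces to a binary-convolution identity: with $\alpha = (p_2 - p_1)/(1 - 2p_1)$, the cascade crossover $p_1(1-\alpha) + (1-p_1)\alpha$ equals $p_2$. A quick check (the values of $p_1, p_2$ are illustrative; `conv` is a hypothetical helper name):

```python
def conv(p: float, q: float) -> float:
    """Crossover probability of two cascaded BSCs: P(B1 xor B2 = 1)
    for independent B1 ~ Bern(p), B2 ~ Bern(q)."""
    return p * (1 - q) + (1 - p) * q

p1, p2 = 0.1, 0.3                    # example values with 1/2 > p2 > p1
alpha = (p2 - p1) / (1 - 2 * p1)     # degradation parameter
print(conv(p1, alpha))               # mathematically equals p2
```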
2.4 Decoding scheme

1. At Rx 2 (the worse receiver), choose the unique $\hat M_2$ s.t. $(U^n(\hat M_2), Y_2^n) \in A_\epsilon^{(n)}$.

2. At Rx 1 (the better receiver), choose the unique $(\hat M_1, \hat M_2)$ s.t. $(U^n(\hat M_2), X^n(\hat M_1, \hat M_2), Y_1^n) \in A_\epsilon^{(n)}$.

We can derive that
$$R_1 \le H(\beta * p_1) - H(p_1),$$
$$R_2 \le 1 - H(\beta * p_2),$$
where $\beta * p = \beta(1 - p) + (1 - \beta)p$.

3 Superposition coding

3.1 Theorem

Theorem 1. For a degraded broadcast channel, the capacity region is the convex hull of the rate pairs $(R_1, R_2)$ such that
$$R_1 \le I(X; Y_1 \mid U),$$
$$R_2 \le I(U; Y_2),$$
for some joint distribution $p(u)\, p(x \mid u)\, p(y_1, y_2 \mid x)$.

$U$: auxiliary RV, not part of the channel itself, but introduced to help with coding. It is the center of a code cluster.

$X$: individual codeword in each cluster.

Sketch of the Proof

Achievability. Code generation:

1. For each $M_2$, generate $U^n(M_2)$ with i.i.d. elements with distribution $p(u)$. The number of sequences $U^n(M_2)$ is $2^{nR_2}$:
$$U^n(1),\ U^n(2),\ \ldots,\ U^n(2^{nR_2}).$$

2. For each $U^n(M_2)$, generate $2^{nR_1}$ codewords $X^n(M_2, M_1)$; that is, for each $j = 1, \ldots, 2^{nR_2}$,
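The boundary of the achievable region for the BSC BC can be traced by sweeping the cluster parameter $\beta$ from 0 (all rate to the worse receiver) to $\tfrac12$ (all rate to the better receiver). A sketch, with illustrative $p_1, p_2$ and a hypothetical helper `rate_pair`:

```python
from math import log2

def h(p: float) -> float:
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def star(b: float, p: float) -> float:
    """Binary convolution b * p = b(1-p) + (1-b)p."""
    return b * (1 - p) + (1 - b) * p

def rate_pair(beta: float, p1: float, p2: float):
    """Achievable (R1, R2) for the BSC BC at cluster parameter beta."""
    return h(star(beta, p1)) - h(p1), 1.0 - h(star(beta, p2))

p1, p2 = 0.1, 0.2
# beta = 0 gives (0, C2); beta = 1/2 gives (C1, 0).
for beta in (0.0, 0.1, 0.25, 0.5):
    print(beta, rate_pair(beta, p1, p2))
```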
the codewords generated from $U^n(j)$ are
$$X^n(j, 1),\ X^n(j, 2),\ \ldots,\ X^n(j, 2^{nR_1}).$$

Encoding: To send $(M_1, M_2)$, send the codeword $X^n(M_2, M_1)$.

Decoding: Decoder 2 (the worse receiver) chooses the unique $\hat M_2$ s.t. $(U^n(\hat M_2), Y_2^n) \in A_\epsilon^{(n)}$. Decoder 1 (the better receiver) chooses the unique $(\hat M_1, \hat M_2)$ s.t. $(U^n(\hat M_2), X^n(\hat M_2, \hat M_1), Y_1^n) \in A_\epsilon^{(n)}$. Decoder 1 can also perform successive cancellation: decode $\hat M_2$ first, then decode $\hat M_1$ based on joint typicality.

Converse: apply Fano's inequality (so that $P_e \to 0$ forces the rate bounds), and pick the right auxiliary RV $U$.
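The random codebook and two-stage decoding above can be sketched with a toy simulation of the BSC BC: minimum-Hamming-distance decoding stands in for joint typicality, and all parameters (block length, message-set sizes, crossover probabilities) are illustrative choices, not from the notes. With such a short block, correct decoding is likely but not guaranteed.

```python
import random

random.seed(7)
n, beta, p1, p2 = 64, 0.1, 0.02, 0.15   # toy parameters
num_m1, num_m2 = 4, 4                   # tiny message sets

def bern(p, n):
    return [int(random.random() < p) for _ in range(n)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Codebooks: cloud centers U^n(M2) and satellite offsets V^n(M1).
U = [bern(0.5, n) for _ in range(num_m2)]
V = [bern(beta, n) for _ in range(num_m1)]

m1, m2 = 2, 1
x = xor(U[m2], V[m1])                   # superposition: X = U xor V
y1 = xor(x, bern(p1, n))                # better receiver's observation
y2 = xor(x, bern(p2, n))                # worse receiver's observation

# Decoder 2: nearest cloud center (ML decoding for a BSC).
m2_hat_rx2 = min(range(num_m2), key=lambda j: hamming(U[j], y2))

# Decoder 1: successive cancellation -- cloud first, then the satellite.
m2_hat = min(range(num_m2), key=lambda j: hamming(U[j], y1))
m1_hat = min(range(num_m1), key=lambda i: hamming(xor(U[m2_hat], V[i]), y1))
print((m2_hat_rx2, m2_hat, m1_hat))     # compare with (m2, m2, m1)
```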