Lecture 10: Broadcast Channel and Superposition Coding

Scribed by: Zhe Yao

1 Broadcast channel

[Figure: the encoder maps the messages (M_0, M_1, M_2) to X^n, which enters the channel p(y_1, y_2 | x); receiver 1 observes Y_1^n and decodes (M̂_0, M̂_1); receiver 2 observes Y_2^n and decodes (M̂_0, M̂_2).]

The capacity of the broadcast channel depends only on the marginal conditional probabilities p(y_1 | x) and p(y_2 | x).

1.1 Common message only

Here only M_0 exists, M_1 = M_2 = ∅, and the transmitted message vector is (M_0, ∅, ∅). Since both receivers must decode M_0,

    R_0 ≤ min(I(X; Y_1), I(X; Y_2)),

and the common-message capacity is

    C_0 = max_{p(x)} min(I(X; Y_1), I(X; Y_2))
        ≤ min{ max_{p(x)} I(X; Y_1), max_{p(x)} I(X; Y_2) } = min{C_1, C_2},

which means the common-message-only capacity is at most the smaller of the two individual capacities (max-min ≤ min-max: a single p(x) need not maximize both mutual informations at once).

1.2 Private messages only

Now M_0 = ∅ and the transmitted message vector is (∅, M_1, M_2).

[Figure: the (R_1, R_2) plane with C_1 on the R_1 axis and C_2 on the R_2 axis; an upper bound and a lower bound on the capacity region are sketched. How do we find the capacity?]
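To make the common-message bound concrete, here is a minimal numerical sketch that grid-searches over binary input pmfs to evaluate C_0 = max_{p(x)} min(I(X; Y_1), I(X; Y_2)). The two component channels are taken to be BSCs with hypothetical crossover probabilities 0.1 and 0.2; none of this is from the lecture itself.

    import numpy as np

    def mutual_info(px, W):
        # I(X;Y) in bits for input pmf px and channel matrix W, W[x][y] = p(y|x)
        py = px @ W
        return sum(px[x] * W[x, y] * np.log2(W[x, y] / py[y])
                   for x in range(W.shape[0])
                   for y in range(W.shape[1])
                   if px[x] > 0 and W[x, y] > 0)

    def bsc(p):
        # transition matrix of a binary symmetric channel with crossover p
        return np.array([[1 - p, p], [p, 1 - p]])

    W1, W2 = bsc(0.1), bsc(0.2)   # hypothetical crossover probabilities p1 < p2

    # Grid search over binary input pmfs p(x) = (q, 1 - q).
    C0 = max(min(mutual_info(np.array([q, 1 - q]), W1),
                 mutual_info(np.array([q, 1 - q]), W2))
             for q in np.linspace(0, 1, 1001))
    print(C0)   # ≈ 1 - H(0.2) ≈ 0.278: the worse BSC is the bottleneck

For BSCs the uniform input maximizes both mutual informations simultaneously, so here the max-min bound is met with equality: C_0 = min{C_1, C_2}.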

In general, if a rate pair (R_1, R_2) is achievable for private messages, then the rate triple (R_0, R_1 − R_0, R_2 − R_0) is also achievable for the BC with common and private messages. The capacity region of the BC is not known in general, but it is known for a special class, the degraded broadcast channels.

2 Degraded broadcast channels

2.1 Definitions

Definition. A broadcast channel is physically degraded if X → Y_1 → Y_2 forms a Markov chain, that is, Y_2 is a degraded version of Y_1:

    p(y_1, y_2 | x) = p(y_1 | x) p(y_2 | y_1).

Definition. A broadcast channel is stochastically degraded if there exists another probability transition p'(y_2 | ỹ_1) such that

    p(y_2 | x) = Σ_{ỹ_1} p(ỹ_1 | x) p'(y_2 | ỹ_1).

Stochastic degradation means that although X → Y_1 → Y_2 does not form a Markov chain, we can find an equivalent physically degraded BC with the same capacity such that X → Ỹ_1 → Y_2 forms a Markov chain, or equivalently X → Y_1 → Ỹ_2 for some Ỹ_2 whose conditional marginal pmf p(ỹ_2 | x) is the same as p(y_2 | x).

Example. The Gaussian BC is a stochastically degraded broadcast channel.

[Figure: Y_1 = X + Z_1 with Z_1 ~ N(0, 1) and Y_2 = X + Z_2 with Z_2 ~ N(0, 2); below, the equivalent channel Y_3 = Y_1 + Z_3 with Z_3 ~ N(0, 1) independent of Z_1.]

In the figure, a new physically degraded channel equivalent to the stochastically degraded one is constructed by introducing a new independent noise Z_3 and a new received signal Y_3 = Y_1 + Z_3. The two channels have the same marginal distributions and hence the same capacity:

    p(y_2 | x) = Σ_{y_1} p(y_1 | x) p(y_3 | y_1).
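The equivalence of the marginals can be sanity-checked numerically. Below is a minimal Monte Carlo sketch, assuming the variances shown in the figure (Z_1 ~ N(0, 1), Z_2 ~ N(0, 2), Z_3 ~ N(0, 1)) and a hypothetical BPSK input; only the first two moments are compared.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.choice([-1.0, 1.0], size=n)          # arbitrary input (BPSK, hypothetical)
    z1 = rng.normal(0.0, np.sqrt(1.0), size=n)   # Z_1 ~ N(0, 1)
    z2 = rng.normal(0.0, np.sqrt(2.0), size=n)   # Z_2 ~ N(0, 2)
    z3 = rng.normal(0.0, np.sqrt(1.0), size=n)   # Z_3 ~ N(0, 2 - 1), independent of Z_1

    y2 = x + z2        # the stochastically degraded branch
    y3 = x + z1 + z3   # the physically degraded construction Y_3 = Y_1 + Z_3

    # The two outputs should agree in distribution; compare mean and variance.
    print(np.mean(y2), np.var(y2))   # ≈ 0, 3
    print(np.mean(y3), np.var(y3))   # ≈ 0, 3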

2.2 Binary symmetric broadcast channel

[Figure: Y_1 = X ⊕ Z_1 with Z_1 ~ Bern(p_1) and Y_2 = X ⊕ Z_2 with Z_2 ~ Bern(p_2); below, the equivalent degraded channel Y_3 = Y_1 ⊕ Z_3 with Z_3 ~ Bern(β).]

We assume that 1/2 > p_2 > p_1 and write Z_2 = Z_1 ⊕ Z_3 with Z_3 ~ Bern(β), in which

    β = (p_2 − p_1) / (1 − 2p_1).

In this case Y_3 and Y_2 are statistically the same.

A random coding scheme:

1. For each M_2, generate U^n(M_2) ~ Bern(1/2) with i.i.d. elements.
2. For each M_1, generate V^n(M_1) ~ Bern(α) with i.i.d. elements, in which 0 ≤ α ≤ 1/2.

[Figure: cloud centers U^n(M_2) with satellite codewords X^n(M_2, M_1) clustered around them.]

3. To send (M_1, M_2), send X^n(M_1, M_2) = U^n(M_2) ⊕ V^n(M_1).

In this particular example, we have

    Y_1 = X ⊕ Z_1 = U ⊕ V ⊕ Z_1,
    Y_2 = X ⊕ Z_2 = U ⊕ V ⊕ Z_1 ⊕ Z_3.
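A quick numerical check of the degradation identity Bern(p_1) ⊕ Bern(β) = Bern(p_2) and of the encoding step, with hypothetical values p_1 = 0.1, p_2 = 0.2, α = 0.25 and a toy blocklength n = 10 chosen only for display:

    import numpy as np

    p1, p2 = 0.1, 0.2                   # hypothetical crossover probabilities, 1/2 > p2 > p1
    beta = (p2 - p1) / (1 - 2 * p1)     # degradation parameter from the formula above
    # P(Z1 ⊕ Z3 = 1) = p1(1 - beta) + (1 - p1)beta should equal p2
    print(p1 * (1 - beta) + (1 - p1) * beta)   # prints 0.2

    rng = np.random.default_rng(1)
    n, alpha = 10, 0.25                       # toy blocklength and satellite parameter
    u = rng.integers(0, 2, size=n)            # cloud center U^n(M2), i.i.d. Bern(1/2)
    v = (rng.random(n) < alpha).astype(int)   # satellite offset V^n(M1), i.i.d. Bern(alpha)
    x = u ^ v                                 # transmitted codeword X^n = U^n ⊕ V^n
    print(u, v, x)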

Decoding scheme:

1. At Rx 2 (the worse receiver), choose the unique M̂_2 s.t. (U^n(M̂_2), Y_2^n) ∈ A_ε^(n).
2. At Rx 1 (the better receiver), choose the unique (M̂_1, M̂_2) s.t. (U^n(M̂_2), X^n(M̂_1, M̂_2), Y_1^n) ∈ A_ε^(n).

We can derive that

    R_1 ≤ H(α * p_1) − H(p_1),
    R_2 ≤ 1 − H(α * p_2),

where α * p = α(1 − p) + (1 − α)p denotes binary convolution and H(·) is the binary entropy function.
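Sweeping α from 0 to 1/2 traces the boundary of this achievable region. A minimal sketch, again with the hypothetical values p_1 = 0.1, p_2 = 0.2:

    import numpy as np

    def H(p):
        # binary entropy in bits, with H(0) = H(1) = 0
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    def conv(a, b):
        # binary convolution a * b = a(1 - b) + (1 - a)b
        return a * (1 - b) + (1 - a) * b

    p1, p2 = 0.1, 0.2   # hypothetical crossover probabilities
    for alpha in np.linspace(0.0, 0.5, 6):
        R1 = H(conv(alpha, p1)) - H(p1)   # R_1 ≤ H(α * p1) − H(p1)
        R2 = 1 - H(conv(alpha, p2))       # R_2 ≤ 1 − H(α * p2)
        print(f"alpha={alpha:.1f}  R1<={R1:.3f}  R2<={R2:.3f}")

Note the endpoints: α = 0 gives (0, 1 − H(p_2)) = (0, C_2), all rate to the worse receiver, and α = 1/2 gives (1 − H(p_1), 0) = (C_1, 0), all rate to the better receiver.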

3 Superposition coding

3.1 Theorem

Theorem 1. For a degraded broadcast channel, the capacity region is the convex hull of the set of rate pairs (R_1, R_2) satisfying

    R_1 ≤ I(X; Y_1 | U),
    R_2 ≤ I(U; Y_2),

for some joint distribution p(u) p(x | u) p(y_1, y_2 | x).

Here U is an auxiliary random variable: it is not part of the channel itself, but helps with coding. U is the center of a code cluster (the cloud center); X is an individual codeword in each cluster.

Sketch of the proof.

Achievability. Codebook generation:

1. For each M_2, generate U^n(M_2) with i.i.d. elements with distribution p(u). The number of sequences U^n(M_2) is 2^{nR_2}.
2. For each U^n(M_2), generate 2^{nR_1} codewords X^n(M_2, M_1): given U^n(1), U^n(2), ..., U^n(2^{nR_2}), for each j = 1, ..., 2^{nR_2},

    U^n(j) → X^n(j, 1), X^n(j, 2), ..., X^n(j, 2^{nR_1}).

Encoding: to send (M_1, M_2), send the codeword X^n(M_2, M_1).

Decoding: Decoder 2 (the worse receiver) chooses the unique M̂_2 s.t. (U^n(M̂_2), Y_2^n) ∈ A_ε^(n). Decoder 1 (the better receiver) chooses the unique (M̂_1, M̂_2) s.t. (U^n(M̂_2), X^n(M̂_2, M̂_1), Y_1^n) ∈ A_ε^(n). Decoder 1 can also perform successive cancellation: first decode M̂_2, then decode M̂_1 based on joint typicality.

Converse. Uses Fano's inequality (so that P_e → 0 forces the rate bounds) and requires picking the right auxiliary random variable.
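For a concrete instance of the theorem's region, one can evaluate I(X; Y_1 | U) and I(U; Y_2) directly from a chosen p(u) and p(x | u). A sketch for the BSC example above, with the same hypothetical parameters p_1 = 0.1, p_2 = 0.2, α = 0.25; U is uniform and X = U ⊕ V with V ~ Bern(α), so p(x | u) is itself a BSC(α) matrix:

    import numpy as np

    def mi(pa, W):
        # I(A;B) in bits for input pmf pa and channel matrix W, W[a][b] = p(b|a)
        pb = pa @ W
        return sum(pa[a] * W[a, b] * np.log2(W[a, b] / pb[b])
                   for a in range(W.shape[0])
                   for b in range(W.shape[1])
                   if pa[a] > 0 and W[a, b] > 0)

    def bsc(p):
        return np.array([[1 - p, p], [p, 1 - p]])

    p1, p2, alpha = 0.1, 0.2, 0.25     # hypothetical parameters
    pu = np.array([0.5, 0.5])          # p(u): uniform cloud centers
    T = bsc(alpha)                     # p(x|u): X = U ⊕ V, V ~ Bern(alpha)
    W1, W2 = bsc(p1), bsc(p2)

    R2 = mi(pu, T @ W2)                               # I(U; Y_2), channel U → Y_2
    R1 = sum(pu[u] * mi(T[u], W1) for u in range(2))  # I(X; Y_1 | U), averaged over U
    print(R1, R2)   # matches H(α * p1) − H(p1) and 1 − H(α * p2)

The decomposition I(X; Y_1 | U) = Σ_u p(u) I(X; Y_1 | U = u) used in the last line is valid because U → X → Y_1 is a Markov chain, so p(y_1 | x, u) = p(y_1 | x).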