ECE 534: Elements of Information Theory, Fall 2010
Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus)
December 2010

1. Problem 15.20. Multiple access.

(a) Find the capacity region for the multiple-access channel Y = X_1^{X_2}, where X_1 ∈ {2, 4} and X_2 ∈ {1, 2}.

Solution: The output of the channel is given in the following table:

    X_1 \ X_2 |  1 |  2
            2 |  2 |  4
            4 |  4 | 16

    Table 1: Y = X_1^{X_2}

Here we can see that by fixing X_2 = 1 we can obtain a rate R_1 = 1 bit per transmission, since we are able to decode the two symbols of X_1 at the output of the channel. Similarly, by setting X_1 = 2 it is possible to achieve a rate R_2 = 1. These rates can be obtained from the following expressions, considering that X_1 and X_2 are independent and that both have binary alphabets:

    R_1 ≤ I(X_1; Y | X_2)                               (1)
        = I(X_1; X_1^{X_2} | X_2)                       (2)
        = H(X_1 | X_2) − H(X_1 | X_1^{X_2}, X_2)        (3)
        = H(X_1)                                        (4)
        ≤ 1                                             (5)

Similarly, the second boundary is obtained by:

    R_2 ≤ I(X_2; Y | X_1)                               (6)
        = I(X_2; X_1^{X_2} | X_1)                       (7)
        = H(X_2 | X_1) − H(X_2 | X_1^{X_2}, X_1)        (8)
        = H(X_2)                                        (9)
        ≤ 1                                             (10)

The third boundary is obtained by:

    R_1 + R_2 ≤ I(X_1, X_2; Y)                          (11)
              = H(Y) − H(Y | X_1, X_2)                  (12)
              = H(Y)                                    (13)

From Table 1 we have Y ∈ {2, 4, 16}. In order to have H(Y) maximized we can give Y the distribution

    p(y) = { α/2  for y = 2
             α    for y = 4
             α/2  for y = 16 }

Then we know that α = 1/2 reaches the maximum entropy, so

    H(Y) = H(1/4, 1/2, 1/4) = 1.5

So,

    R_1 + R_2 ≤ H(Y)                                    (14)
              ≤ 1.5                                     (15)

From the behavior of the channel shown in Table 1, we can observe that this channel behaves like the binary erasure multiple-access channel. For a fixed rate R_1 = 1, the transmission of X_1 looks like noise to the receiver decoding X_2, so the transmission of X_2 sees a binary erasure channel with capacity 0.5 bits. So we obtain the capacity region shown in the following figure: the pentagon R_1 ≤ 1, R_2 ≤ 1, R_1 + R_2 ≤ 1.5. [Figure not reproduced in this transcription.]
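As a quick numerical sanity check of these bounds (not part of the original solution), the three mutual-information quantities can be evaluated by brute force for independent uniform inputs; the Python sketch below is illustrative only.

    from itertools import product
    from math import log2

    # Brute-force check of the part (a) bounds for Y = X1 ** X2 with
    # independent, uniform X1 in {2, 4} and X2 in {1, 2}.

    def H(dist):
        # entropy in bits of a dictionary of probabilities
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(joint, idx):
        # marginalize a joint distribution onto the coordinates listed in idx
        out = {}
        for key, p in joint.items():
            k = tuple(key[i] for i in idx)
            out[k] = out.get(k, 0.0) + p
        return out

    # joint distribution of (X1, X2, Y); the channel is deterministic, so H(Y|X1,X2) = 0
    joint = {(x1, x2, x1 ** x2): 0.25 for x1, x2 in product([2, 4], [1, 2])}

    pY, pX1, pX2 = marginal(joint, (2,)), marginal(joint, (0,)), marginal(joint, (1,))
    pX1Y, pX2Y = marginal(joint, (0, 2)), marginal(joint, (1, 2))

    I_X1_Y_given_X2 = H(pX2Y) - H(pX2)   # = H(Y|X2), since H(Y|X1,X2) = 0
    I_X2_Y_given_X1 = H(pX1Y) - H(pX1)   # = H(Y|X1)
    I_X1X2_Y = H(pY)                     # = H(Y) - H(Y|X1,X2) = H(Y)

    print(I_X1_Y_given_X2, I_X2_Y_given_X1, I_X1X2_Y)   # 1.0 1.0 1.5

The printed values reproduce the three boundaries R_1 ≤ 1, R_2 ≤ 1 and R_1 + R_2 ≤ 1.5 derived above.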

(b) Suppose that the range of X_1 is {1, 2}. Is the capacity region decreased? Why or why not?

The following table shows the behavior of the channel under the new conditions:

    X_1 \ X_2 |  1 |  2
            1 |  1 |  1
            2 |  2 |  4

    Table 2: Y = X_1^{X_2}

Here we notice that if we set X_1 = 1, the output of the channel remains the same regardless of the value of X_2, so we will not be able to recover X_2 by fixing X_1 = 1.
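To see this observation concretely, one can tabulate the part (b) channel and confirm that the output is constant whenever X_1 = 1; a tiny illustrative Python check (not from the original solution):

    # Part (b) channel: Y = X1 ** X2 with X1 in {1, 2}, X2 in {1, 2}.
    for x1 in [1, 2]:
        outputs = sorted({x1 ** x2 for x2 in [1, 2]})
        print(f"X1 = {x1}: possible outputs {outputs}")
    # X1 = 1: possible outputs [1]     -> Y reveals nothing about X2
    # X1 = 2: possible outputs [2, 4]  -> X2 can be decoded perfectly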

The rate R_2 will therefore depend on the distribution of X_1, so we must consider the probability of X_1 being 1 or 2. Let X_1 and X_2 be Bernoulli-type random variables with the following distributions:

    p(x_1) = { 1  with probability 1 − r
               2  with probability r }

    p(x_2) = { 1  with probability 1 − s
               2  with probability s }

Then,

    R_1 ≤ I(X_1; Y | X_2)                                        (16)
        = I(X_1; X_1^{X_2} | X_2)                                (17)
        = H(X_1 | X_2) − H(X_1 | X_1^{X_2}, X_2)                 (18)
        = H(X_1) = H(r)                                          (19)
        ≤ 1                                                      (20)

    R_2 ≤ I(X_2; Y | X_1)                                        (21)
        = H(Y | X_1) − H(Y | X_1, X_2)                           (22)
        = H(Y | X_1)                                             (23)
        = p(x_1 = 1) H(Y | X_1 = 1) + p(x_1 = 2) H(Y | X_1 = 2)  (24)
        = p(x_1 = 2) H(Y | X_1 = 2)                              (25)
        = r H(s)                                                 (26)

Here we can set s = 1/2 to maximize the region, so that H(s) = 1. The third boundary is obtained by:

    R_1 + R_2 ≤ I(X_1, X_2; Y)                                   (27)
              = H(Y) − H(Y | X_1, X_2)                           (28)
              = H(Y)                                             (29)

We need the distribution of Y, which can be obtained as follows:

    X_1  X_2  Y   Probability
     1    1   1   (1 − r)(1 − s)
     1    2   1   (1 − r)s
     2    1   2   r(1 − s)
     2    2   4   rs

    Table 3: Y = X_1^{X_2}

    p(y) = { 1  with probability (1 − s)(1 − r) + (1 − r)s = 1 − r
             2  with probability r(1 − s)
             4  with probability rs }

So,

    H(Y) = H(1 − r, r(1 − s), rs)                                                      (30)
         = −(1 − r) log_2(1 − r) − r(1 − s) log_2(r(1 − s)) − rs log_2(rs)             (31)
         = −(1 − r) log_2(1 − r) − r(1 − s) log_2(r) − r(1 − s) log_2(1 − s)
             − rs log_2(r) − rs log_2(s)                                               (32)
         = −(1 − r) log_2(1 − r) − r log_2(r) − r[(1 − s) log_2(1 − s) + s log_2(s)]   (33)
         = H(r) + r H(s)                                                               (34)

Here we can set s = 1/2 as before to maximize the region, so that H(s) = 1. Hence:

    R_1 + R_2 ≤ H(Y)                                    (35)
              = H(r) + r H(s)                           (36)
              = H(r) + r                                (37)

To answer the question "Is the capacity region decreased? Why or why not?", we would need to plot both capacity regions and compare them. However, we can pick a few rate pairs (R_1, R_2) and check whether they are achievable in both regions, which gives at least one basis for comparison. The rate pair (H(0.8), 0.8) = (0.7219, 0.8) is achievable in the region of part (b), and for it R_1 + R_2 = 1.5219. However, this pair is clearly outside the capacity region of part (a), since there R_1 + R_2 ≤ 1.5. Conversely, the rate pair (0.5, 1) is achievable in the capacity region of part (a); yet to achieve a rate R_2 = 1 in part (b) we would need to choose r = 1, which gives R_1 ≤ H(1) = 0, so this pair is not achievable in the capacity region of part (b). We conclude that some rates are achievable in one region but not in the other, so neither region contains the other, and a plot is the best way to see the differences.
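As an illustrative numerical check of this comparison (again, not part of the original solution), the two rate pairs can be tested against the relevant bounds directly:

    from math import log2

    def Hb(p):
        # binary entropy in bits
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    # Part (b) corner point with r = 0.8, s = 1/2: (R1, R2) = (H(0.8), 0.8)
    R1, R2 = Hb(0.8), 0.8
    print(round(R1, 4), round(R1 + R2, 4))   # 0.7219 and 1.5219 > 1.5, outside region (a)

    # The pair (0.5, 1) from region (a): R2 = 1 in part (b) forces r = 1 (with s = 1/2),
    # which leaves R1 <= H(1) = 0, so the pair is not achievable in region (b).
    print(Hb(1.0))                           # 0.0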

2. Problem 15.29. Trilingual-speaker broadcast channel.

A speaker of Dutch, Spanish, and French wishes to communicate simultaneously to three people: D, S, and F. D knows only Dutch but can distinguish when a Spanish word is being spoken as distinguished from a French word; similarly for the other two, who know only Spanish and French, respectively, but can distinguish when a foreign word is spoken and which language is being spoken. Suppose that each language, Dutch, Spanish, and French, has M words: M words of Dutch, M words of French, and M words of Spanish.

(a) What is the maximum rate at which the trilingual speaker can speak to D?

Following Example 15.6.4 of the textbook, we see that the maximum rate at which the trilingual speaker can speak to D is log_2(M + 2), since D is able to distinguish the M words of his mother tongue plus two other symbols produced by the speaker, one for each foreign language.

(b) If he speaks to D at the maximum rate, what is the maximum rate at which he can speak simultaneously to S?

To speak to D at the maximum rate log_2(M + 2), the speaker must use all M + 2 symbols seen by D equally often, so he sends a word in Spanish 1/(M + 2) of the time. Each Spanish word can carry log_2(M) bits of data for S, so he can speak to S at a rate of log_2(M)/(M + 2).

(c) If he is speaking to D and S at the joint rate in part (b), can he also speak to F at some positive rate? If so, what is it? If not, why not?

As in part (a), the maximum rate at which the trilingual speaker communicates to D is log_2(M + 2), since D can recognize when a Spanish word or a French word is spoken. The same logic as in part (b) can therefore be applied to F: the speaker says a French word 1/(M + 2) of the time, so he can encode log_2(M) bits of French data at a positive rate of log_2(M)/(M + 2).
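For concreteness (this numerical example is not in the original solution), with a hypothetical vocabulary of M = 1000 words per language the three rates evaluate as follows:

    from math import log2

    M = 1000                      # hypothetical vocabulary size per language
    rate_D = log2(M + 2)          # ~9.969 bits per spoken word
    rate_S = log2(M) / (M + 2)    # ~0.00995 bits per spoken word
    rate_F = log2(M) / (M + 2)    # same as for S, by symmetry
    print(rate_D, rate_S, rate_F)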

3. Problem 15.35. Multiple-access channel capacity with costs.

The cost of using symbol x is r(x). The cost of a codeword x^n is r(x^n) = (1/n) Σ_{i=1}^n r(x_i). A (2^{nR}, n) codebook satisfies cost constraint r if (1/n) Σ_{i=1}^n r(x_i(w)) ≤ r for all w ∈ {1, ..., 2^{nR}}.

(a) Find an expression for the capacity C(r) of a discrete memoryless channel with cost constraint r.

We can introduce the cost into the capacity expression in a similar way to how the power constraint is handled. Going back to Chapter 9 of the textbook, we can see how the power constraint is inserted in the achievability proof of the information capacity of the Gaussian channel with power constraint P, given as

    C = max_{f(x): E[X^2] ≤ P} I(X; Y).

Specifically, it is introduced in the computation of the probability of error as a new event that yields an error whenever the power constraint is violated (Eq. 9.23). We can do something similar with the cost, so that an error is declared when (1/n) Σ_{i=1}^n r(x_i(w)) > r for any w. Then the capacity expression under the constraint can be written as:

    C(r) = max_{p(x): Σ_x p(x) r(x) ≤ r} I(X; Y)

(b) Find an expression for the multiple-access channel capacity region for (X_1 × X_2, p(y | x_1, x_2), Y) if sender X_1 has cost constraint r_1 and sender X_2 has cost constraint r_2.

Here we can apply Theorem 15.3.4 of the textbook: the set of achievable rates is given by the closure of the set of all (R_1, R_2) pairs satisfying

    R_1 < I(X_1; Y | X_2, Q)                            (38)
    R_2 < I(X_2; Y | X_1, Q)                            (39)
    R_1 + R_2 < I(X_1, X_2; Y | Q)                      (40)

for some choice of the joint distribution p(q) p(x_1 | q) p(x_2 | q) p(y | x_1, x_2) with |Q| ≤ 4. However, we also need to impose the cost constraints in the same form in which they were introduced in part (a), now for each sender separately:

    Σ_{x_1} p(x_1) r_1(x_1) ≤ r_1          Σ_{x_2} p(x_2) r_2(x_2) ≤ r_2
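The expression for C(r) also lends itself to direct numerical evaluation for small alphabets. The Python sketch below is illustrative only: it assumes a hypothetical binary symmetric channel with crossover probability 0.1, symbol costs r(0) = 0 and r(1) = 1, and cost constraint r = 0.2, and approximates C(r) by brute-force search over input distributions (Blahut-Arimoto or convex optimization would be the more principled tool).

    import numpy as np

    # Hypothetical example: BSC(0.1), costs r(0) = 0, r(1) = 1, constraint r = 0.2.
    W = np.array([[0.9, 0.1],     # row x = 0: p(y|x)
                  [0.1, 0.9]])    # row x = 1
    cost = np.array([0.0, 1.0])
    r_max = 0.2

    def mutual_information(p_x, W):
        # I(X;Y) in bits for input distribution p_x and channel matrix W
        p_y = p_x @ W
        I = 0.0
        for x in range(len(p_x)):
            for y in range(W.shape[1]):
                if p_x[x] > 0 and W[x, y] > 0:
                    I += p_x[x] * W[x, y] * np.log2(W[x, y] / p_y[y])
        return I

    # Brute-force search over p(x) = (1 - a, a) subject to the expected-cost constraint.
    best = 0.0
    for a in np.linspace(0.0, 1.0, 10001):
        p_x = np.array([1.0 - a, a])
        if p_x @ cost <= r_max:
            best = max(best, mutual_information(p_x, W))
    print("C(r) approx:", best)   # the cost constraint binds, so this is below 1 - H(0.1)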

(c) Prove the converse for part (b).

We assume we have a sequence of ((2^{nR_1}, 2^{nR_2}), n) codes, with probability of error going asymptotically to zero, that must satisfy the cost constraints. From the text of the problem we have that a (2^{nR}, n) codebook satisfies cost constraint r if (1/n) Σ_{i=1}^n r(x_i(w)) ≤ r for all w ∈ {1, ..., 2^{nR}}. We need to extend this to our codebooks for the messages w_1 ∈ {1, 2, ..., 2^{nR_1}} and w_2 ∈ {1, 2, ..., 2^{nR_2}}:

    (1/n) Σ_{i=1}^n r_1(x_{1i}(w_1)) ≤ r_1              (41)
    (1/n) Σ_{i=1}^n r_2(x_{2i}(w_2)) ≤ r_2              (42)

From the proof of the converse for the multiple-access channel in the textbook we have that:

    R_1 ≤ (1/n) Σ_{i=1}^n I(X_{1i}; Y_i | X_{2i}) + ε_n             (43)
    R_2 ≤ (1/n) Σ_{i=1}^n I(X_{2i}; Y_i | X_{1i}) + ε_n             (44)
    R_1 + R_2 ≤ (1/n) Σ_{i=1}^n I(X_{1i}, X_{2i}; Y_i) + ε_n        (45)

Then we can introduce the time-sharing variable Q = i ∈ {1, 2, ..., n} with probability 1/n, in order to avoid dealing with the cost constraint separately for each i. As is done in the textbook (p. 542), the bounds can be written as:

    R_1 ≤ (1/n) Σ_{i=1}^n I(X_{1q}; Y_q | X_{2q}, Q = i) + ε_n        (46)
    R_2 ≤ (1/n) Σ_{i=1}^n I(X_{2q}; Y_q | X_{1q}, Q = i) + ε_n        (47)
    R_1 + R_2 ≤ (1/n) Σ_{i=1}^n I(X_{1q}, X_{2q}; Y_q | Q = i) + ε_n  (48)

    R_1 ≤ I(X_{1Q}; Y_Q | X_{2Q}, Q) + ε_n              (49)
    R_2 ≤ I(X_{2Q}; Y_Q | X_{1Q}, Q) + ε_n              (50)
    R_1 + R_2 ≤ I(X_{1Q}, X_{2Q}; Y_Q | Q) + ε_n        (51)

Now we can define the random variables X_1 = X_{1Q}, X_2 = X_{2Q} and Y = Y_Q, whose distributions depend on Q in the same way as the distributions of X_{1i}, X_{2i} and Y_i depend on i. The last three inequalities become:

    R_1 ≤ I(X_1; Y | X_2) + ε_n                         (52)
    R_2 ≤ I(X_2; Y | X_1) + ε_n                         (53)
    R_1 + R_2 ≤ I(X_1, X_2; Y) + ε_n                    (54)

for a joint distribution p(q)p(x q)p(x 2 q)p(y x, x 2 ) and Q 4. Recalling the power constraints: x p(x )r (x ) r x 2 p(x 2 )r 2 (x 2 ) r 2 The following analysis is valid for both cost constraints: r and r 2. We can check the power constraint for each i: x P (X = x )r (x ) = x P (X Q = x )r (x ) (55) = P (Q = i)p (X Q = x Q = i)r (x ) (56) x = n = n P (X i = x )r (x ) (57) x E[r (X i )] (58) This result is actually the expectation of the definition of the cost constraint extended to the new codebook (defined at the beginning of this proof: Equation 4) with respect to random message W. It has an equivalent form for the constraint in the cost r 2. (59) 8