ECE 776 - Information Theory Final (Fall 2008)

Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power $N$ and average power constraint $P$ (i.e., $\frac{1}{n}\sum_{i=1}^{n} x_i^2(w) \le P$ for all messages $w$). The transmitter sends "0" except at times $0, k, 2k, 3k, \ldots$ for a given integer $k > 0$. Notice that at these time instants the power can be increased to $kP$ without violating the average power constraint. Find the maximum achievable rate with this type of scheme. Then, consider $P$ to be very small. Verify that in this regime the bursty scheme is optimal (i.e., its rate equals the capacity of the Gaussian channel).

Sol.: The channel at hand is equivalent to a Gaussian channel with $n/k$ channel uses and average power constraint $kP$, so that the maximum achievable rate (bits per channel use of the original channel) is
$$R = \frac{1}{n}\cdot\frac{n}{k}\cdot\frac{1}{2}\log_2(1+kP) = \frac{1}{2k}\log_2(1+kP).$$
Now, for small $P$, $\log_2(1+kP) = (\log_2 e)\ln(1+kP) \simeq (\log_2 e)\,kP$, so that $R \simeq \frac{1}{2}(\log_2 e)P$. Comparing this with the capacity $C = \frac{1}{2}\log_2(1+P) \simeq \frac{1}{2}(\log_2 e)P$, we see that the bursty coding scheme suffers no loss of optimality in the low-SNR regime.

Q.2. (1 point) The histogram (or type) of a given sequence $x^n \in \mathcal{X}^n$ is defined as $N(a|x^n) = \frac{1}{n}\sum_{i=1}^{n} 1\{x_i = a\}$, where $1\{\cdot\}$ is the indicator function (equal to 1 if the argument is true and 0 otherwise). (a) Show that the number of possible types of sequences of $n$ letters is at most $(n+1)^{|\mathcal{X}|}$ (Hint: how many values can the function $N(a|x^n)$ take for each value $a \in \mathcal{X}$?). (b) Based on the previous result, show that transmitting the type of a sequence requires a rate (bits/symbol) that vanishes ($\to 0$) as $n \to \infty$. (c) (optional) Consider now two sequences $x^n \in \mathcal{X}^n$ and $y^n \in \mathcal{Y}^n$ observed by two independent encoders. If the receiver wants to recover the joint type $N(a,b|x^n,y^n)$, do you think (intuitively) that a similar conclusion to point (b) would apply?

Sol.: (a) For every value $a \in \mathcal{X}$, the function $N(a|x^n)$ can take only $n+1$ values, namely $0, 1/n, 2/n, \ldots, 1$. Therefore, an upper bound on the number of functions $N(\cdot|x^n)$ is obtained by allowing each value $N(a|x^n)$ to be selected from this set with replacement. It follows that $(n+1)^{|\mathcal{X}|}$ is an upper bound. (b) Since the number of histograms (types) is at most $(n+1)^{|\mathcal{X}|}$, the required rate per symbol satisfies
$$R \le \frac{|\mathcal{X}|\log_2(n+1)}{n} \to 0 \quad \text{as } n \to \infty.$$
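As a quick numerical illustration of points (a) and (b) (not part of the original exam), the following Python sketch counts the exact number of types for a small example alphabet (the alphabet size 3 is an assumption for illustration), compares it with the bound $(n+1)^{|\mathcal{X}|}$, and evaluates the rate bound $|\mathcal{X}|\log_2(n+1)/n$, which indeed vanishes as $n$ grows:

from math import comb, log2

alphabet_size = 3  # assumed example alphabet size |X|

def num_types(n, m):
    # Exact number of types (empirical distributions) of length-n sequences over
    # an alphabet of size m: compositions of n into m nonnegative parts.
    return comb(n + m - 1, m - 1)

for n in (10, 100, 1_000, 10_000):
    exact = num_types(n, alphabet_size)
    bound = (n + 1) ** alphabet_size          # the (n+1)^|X| upper bound from part (a)
    rate = alphabet_size * log2(n + 1) / n    # bits/symbol needed to index a type, part (b)
    print(f"n={n}: types={exact} <= bound={bound}, rate={rate:.4f}")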

(c) (optional) The number of joint types, following point (a), is clearly at most $(n+1)^{|\mathcal{X}||\mathcal{Y}|}$. However, since $x^n \in \mathcal{X}^n$ and $y^n \in \mathcal{Y}^n$ are observed by two independent encoders, one cannot directly index the joint type with a vanishing rate as in point (b). In fact, encoder 1 only knows the type $N(a|x^n)$ and, similarly, encoder 2 only knows $N(b|y^n)$. Therefore, it is expected that a vanishing rate would not be enough here. It turns out that in many cases one has to choose rates such that the entire sequences $x^n$ and $y^n$ (not only their joint type, which is what is actually requested) can be recovered at the destination; this corresponds to the Slepian-Wolf rate region. Further details can be found in: R. Ahlswede and I. Csiszar, "To get a bit of information may be as hard as to get full information," IEEE Trans. Inf. Theory, vol. IT-27, no. 4, July 1981.

Q.3. (1 point) Consider a Gaussian multiple-access channel in which the received signal is $Y = [Y_1\ Y_2]^T$ with
$$\begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} X_1 \\ X_2 \end{bmatrix} + \begin{bmatrix} N_1 \\ N_2 \end{bmatrix},$$
where the Gaussian noises $N_1$ and $N_2$ are independent with unit power. The average power constraints on $X_1$ and $X_2$ are $P_1$ and $P_2$, respectively. (a) Find the capacity region. (b) Consider the capacity region for the case where the two transmitters can perfectly cooperate, that is,
$$R_1 + R_2 \le \max_{f(x_1,x_2):\ E[X_1^2]\le P_1,\ E[X_2^2]\le P_2} I(X_1,X_2;Y).$$
Compare with point (a) and comment.

Sol.: (a) We have
$$R_1 \le I(X_1;Y|X_2) = h(Y|X_2) - h(Y|X_1,X_2) = h(Y_1,N_2|X_2) - h(N_1,N_2) = h(Y_1) + h(N_2) - h(N_1) - h(N_2) \le \tfrac{1}{2}\log_2(1+P_1),$$
where the third equality uses the fact that $N_1, N_2, X_1$ and $X_2$ are mutually independent, and the last inequality holds with equality if $X_1$ is Gaussian. We similarly have that the second bound $R_2 \le I(X_2;Y|X_1)$ is maximized by a Gaussian distribution for $X_2$, yielding $R_2 \le \tfrac{1}{2}\log_2(1+P_2)$. Finally, for the sum-rate,
$$R_1 + R_2 \le I(X_1,X_2;Y) = h(Y_1) + h(Y_2) - h(N_1) - h(N_2) \le \tfrac{1}{2}\log_2(1+P_1) + \tfrac{1}{2}\log_2(1+P_2),$$
which is again maximized by Gaussian distributions. We conclude that the capacity region is the rectangle $R_1 \le \tfrac{1}{2}\log_2(1+P_1)$, $R_2 \le \tfrac{1}{2}\log_2(1+P_2)$. This is not surprising, since the two users have orthogonal channels to the destination (and, in fact, we could have reached this conclusion from that observation alone).
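As a small numerical sanity check of part (a) (illustrative only, with assumed example powers $P_1 = 1$ and $P_2 = 2$), the following Python snippet evaluates the three bounds and shows that the sum-rate bound equals $C_1 + C_2$, i.e., it is redundant and the region is indeed a rectangle:

from math import log2

P1, P2 = 1.0, 2.0  # assumed example power constraints (unit noise power on each link)
C1 = 0.5 * log2(1 + P1)                           # bound on R1
C2 = 0.5 * log2(1 + P2)                           # bound on R2
C_sum = 0.5 * log2(1 + P1) + 0.5 * log2(1 + P2)   # sum-rate bound with Gaussian inputs
print(C1, C2, C_sum)                              # C_sum = C1 + C2, so the sum constraint is redundant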

(b) For the perfect cooperation case,
$$R_1 + R_2 \le I(X_1,X_2;Y) \le h(Y_1) + h(Y_2) - h(N_1) - h(N_2) \le \tfrac{1}{2}\log_2(1+P_1) + \tfrac{1}{2}\log_2(1+P_2),$$
where the second inequality holds with equality if $X_1$ and $X_2$ are independent, i.e., $f(x_1,x_2) = f(x_1)f(x_2)$. Hence cooperation does not increase the sum-rate; however, since the only constraint here is on the sum-rate, the full-cooperation region contains rate points, e.g., $(R_1, R_2) = \big(\tfrac{1}{2}\log_2(1+P_1) + \tfrac{1}{2}\log_2(1+P_2),\, 0\big)$, that are not contained in the multiple-access region derived in the previous point.

Q.4. (1 point) Assume that you have two parallel Gaussian channels with inputs $X_1$ and $X_2$ and noises $Z_1$ and $Z_2$, respectively. Assume that the noise powers are $E[Z_1^2] = 0.5$ and $E[Z_2^2] = 0.7$, while the total available power is $P = E[X_1^2] + E[X_2^2] = 0.4$. Find the optimal power allocation and the corresponding capacity.

Sol.: The waterfilling conditions are
$$P_1 = E[X_1^2] = (\mu - 0.5)^+, \qquad P_2 = E[X_2^2] = (\mu - 0.7)^+,$$
with $P_1 + P_2 = 0.4$. It follows that $(\mu - 0.5)^+ + (\mu - 0.7)^+ = 0.4$, which is satisfied by $\mu = 0.8$, yielding $P_1 = 0.3$ and $P_2 = 0.1$. The corresponding capacity is then
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{0.3}{0.5}\right) + \frac{1}{2}\log_2\!\left(1 + \frac{0.1}{0.7}\right) = 0.435 \text{ bits/channel use}.$$
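The waterfilling solution can be reproduced numerically; the following Python sketch (illustrative, not part of the original solution) bisects on the water level $\mu$ until the allocated power matches the budget:

from math import log2

noise = [0.5, 0.7]   # noise powers of the two parallel channels
P_total = 0.4        # total transmit power budget

def total_power(mu):
    # Power allocated by waterfilling at water level mu.
    return sum(max(mu - n, 0.0) for n in noise)

# Bisection on mu so that the allocated power sums to P_total.
lo, hi = min(noise), max(noise) + P_total
for _ in range(100):
    mu = 0.5 * (lo + hi)
    if total_power(mu) < P_total:
        lo = mu
    else:
        hi = mu

alloc = [max(mu - n, 0.0) for n in noise]
C = sum(0.5 * log2(1 + p / n) for p, n in zip(alloc, noise))
print(alloc, C)  # expected: approximately [0.3, 0.1] and C ≈ 0.435 bits/channel use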

P.1. (2 points) Consider the network in Fig. 1-(a), where two nodes communicate with each other, with rate $R_1$ ($W_1 \in [1, 2^{nR_1}]$) from node 1 to node 2 and rate $R_2$ ($W_2 \in [1, 2^{nR_2}]$) from node 2 to node 1. The available coding strategies are of the type $X_{1,k} = f_{1,k}(W_1, Y_1^{k-1})$ and $X_{2,k} = f_{2,k}(W_2, Y_2^{k-1})$, while the decoding functions are $\hat{W}_2 = g_1(W_1, Y_1^n)$ at node 1 and $\hat{W}_1 = g_2(W_2, Y_2^n)$ at node 2. This is the two-way channel studied by Shannon. (a) Evaluate the cut-set outer bound on the capacity region of the rates $(R_1, R_2)$. (b) Consider now the special case of the network of Fig. 1-(a) shown in Fig. 1-(b). Here we have $Y_{1,k} = Y_{2,k} = X_{1,k} \oplus X_{2,k}$ with $X_{1,k}, X_{2,k} \in \{0,1\}$ (binary inputs), where $\oplus$ denotes the mod-2 (binary) sum. Evaluate the capacity region for this network (Hint: first evaluate the cut-set bound for this network and then propose a simple scheme that achieves it).

[Figure 1: (a) Two-way channel $p(y_1, y_2 | x_1, x_2)$ with inputs $X_{1,k}, X_{2,k}$ and outputs $Y_{1,k}, Y_{2,k}$; (b) the special case in which both outputs equal the mod-2 sum (XOR) $X_{1,k} \oplus X_{2,k}$.]

Sol.: (a) Only two cuts are possible. Accordingly, if $(R_1, R_2)$ is achievable, then
$$R_1 \le I(X_1; Y_2 | X_2), \qquad R_2 \le I(X_2; Y_1 | X_1)$$
for some joint distribution $p(x_1, x_2)$. (b) Evaluating the bound above for the network in Fig. 1-(b),
$$R_1 \le I(X_1; Y_2 | X_2) = I(X_1; X_1 \oplus X_2 | X_2) = H(X_1 \oplus X_2 | X_2) = H(X_1 | X_2) \le H(X_1) \le 1,$$
$$R_2 \le I(X_2; Y_1 | X_1) = I(X_2; X_1 \oplus X_2 | X_1) = H(X_1 \oplus X_2 | X_1) = H(X_2 | X_1) \le H(X_2) \le 1,$$
where equality in both bounds is obtained for $p(x_1, x_2) = p(x_1)p(x_2)$ with $p(x_1)$ and $p(x_2)$ uniform. A simple scheme that achieves this bound is the following: every user $j$ transmits uncoded bits $x_{j,k} = 0$ or $1$ with equal probability. Having received, say for user 1, $y_{1,k} = x_{1,k} \oplus x_{2,k}$, the known transmitted bit $x_{1,k}$ can be subtracted off, and user 1 obtains $y_{1,k} \oplus x_{1,k} = x_{2,k}$. As a result, $R_1 = 1$ and $R_2 = 1$ are simultaneously achievable, and the capacity region is the square $R_1 \le 1$, $R_2 \le 1$.
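The achievability argument in part (b) is easy to simulate; here is a minimal Python sketch (illustrative only, not part of the original solution) in which each node sends uncoded random bits and recovers the other node's bits by XORing off its own transmitted bits:

import random

n = 10_000  # number of channel uses in the simulation
x1 = [random.randint(0, 1) for _ in range(n)]  # node 1's uncoded information bits
x2 = [random.randint(0, 1) for _ in range(n)]  # node 2's uncoded information bits

y = [a ^ b for a, b in zip(x1, x2)]            # both nodes observe the mod-2 sum

x2_hat = [yi ^ a for yi, a in zip(y, x1)]      # node 1 removes its own bit
x1_hat = [yi ^ b for yi, b in zip(y, x2)]      # node 2 removes its own bit

assert x1_hat == x1 and x2_hat == x2
print("Both messages recovered: R1 = R2 = 1 bit/channel use")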

P.2. (2 points) Consider a source $V_i$, $i = 1, 2, \ldots, n$, i.i.d. and uniformly distributed over the set $\mathcal{V} = \{1, 2, 3, 4, 5\}$. We are interested in conveying this source to a destination over a Gaussian channel with signal-to-noise ratio $P/N = 10$ (one channel use per source symbol). As a measure of the quality of the reconstruction we use the Hamming distortion (symbol-wise probability of error)
$$d(v, \hat v) = \begin{cases} 0 & \text{if } v = \hat v, \\ 1 & \text{if } v \ne \hat v. \end{cases}$$
(a) What is the minimum average distortion $D = \frac{1}{n}\sum_{i=1}^{n} E[d(V_i, \hat V_i)]$ that can be obtained? Provide a rough estimate of the solution of the final equation by plotting its two sides (Hint: what is an optimal scheme to convey the source to the destination?). (b) Fix distortion $D = 0$ and $P/N = 2$. Assume now that the channel allows $k$ channel uses per source symbol (rather than one, as assumed in the previous point). What is the minimum value of $k$ necessary to achieve the desired distortion over this channel?

Sol.: (a) By the source-channel separation theorem, the minimum average distortion $D$ is given by the equation $R(D) = C$, where $C = \frac{1}{2}\log_2(1+10)$ and $R(D)$ is the rate-distortion function of the source at hand. It can be derived (see problem 10.5 and the assignments) that, for $0 \le D \le 4/5$,
$$R(D) = \log_2 5 - H(D) - D\log_2(5-1) = \log_2 5 - H(D) - 2D,$$
and $R(D) = 0$ for $D > 4/5$. We then need to solve (assuming there is a solution in the interval $0 \le D \le 4/5 = 0.8$)
$$\log_2 5 - H(D) - 2D = \tfrac{1}{2}\log_2(1+10), \quad \text{i.e.,} \quad H(D) + 2D = \log_2(5/\sqrt{11}),$$
which turns out to yield $D \simeq 0.085 < 0.8$. (b) Since $D = 0$, we need to convey $R(0) = H(V) = \log_2 5 \simeq 2.32$ bits/source symbol to the destination. Via a capacity-achieving channel code, we can send $\frac{1}{2}\log_2(1+2) \simeq 0.79$ bits/channel use. Since we have $k$ channel uses per source symbol, the number of bits we can send to the destination per source symbol is $\simeq 0.79k$. Setting $0.79k \ge 2.32$, we get the minimum value $k = \lceil 2.32/0.79 \rceil = 3$.
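A short Python sketch (an illustrative check, not part of the original solution) solves $H(D) + 2D = \log_2(5/\sqrt{11})$ by bisection, exploiting the fact that the left-hand side is increasing on $[0, 0.5]$, and also recomputes the minimum $k$ of part (b):

from math import log2, ceil

def H(p):  # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

target = log2(5) - 0.5 * log2(11)   # log2(5/sqrt(11)) ≈ 0.592

# H(D) + 2D is increasing on [0, 0.5], so bisection finds the unique root there.
lo, hi = 0.0, 0.5
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if H(mid) + 2 * mid < target:
        lo = mid
    else:
        hi = mid
print("D ≈", round(0.5 * (lo + hi), 3))   # expected ≈ 0.085

# Part (b): minimum number of channel uses per source symbol for D = 0 and P/N = 2.
k = ceil(log2(5) / (0.5 * log2(1 + 2)))
print("k =", k)                           # expected 3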

P.3. (2 points) You are given two independent i.i.d. sources $X_{1,i}$ and $X_{2,i}$, $i = 1, 2, \ldots, n$, with $X_{1,i} \sim \mathrm{Ber}(0.5)$ and $X_{2,i} \sim \mathrm{Ber}(0.2)$. (a) Suppose that the sequences $X_1^n$ and $X_2^n$ are observed by two independent encoders and that a common destination is interested in recovering both $X_1^n$ and $X_2^n$ losslessly. What is the set of rates $R_1$ (from the first encoder to the destination) and $R_2$ (from the second encoder to the destination) that guarantees a vanishing probability of error at the destination (Slepian-Wolf problem)? (b) Repeat the previous point for the sources $Z_{1,i} = X_{1,i} + X_{2,i}$ and $Z_{2,i} = X_{1,i} - X_{2,i}$ (ordinary integer sum and difference); in other words, here the two encoders observe $Z_1^n$ and $Z_2^n$. Compare the rate region with that found in the previous point. (c) Comment on the comparison of points (a) and (b).

Sol.: (a) Since $X_1$ and $X_2$ are independent, we have
$$R_1 \ge H(X_1|X_2) = H(0.5) = 1, \qquad R_2 \ge H(X_2|X_1) = H(0.2) \simeq 0.72, \qquad R_1 + R_2 \ge H(X_1,X_2) = H(0.5) + H(0.2) \simeq 1.72,$$
so the region is the intersection of the half-spaces $R_1 \ge 1$ and $R_2 \ge 0.72$ (the sum-rate constraint is implied by the individual constraints). (b) We first need the joint distribution of $Z_1$ and $Z_2$, which is given by

Z1 \ Z2 |   -1    |    0    |    1
   0    |    0    | 0.5·0.8 |    0
   1    | 0.5·0.2 |    0    | 0.5·0.8
   2    |    0    | 0.5·0.2 |    0

since, e.g., $\Pr[Z_1 = 0, Z_2 = 0] = \Pr[X_1 = 0, X_2 = 0] = 0.5 \cdot 0.8$. We can now calculate the Slepian-Wolf region:
$$R_1 \ge H(Z_1|Z_2) = \Pr[Z_2 = 0]\,H(Z_1|Z_2 = 0) = 0.5\,H(0.2) \simeq 0.36,$$
$$R_2 \ge H(Z_2|Z_1) = \Pr[Z_1 = 1]\,H(Z_2|Z_1 = 1) = 0.5\,H(0.2) \simeq 0.36,$$
$$R_1 + R_2 \ge H(Z_1,Z_2) = -2\cdot 0.1\log_2 0.1 - 2\cdot 0.4\log_2 0.4 \simeq 1.72.$$
(c) This region is larger than the one found in point (a) in terms of the individual rates $R_1$ and $R_2$ (the sum-rate constraint is the same, since $(Z_1, Z_2)$ is a one-to-one function of $(X_1, X_2)$). This is because here the sources $Z_1$ and $Z_2$ are correlated, so one can exploit the gains of distributed (Slepian-Wolf) coding to reduce the individual rates $R_1$ and $R_2$.
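The entropies above can be double-checked with the following Python sketch (illustrative only, not part of the original solution), which builds the joint pmf of $(Z_1, Z_2)$ from the two Bernoulli sources and evaluates $H(Z_1|Z_2)$, $H(Z_2|Z_1)$ and $H(Z_1, Z_2)$:

from math import log2
from collections import defaultdict

# Joint pmf of (Z1, Z2) induced by X1 ~ Ber(0.5), X2 ~ Ber(0.2).
p = defaultdict(float)
for x1, p1 in [(0, 0.5), (1, 0.5)]:
    for x2, p2 in [(0, 0.8), (1, 0.2)]:
        p[(x1 + x2, x1 - x2)] += p1 * p2

def entropy(pmf):
    # Entropy in bits of a pmf given as a dict of probabilities.
    return -sum(q * log2(q) for q in pmf.values() if q > 0)

def marginal(pmf, idx):
    m = defaultdict(float)
    for z, q in pmf.items():
        m[z[idx]] += q
    return m

H_joint = entropy(p)
H_Z1_given_Z2 = H_joint - entropy(marginal(p, 1))  # H(Z1|Z2) = H(Z1,Z2) - H(Z2)
H_Z2_given_Z1 = H_joint - entropy(marginal(p, 0))  # H(Z2|Z1) = H(Z1,Z2) - H(Z1)
print(round(H_Z1_given_Z2, 3), round(H_Z2_given_Z1, 3), round(H_joint, 3))
# expected: ≈ 0.361, 0.361, 1.722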