ECE 776 - Information Theory, Final

Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1. We consider two strategies. In the first, we quantize with a step size Δ so that the distortion D = Δ²/12 equals 0.1, and then use entropy encoding on the quantized samples X̂ᵢ. In the second strategy, we use optimal rate-distortion coding with distortion D = 0.1. Compare the number of bits per symbol needed by the two strategies (in particular, for the first, use results valid in the limit of small Δ).

Sol: For the first strategy, we have D = Δ²/12 = 0.1, so that Δ = √(12D) = √1.2 ≈ 1.1, and

R = H(X̂) ≈ −log₂ Δ + h(X) = −log₂(1.1) + ½ log₂(2πe) ≈ 1.91 bits/symbol,

while for the second

R(0.1) = ½ log₂(1/0.1) ≈ 1.66 bits/symbol.

Clearly, the optimal code according to rate-distortion theory requires a smaller number of bits/symbol.

Q2 (1 point) Find the rate-distortion function R(D) for a Bernoulli source X ~ Ber(0.6) with distortion

d(x, x̂) = 0 if x = x̂,  ∞ if x = 1, x̂ = 0,  1 if x = 0, x̂ = 1.

Sol: In order to obtain an average distortion D, we must have

D = Σ_{x,x̂} p(x, x̂) d(x, x̂).

It is easy to see that there exists only one joint pmf p(x, x̂) that satisfies the above condition (with finite distortion) and the constraint on the marginal pmf of X stated in the text:

X \ X̂     0         1
0         0.4 − D   D
1         0         0.6

To see this, notice that with this choice the following conditions are satisfied:

D = Σ_{x,x̂} p(x, x̂) d(x, x̂) = p(0, 1),  p_X(1) = 0.6.

The rate-distortion function can then be obtained, for D < 0.4, as

R(D) = I(X; X̂) = H(X) − H(X | X̂) = H(0.6) − p_X̂(1) H(X | X̂ = 1)
     = H(0.6) − (D + 0.6) H(D/(D + 0.6)),

where we used the fact that H(X | X̂ = 0) = 0, since X̂ = 0 implies X = 0.
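As a quick numerical sanity check of Q1 and Q2 (a sketch added here, not part of the original solution), the rates above can be reproduced in a few lines of Python; the helper names `h2` and `R_bern` are my own:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Q1: uniform quantizer with step size chosen so that D = delta^2 / 12 = 0.1
D = 0.1
delta = math.sqrt(12 * D)                        # ~ 1.1
h_gauss = 0.5 * math.log2(2 * math.pi * math.e)  # differential entropy of N(0, 1)
R_quant = -math.log2(delta) + h_gauss            # high-rate approximation, ~ 1.91
R_opt = 0.5 * math.log2(1 / D)                   # rate-distortion limit, ~ 1.66

# Q2: R(D) = H(0.6) - (D + 0.6) H(D / (D + 0.6)) for D < 0.4
def R_bern(Dv):
    return h2(0.6) - (Dv + 0.6) * h2(Dv / (Dv + 0.6))
```

Note that `R_bern(0)` recovers H(0.6) (lossless compression) and `R_bern(0.4)` evaluates to zero, consistent with the remark that R(D) = 0 for D ≥ 0.4.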
Notice that for D ≥ 0.4, R(D) = 0.

Q3 (1 point) Assume that you have two parallel Gaussian channels with inputs X₁ and X₂ and noises Z₁ and Z₂, respectively. Assume that the noise powers are E[Z₁²] = 0.3 and E[Z₂²] = 0.6, while the total available power is P = E[X₁²] + E[X₂²] = 0.1. Find the optimal power allocation (waterfilling) and the corresponding capacity.

Sol: The waterfilling conditions are

P₁ = E[X₁²] = (μ − 0.3)⁺,  P₂ = E[X₂²] = (μ − 0.6)⁺,

with P₁ + P₂ = 0.1. It follows that

(μ − 0.3)⁺ + (μ − 0.6)⁺ = 0.1.

If you assume both P₁ > 0 and P₂ > 0, it is easy to see that there is no solution to the previous equation. Instead, if we set P₂ = 0, we get μ − 0.3 = 0.1, from which we obtain μ = 0.4 and

P₁ = (0.4 − 0.3)⁺ = 0.1,  P₂ = (0.4 − 0.6)⁺ = 0,

i.e., all the power is used on the better channel. The corresponding capacity is then

C = ½ log₂(1 + 0.1/0.3) ≈ 0.21 bits/symbol.

Q4 (1 point) Two sensors collect information about the temperature of a given process in different locations. At each time instant, they both need to communicate whether the temperature of the process is above a given threshold (alarm) or not (normal). Since the sensors are in adjacent areas, their measurements U and V are correlated according to the joint PMF

U \ V     normal   alarm
normal    0.8      0.05
alarm     0.05     0.1

How many bits R₁ and R₂ are needed to deliver this information if U and V are encoded independently? What if we use Slepian-Wolf source coding?

Sol: If U and V are encoded independently, we need

R₁ ≥ H(U) = H(0.85) ≈ 0.61,  R₂ ≥ H(V) ≈ 0.61,
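The Q3 waterfilling solution can also be found numerically. Below is a minimal bisection sketch (the helper `waterfill` is my own naming, not from the course material); it recovers the water level μ = 0.4 and the power split (0.1, 0):

```python
import math

def waterfill(noises, P, tol=1e-12):
    """Find the water level mu by bisection so that sum((mu - N)^+) = P."""
    lo, hi = min(noises), max(noises) + P
    while hi - lo > tol:
        mu = (lo + hi) / 2
        if sum(max(mu - N, 0.0) for N in noises) > P:
            hi = mu   # water level too high: too much power allocated
        else:
            lo = mu
    mu = (lo + hi) / 2
    return mu, [max(mu - N, 0.0) for N in noises]

noises = [0.3, 0.6]
mu, powers = waterfill(noises, 0.1)
# Total capacity of the parallel channels under this allocation
C = sum(0.5 * math.log2(1 + p / N) for p, N in zip(powers, noises))
```

With these numbers only the less noisy channel is active, matching the analytical solution.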
and a total number of bits R₁ + R₂ ≥ 0.61 + 0.61 = 1.22. On the contrary, if we use Slepian-Wolf coding, we only need

R₁ ≥ H(U | V) = 0.85 H(0.8/0.85) + 0.15 H(0.05/0.15) ≈ 0.41,
R₂ ≥ H(V | U) ≈ 0.41,
R₁ + R₂ ≥ H(U, V) = −2 · 0.05 log₂(0.05) − 0.1 log₂(0.1) − 0.8 log₂(0.8) ≈ 1.02,

so that the total number of bits needed is only R₁ + R₂ ≈ 1.02, provided that R₁ ≥ 0.41 and R₂ ≥ 0.41.

P1 (2 points) We would like to transmit a Bernoulli source Xᵢ ~ Ber(0.7) with Hamming distortion D over a binary symmetric channel (BSC) with probability of error p.

(a) Assuming that we send one source symbol for each channel symbol, can the source Xᵢ be transmitted losslessly (D = 0) over the BSC if p = 0.3?

Sol: The condition to be verified is R(0) = H(X) ≤ C = 1 − H(p). However, since H(X) = H(0.7) ≈ 0.88 bits/source symbol and C = 1 − H(0.3) ≈ 0.12 bits/channel symbol, the condition does not hold true, and, as a result, the source cannot be transmitted losslessly over the channel.

(b) In the same conditions as in point (a), can we send the source Xᵢ with distortion D = 0.4?

Sol: Since D = 0.4 > min(0.7, 0.3) = 0.3, the corresponding rate is R(0.4) = 0 bits/source symbol, so that the source can definitely be transmitted over the channel.

(c) Going back to the case D = 0, what is the largest rate r (source symbols/channel symbol) that we can transmit over the channel?

Sol: The condition reads r H(0.7) ≤ C, so

r ≤ C / H(0.7) = 0.12/0.88 ≈ 0.14 source symbols/channel symbol.

(d) Repeat (c) with D = 0.1.

Sol: We have r R(0.1) ≤ C, so

r ≤ C / R(0.1) = C / (H(0.7) − H(0.1)) = 0.12/(0.88 − 0.47) ≈ 0.29 source symbols/channel symbol.

P2 (2 points) Consider the Gaussian channel in the figure, in which the transmitted signal X (E[X²] = P) is received by two antennas with

Y₁ = X + Z₁,  Y₂ = X + Z₂,
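The Q4 entropies and the P1 rate ratios can be checked numerically as well; this is an illustrative sketch I am adding, with helper names of my own choosing:

```python
import math

def H(probs):
    """Entropy in bits of a pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Q4: joint pmf of (U, V); rows are U = normal/alarm, columns V = normal/alarm
joint = [[0.80, 0.05],
         [0.05, 0.10]]
pU = [sum(row) for row in joint]            # marginal of U: [0.85, 0.15]
HU = H(pU)                                  # ~ 0.61; H(V) = HU by symmetry
HUV = H([p for row in joint for p in row])  # H(U, V) ~ 1.02
H_U_given_V = HUV - HU                      # chain rule: ~ 0.41

# P1: BSC(0.3) capacity and the largest rates r for D = 0 and D = 0.1
C = 1 - H([0.3, 0.7])                       # ~ 0.12
r_lossless = C / H([0.7, 0.3])              # (c): C / H(0.7)
r_D = C / (H([0.7, 0.3]) - H([0.1, 0.9]))   # (d): C / R(0.1), ~ 0.29
```

With unrounded entropies `r_lossless` evaluates to about 0.135 (the solution's 0.14 comes from rounding H(0.7) and C to two decimals first).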
[Figure 1: X is corrupted by Z₁ and Z₂ to give Y₁ and Y₂, which are combined with weights α and 1 − α to form Y.]

where Z₁ and Z₂ are independent, E[Zᵢ²] = Nᵢ, and N₁ < N₂. Moreover, the signal at the two antennas is combined as Y = αY₁ + (1 − α)Y₂ before detection (0 ≤ α ≤ 1).

a) Find the capacity of the channel for a given α.

Sol: Since we have

Y = αY₁ + (1 − α)Y₂ = X + αZ₁ + (1 − α)Z₂,

the channel at hand is a Gaussian channel with signal power P and noise power α²N₁ + (1 − α)²N₂, and the corresponding capacity is

C(α) = ½ log₂(1 + P / (α²N₁ + (1 − α)²N₂)).

b) Using your result from the previous point, find the optimal α that maximizes the capacity and write down the corresponding maximum capacity.

Sol: Maximizing the capacity C means minimizing the noise power α²N₁ + (1 − α)²N₂, which is a convex function of α, so that the optimality condition is

d/dα (α²N₁ + (1 − α)²N₂) = 0.

It follows that the optimal α reads

α* = N₂ / (N₁ + N₂),

that is, proportionally more weight is given to the less noisy received signal. Since α*²N₁ + (1 − α*)²N₂ = N₁N₂/(N₁ + N₂), the corresponding capacity is

C(α*) = ½ log₂(1 + P / (α*²N₁ + (1 − α*)²N₂)) = ½ log₂(1 + P/N₁ + P/N₂).
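Since the problem keeps P, N₁, N₂ symbolic, here is a small numerical sketch with assumed values P = 1, N₁ = 0.3, N₂ = 0.6 (my choice, not from the problem), checking both the optimal α* and the closed-form capacity against a brute-force grid search:

```python
import math

def capacity(alpha, P, N1, N2):
    """Capacity of the combined channel Y = X + alpha*Z1 + (1 - alpha)*Z2."""
    noise = alpha**2 * N1 + (1 - alpha)**2 * N2
    return 0.5 * math.log2(1 + P / noise)

# Illustrative values (assumed; the problem leaves P, N1, N2 symbolic)
P, N1, N2 = 1.0, 0.3, 0.6
alpha_star = N2 / (N1 + N2)                          # closed-form optimum, 2/3
closed_form = 0.5 * math.log2(1 + P / N1 + P / N2)   # capacity at alpha_star

# Brute-force check on a grid that alpha_star indeed maximizes C(alpha)
best = max((k / 1000 for k in range(1001)),
           key=lambda a: capacity(a, P, N1, N2))
```

The grid maximizer coincides with α* up to the grid spacing, and the capacity at α* matches the closed form ½ log₂(1 + P/N₁ + P/N₂).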
P3 (2 points) You are given a discrete memoryless multiple access channel (MAC) with inputs X₁ ∈ {0, 1, 2} and X₂ ∈ {0, 1, 2} and output Y = X₁ + X₂.

a) Assuming that both X₁ and X₂ are chosen uniformly and independently in their range, find the achievable rate region of the MAC.

Sol: For the fixed input pmfs p(x₁) = u(x₁) and p(x₂) = u(x₂), we have

R₁ ≤ I(X₁; Y | X₂) = H(Y | X₂) − H(Y | X₁, X₂) = H(Y | X₂) = H(X₁) = log₂ 3 ≈ 1.58,
R₂ ≤ I(X₂; Y | X₁) = H(X₂) = log₂ 3 ≈ 1.58,
R₁ + R₂ ≤ I(X₁, X₂; Y) = H(Y) − H(Y | X₁, X₂) = H(Y).

In order to evaluate H(Y) we need the pmf of Y:

p(y) = 1/9 (y = 0), 2/9 (y = 1), 3/9 (y = 2), 2/9 (y = 3), 1/9 (y = 4),

so that

H(Y) = −2 · (1/9) log₂(1/9) − 2 · (2/9) log₂(2/9) − (3/9) log₂(3/9) ≈ 2.2.

b) Assume now that the two transmitters can cooperate for transmission over this MAC. What is the capacity region (notice that in this case maximization over the input distribution is required)?

Sol: In this case we can achieve R₁ + R₂ ≤ I(X₁, X₂; Y) = H(Y) for any joint pmf p(x₁, x₂). In particular, we can optimize p(x₁, x₂) in order to maximize H(Y). We notice that

H(Y) ≤ log₂ |𝒴| = log₂ 5 ≈ 2.32,

with equality if p(y) = u(y). Can this value be achieved with the right choice of p(x₁, x₂)? Observing that we must have

p(0, 0) = 1/5                       (y = 0)
p(0, 1) + p(1, 0) = 1/5             (y = 1)
p(1, 1) + p(2, 0) + p(0, 2) = 1/5   (y = 2)
p(2, 1) + p(1, 2) = 1/5             (y = 3)
p(2, 2) = 1/5                       (y = 4),

we can conclude that it is enough to choose p(x₁, x₂) as

X₁ \ X₂   0      1      2
0         1/5    1/10   1/15
1         1/10   1/15   1/10
2         1/15   1/10   1/5

We can then conclude that the capacity region with cooperation is characterized by R₁ + R₂ ≤ log₂ 5 ≈ 2.32.
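The two entropy values in P3 can be verified by enumerating the channel inputs; the code below is an illustrative sketch (the helper `H` is my own naming):

```python
import math
from itertools import product

def H(probs):
    """Entropy in bits of a pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# (a) uniform independent inputs: accumulate the pmf of Y = X1 + X2
pY = [0.0] * 5
for x1, x2 in product(range(3), range(3)):
    pY[x1 + x2] += (1 / 3) * (1 / 3)
HY_uniform = H(pY)            # ~ 2.2 bits

# (b) cooperative joint pmf proposed in the solution
joint = [[1/5,  1/10, 1/15],
         [1/10, 1/15, 1/10],
         [1/15, 1/10, 1/5]]
pY_coop = [0.0] * 5
for x1, x2 in product(range(3), range(3)):
    pY_coop[x1 + x2] += joint[x1][x2]
HY_coop = H(pY_coop)          # log2(5) ~ 2.32 bits
```

The cooperative pmf indeed makes Y uniform on {0, 1, 2, 3, 4}, achieving the log₂ 5 bound.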