UCSD ECE 255C Handout #14          Prof. Young-Han Kim          Thursday, March 9, 2017
Solutions to Homework Set #5

3.18 Bounds on the quadratic rate distortion function. Recall that

    R(D) = inf_{F(x̂|x): E[(X − X̂)^2] ≤ D} I(X; X̂).

Let X̂ = (1 − D/P)(X + Z), where Z ~ N(0, PD/(P − D)) is independent of X. It is straightforward to check that for this choice of F(x̂|x), E[(X − X̂)^2] ≤ D, and therefore I(X; X̂) is an upper bound on R(D). But

    I(X; X̂) = h(X̂) − h(X̂ | X)
             = h(X̂) − (1/2) log(2πe (1 − D/P)^2 PD/(P − D))
             ≤ (1/2) log(2πe E(X̂^2)) − (1/2) log(2πe (1 − D/P)^2 PD/(P − D))
             ≤ (1/2) log(2πe (1 − D/P)^2 (P + PD/(P − D))) − (1/2) log(2πe (1 − D/P)^2 PD/(P − D))
             = (1/2) log(P/D).

Thus, R(D) ≤ (1/2) log(P/D). (Note that by the discretization method, any rate R > (1/2) log(P/D) is achievable.)

For the lower bound, consider

    R(D) = min_{F(x̂|x): E[(X − X̂)^2] ≤ D} [h(X) − h(X | X̂)]
         = h(X) − max h(X | X̂)
         = h(X) − max h(X − X̂ | X̂)
         ≥ h(X) − max h(X − X̂)
         ≥ h(X) − (1/2) log(2πeD),

where the maxima are over the same set of conditional distributions F(x̂|x) and the last inequality follows since E[(X − X̂)^2] ≤ D and the Gaussian distribution maximizes differential entropy under a second-moment constraint. Since the upper bound equals the rate distortion function of the Gaussian source of power P, this suggests that among all sources of the same power, the Gaussian source is the hardest to compress.

3.19 Lossy source coding from a noisy observation. Since the encoder has access only to Y^n, the distortion becomes E[d(X^n, X̂^n(m(Y^n)))]. This distortion can be rewritten as follows:

    E[d(X^n, X̂^n(m(Y^n)))] = (1/n) Σ_{i=1}^n E[d(X_i, x̂_i(m(Y^n)))]
        (a) = Σ_{y^n} p(y^n) (1/n) Σ_{i=1}^n E[d(X_i, x̂_i(m(y^n))) | Y^n = y^n]
        (b) = Σ_{y^n} p(y^n) (1/n) Σ_{i=1}^n E[d(X_i, x̂_i(m(y^n))) | Y_i = y_i]
        (c) = Σ_{y^n} p(y^n) (1/n) Σ_{i=1}^n d'(y_i, x̂_i(m(y^n)))
            = (1/n) Σ_{i=1}^n E[d'(Y_i, x̂_i(m(Y^n)))]
            = E[d'(Y^n, X̂^n(m(Y^n)))],

where (a) follows from the law of iterated expectations, (b) follows since x̂_i(m(y^n)) is a fixed function of y^n and X_i depends on Y^n = y^n only through Y_i = y_i, and (c) follows from the definition of d'(y, x̂) = E[d(X, x̂) | Y = y] in the hint. Hence, this lossy source coding problem is equivalent to one with source Y and distortion measure d'. Therefore, the rate distortion function is given by

    R(D) = min_{p(x̂|y): E[d'(Y, X̂)] ≤ D} I(Y; X̂)
         = min_{p(x̂|y): E_Y[E(d(X, X̂) | Y)] ≤ D} I(Y; X̂)
         = min_{p(x̂|y): E[d(X, X̂)] ≤ D} I(Y; X̂).
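
To make the reduction in Problem 3.19 concrete, the short Python sketch below computes the induced distortion measure d'(y, x̂) = E[d(X, x̂) | Y = y] for an assumed toy setup (a Bern(1/2) source observed through a BSC(0.1) under Hamming distortion). The parameters are illustrative only and not part of the problem.

```python
import numpy as np

# Assumed toy setup (illustration only): X ~ Bern(1/2) observed through a BSC(0.1),
# with Hamming distortion d(x, xhat) = 1{x != xhat}.
p_x = np.array([0.5, 0.5])                    # source pmf p(x)
p_y_given_x = np.array([[0.9, 0.1],           # observation channel p(y|x)
                        [0.1, 0.9]])
d = np.array([[0.0, 1.0],                     # distortion d(x, xhat)
              [1.0, 0.0]])

p_xy = p_x[:, None] * p_y_given_x             # joint p(x, y)
p_y = p_xy.sum(axis=0)                        # marginal p(y)
p_x_given_y = p_xy / p_y[None, :]             # posterior p(x|y)

# Induced distortion measure d'(y, xhat) = sum_x p(x|y) d(x, xhat), as in the hint.
d_prime = p_x_given_y.T @ d                   # rows indexed by y, columns by xhat
print(d_prime)
# Here d'(y, xhat) = 0.1 if xhat = y and 0.9 otherwise, so the original problem
# reduces to compressing the observation Y under this new distortion measure.
```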

11.6 Side information with occasional erasures. We show that

    R_SI-D(D) = R_SI-ED(D) = p R(D/p) = p (1 − H(D/p))   for 0 ≤ D/p ≤ 1/2,

and zero otherwise, where R(D) = 1 − H(D) is the rate distortion function of the Bern(1/2) source without side information and H(·) is the binary entropy function.

For the proof of the converse, consider

    R_SI-ED(D) = min_{p(x̂|x,y): E[d(X, X̂)] ≤ D} I(X; X̂ | Y),

and note that for every p(x̂|x,y) such that

    D ≥ E[d(X, X̂)] ≥ p E[d(X, X̂) | Y = e],

we must have

    I(X; X̂ | Y) = p I(X; X̂ | Y = e) ≥ p R(D/p),

where the equality holds since I(X; X̂ | Y = y) = 0 whenever y ≠ e (then X = Y is known), and the inequality follows by noting that X | {Y = e} ~ p(x) and that p_{X̂|X,Y}(x̂|x, e) satisfies E[d(X, X̂) | Y = e] ≤ D/p, so that I(X; X̂ | Y = e) ≥ R(D/p). Thus, R_SI-ED(D) ≥ p R(D/p).

For achievability, consider

    R_SI-D(D) = min_{p(u|x), x̂(u,y): E[d(X, X̂)] ≤ D} I(X; U | Y),

set p(u|x) to attain the minimum in

    R_X(D/p) = min_{p(u|x): E[d(X, U)] ≤ D/p} I(X; U),

and choose x̂(u, y) such that

    X̂ = U   if Y = e,
    X̂ = Y   otherwise.

Then

    I(X; U | Y) = p I(X; U | Y = e) = p R(D/p)

and

    E[d(X, X̂)] = p E[d(X, X̂) | Y = e] + (1 − p) E[d(X, X̂) | Y = X] = p · (D/p) = D.

Thus, R_SI-D(D) ≤ p R(D/p). Note that this argument, due to Perron, Diggavi, and Telatar (2007), applies to any DMS X ~ p(x) with the erasure side information

    Y = X   with probability 1 − p,
    Y = e   with probability p.
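
As a quick numerical illustration (the value of p below is assumed, not from the problem), the sketch evaluates R_SI-D(D) = p(1 − H(D/p)) for a Bern(1/2) source with Hamming distortion and erasure probability p = 0.3, confirming that the required rate decreases from p bits at D = 0 to zero at D = p/2.

```python
import numpy as np

def binary_entropy(q):
    """H(q) in bits, with the convention 0 log 0 = 0."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def rate_erasure_side_info(D, p):
    """R_SI-D(D) = R_SI-ED(D) = p (1 - H(D/p)) for a Bern(1/2) source, Hamming
    distortion, and side information erased with probability p (0 <= D/p <= 1/2)."""
    ratio = D / p
    return p * (1 - binary_entropy(ratio)) if ratio <= 0.5 else 0.0

p = 0.3  # assumed erasure probability (illustration only)
for D in [0.0, 0.05, 0.10, 0.15, 0.20]:
    print(f"D = {D:.2f}: R = {rate_erasure_side_info(D, p):.4f} bits")
# The rate decreases from p = 0.3 bits at D = 0 down to 0 at D = p/2 = 0.15.
```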

15.7 Triangular cyclic network.

(a) The cutset bound is the set of rate triples (R_1, R_2, R_3) such that

    R_1 ≤ 1,   R_2 ≤ 1,   R_3 ≤ 1.

(b) Achievability follows simply by routing. For the proof of the converse, consider any sequence of codes with lim_{n→∞} P_e^(n) = 0. Since, by assumption, M_3 can be recovered from M_12 and M_2 with high probability at node 2, M_3 can also be recovered at node 1 with high probability. By applying similar arguments to the other nodes, it can be shown that the capacity region is the same as that of the new network depicted in Figure 1. Now the cutset bound for the new network is the set of rate triples (R_1, R_2, R_3) such that

    R_1 + R_2 ≤ 1,   R_2 + R_3 ≤ 1,   R_3 + R_1 ≤ 1.

This completes the proof of the converse.

[Figure 1: Equivalent triangular cyclic network.]
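
As a small sanity check (not part of the solution, and with arbitrarily chosen rate triples), the following sketch tests points against the cut inequalities of the equivalent network in Figure 1.

```python
def satisfies_cut_inequalities(R1, R2, R3):
    """Cutset region of the equivalent network in Figure 1:
    R1 + R2 <= 1, R2 + R3 <= 1, R3 + R1 <= 1."""
    return R1 + R2 <= 1 and R2 + R3 <= 1 and R3 + R1 <= 1

print(satisfies_cut_inequalities(1.0, 1.0, 1.0))    # False: allowed by the per-link
                                                    # bound in part (a), ruled out here
print(satisfies_cut_inequalities(0.5, 0.5, 0.5))    # True: satisfies all three cuts
```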

16.13 Properties of the Gaussian relay channel capacity. The first property follows from the fact that C(P) is lower bounded by the capacity of the direct channel, (1/2) log(1 + g_31^2 P), which is strictly greater than zero for P > 0 and tends to infinity as P → ∞. The second property follows from the fact that the cutset bound, which is greater than or equal to C(P), tends to zero as P → 0. To prove the third property (concavity), use the following time-sharing argument. For any blocklength n, let k = αn and k' = n − k. For any P, P' > 0 and ε > 0, there exists n such that C(P) < C^(k)(P) + ε and C(P') < C^(k')(P') + ε. Consider a distribution F(x_1^k) and relay functions {x_{2i}}_{i=1}^k that satisfy the power constraint P and attain C^(k)(P), and a distribution F'(x_1^{k'}) and relay functions {x'_{2i}}_{i=1}^{k'} that satisfy the power constraint P' and attain C^(k')(P'). Now the distribution F(x_1^k) F'(x_{k+1}^n) and the relay functions ({x_{2i}}_{i=1}^k, {x'_{2i}}_{i=k+1}^n) satisfy the power constraint αP + ᾱP' and attain αC^(k)(P) + ᾱC^(k')(P'). Hence,

    αC(P) + ᾱC(P') ≤ αC^(k)(P) + ᾱC^(k')(P') + ε ≤ C^(n)(αP + ᾱP') + ε ≤ C(αP + ᾱP') + ε.

Taking ε → 0 establishes the concavity. (Note that C^(k)(P) may not be concave for fixed k.) Finally, that C(P) is strictly monotonically increasing in P follows from the first two parts and concavity. This completes the proofs of the properties of C(P).

17.11 Directed information.

(a) Consider

    p(y^n, x^n) = Π_{i=1}^n p(y_i, x_i | y^{i−1}, x^{i−1})
                = Π_{i=1}^n p(x_i | y^{i−1}, x^{i−1}) p(y_i | y^{i−1}, x^i)
                = p(x^n ∥ y^{n−1}) p(y^n ∥ x^n).

(b) Since

    I(X^n → Y^n) = Σ_{i=1}^n I(X^i; Y_i | Y^{i−1})

and each term in the summation is nonnegative, so is I(X^n → Y^n). Moreover, it is equal to zero iff each term is equal to zero, or equivalently, p(y_i | y^{i−1}, x^i) = p(y_i | y^{i−1}) for all i ∈ [1 : n]. In the causal conditioning notation, this can be rewritten as p(y^n ∥ x^n) = p(y^n).

(c) Consider

    I(X^n; Y^n) = E[ log ( p(Y^n, X^n) / (p(Y^n) p(X^n)) ) ]
                = E[ log ( p(Y^n ∥ X^n) p(X^n ∥ Y^{n−1}) / (p(Y^n) p(X^n)) ) ]
                = E[ log ( p(Y^n ∥ X^n) / p(Y^n) ) ] + E[ log ( p(X^n ∥ Y^{n−1}) / p(X^n) ) ]
                = I(X^n → Y^n) + I((Y_0, Y^{n−1}) → X^n),

where Y_0 is treated as a constant.

(d) The inequality I(X^n → Y^n) ≤ I(X^n; Y^n) follows immediately from part (c). By an argument similar to part (b), equality holds iff p(x^n ∥ y^{n−1}) = p(x^n), i.e., iff the input sequence does not depend on past outputs (no feedback).
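
The following Python sketch (an illustrative check, with an assumed BSC(0.1) and a simple feedback rule X_2 = Y_1) computes I(X^2 → Y^2) and I(X^2; Y^2) by brute-force enumeration for n = 2, consistent with parts (b)–(d): the two quantities coincide without feedback, and the directed information is strictly smaller with feedback.

```python
import itertools
import numpy as np

q = 0.1  # assumed BSC crossover probability (illustration only)

def joint_pmf(feedback):
    """Joint pmf of (X1, X2, Y1, Y2) for two uses of a BSC(q).
    X1 ~ Bern(1/2); X2 = Y1 if feedback, else X2 ~ Bern(1/2) independent of the rest."""
    pmf = {}
    for x1, z1, x2f, z2 in itertools.product([0, 1], repeat=4):
        y1 = x1 ^ z1
        x2 = y1 if feedback else x2f
        y2 = x2 ^ z2
        prob = 0.5 * (q if z1 else 1 - q) * 0.5 * (q if z2 else 1 - q)
        pmf[(x1, x2, y1, y2)] = pmf.get((x1, x2, y1, y2), 0.0) + prob
    return pmf

def mi(pmf, idx_a, idx_b):
    """I(A; B) in bits, where A and B are tuples of coordinate indices into the pmf."""
    pa, pb, pab = {}, {}, {}
    for outcome, prob in pmf.items():
        a = tuple(outcome[i] for i in idx_a)
        b = tuple(outcome[i] for i in idx_b)
        pa[a] = pa.get(a, 0.0) + prob
        pb[b] = pb.get(b, 0.0) + prob
        pab[(a, b)] = pab.get((a, b), 0.0) + prob
    return sum(p * np.log2(p / (pa[a] * pb[b])) for (a, b), p in pab.items() if p > 0)

def cmi(pmf, idx_a, idx_b, idx_c):
    """I(A; B | C) = I(A; B, C) - I(A; C)."""
    return mi(pmf, idx_a, idx_b + idx_c) - mi(pmf, idx_a, idx_c)

# Coordinates: 0 -> X1, 1 -> X2, 2 -> Y1, 3 -> Y2.
for feedback in (False, True):
    p = joint_pmf(feedback)
    i_xy = mi(p, (0, 1), (2, 3))                              # I(X^2; Y^2)
    i_dir = mi(p, (0,), (2,)) + cmi(p, (0, 1), (3,), (2,))    # I(X^2 -> Y^2)
    print(f"feedback={feedback}: directed {i_dir:.4f} bits, mutual {i_xy:.4f} bits")
# Without feedback both equal 2(1 - H(0.1)); with feedback,
# I(X^2 -> Y^2) = 1 - H(0.1) is strictly less than I(X^2; Y^2) = 1.
```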

17.14 Gaussian two-way channel. Assume without loss of generality that the noise covariance matrix is

    K = [ 1  ρ
          ρ  1 ].

By the maximum differential entropy lemma, the outer bound on the capacity region in Proposition 17.3 simplifies to

    R_1 ≤ I(X_1; Y_2 | X_2) ≤ (1/2) log(1 + g_12^2 Var(X_1 | X_2)) ≤ (1/2) log(1 + g_12^2 P),
    R_2 ≤ I(X_2; Y_1 | X_1) ≤ (1/2) log(1 + g_21^2 Var(X_2 | X_1)) ≤ (1/2) log(1 + g_21^2 P),     (1)

and the right-hand sides are attained by X_1 ~ N(0, P) and X_2 ~ N(0, P), independent of each other. By choosing the same Gaussian (X_1, X_2) and Q = ∅ in the inner bound in Proposition 17.2 and applying the standard discretization argument, the rectangular rate region in (1) is achievable. Note that both point-to-point capacities are achieved simultaneously and that the noise correlation ρ is irrelevant.
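
For a concrete feel (the power and gain values below are assumed purely for illustration), the corner point of the rectangular region in (1) can be evaluated directly:

```python
import numpy as np

def gaussian_capacity(snr):
    """C(x) = (1/2) log2(1 + x), in bits per transmission."""
    return 0.5 * np.log2(1 + snr)

# Assumed values for illustration: symmetric power P and channel gains g12, g21.
P, g12, g21 = 10.0, 1.0, 0.5
R1 = gaussian_capacity(g12**2 * P)   # corner rate (1/2) log(1 + g12^2 P)
R2 = gaussian_capacity(g21**2 * P)   # corner rate (1/2) log(1 + g21^2 P)
print(f"R1 up to {R1:.3f} bits, R2 up to {R2:.3f} bits; independent of rho")
```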

17.15 Common-message feedback capacity of broadcast channels. The common-message feedback capacity is

    C_F = max_{p(x)} min{ I(X; Y_1), I(X; Y_2) }.

Achievability follows from the coding scheme for the case without feedback. For the converse, consider

    nR = H(M)
       ≤ I(M; Y_1^n) + nε_n
       = Σ_{i=1}^n I(M; Y_{1i} | Y_1^{i−1}) + nε_n
       ≤ Σ_{i=1}^n I(M, Y_1^{i−1}; Y_{1i}) + nε_n
       ≤ Σ_{i=1}^n I(M, Y_1^{i−1}, Y_2^{i−1}; Y_{1i}) + nε_n
   (a) = Σ_{i=1}^n I(M, Y_1^{i−1}, Y_2^{i−1}, X_i; Y_{1i}) + nε_n
   (b) = Σ_{i=1}^n I(X_i; Y_{1i}) + nε_n,

where (a) follows since X_i is a function of (M, Y_1^{i−1}, Y_2^{i−1}) and (b) follows since I(M, Y_1^{i−1}, Y_2^{i−1}; Y_{1i} | X_i) = 0 for all i by the memorylessness of the channel. Following the usual procedure of introducing a time-sharing random variable, we obtain R ≤ I(X; Y_1) for some pmf p(x). Similarly, R ≤ I(X; Y_2). Thus, R ≤ min{ I(X; Y_1), I(X; Y_2) } for some pmf p(x).

18.7 Broadcasting over a diamond network. The capacity is

    C = max_{p(x_1), p(x_2, x_3)} min{ I(X_1; Y_2), I(X_1; Y_3), I(X_2, X_3; Y_4) }.

For achievability, consider a two-hop relaying scheme in which node 1 communicates with nodes 2 and 3 using common-message broadcasting at rate R_1 < min{ I(X_1; Y_2), I(X_1; Y_3) }, and nodes 2 and 3, in turn, communicate with node 4 using cooperative multiple access at rate R_2 < I(X_2, X_3; Y_4). The minimum of R_1 and R_2 is achievable by two-hop relaying over multiple blocks.

For the proof of the converse, consider the cutset bound

    C ≤ max_{p(x_1, x_2, x_3)} min_{S: 1 ∈ S, S ⊊ {1,2,3,4}} I(X(S); Y(S^c) | X(S^c))

evaluated at S = {1,3,4}, S = {1,2,4}, and S = {1,2,3}. The corresponding mutual information terms can be upper bounded as

    C ≤ I(X_1, X_3; Y_2 | X_2) ≤ I(X_1, X_2, X_3; Y_2) = I(X_1; Y_2),
    C ≤ I(X_1, X_2; Y_3 | X_3) ≤ I(X_1, X_2, X_3; Y_3) = I(X_1; Y_3),
    C ≤ I(X_1, X_2, X_3; Y_4) = I(X_2, X_3; Y_4),

where the equalities follow since Y_2 and Y_3 depend only on X_1, and Y_4 depends only on (X_2, X_3). Noting that these bounds depend on p(x_1, x_2, x_3) only through the marginals p(x_1) and p(x_2, x_3) completes the proof.
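
As a hypothetical numerical example (the channels and input pmfs below are assumptions, not part of the problem), the sketch evaluates the three mutual information terms in the capacity expression for a binary diamond network whose first hop consists of independent BSCs and whose second hop is a noiseless binary adder MAC, with the input pmfs fixed rather than optimized.

```python
import numpy as np

def mutual_information(p_x, p_y_given_x):
    """I(X; Y) in bits for an input pmf p_x and a channel matrix p_y_given_x[x, y]."""
    p_xy = p_x[:, None] * p_y_given_x
    p_y = p_xy.sum(axis=0)
    denom = p_x[:, None] * p_y[None, :]
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / denom[mask])))

def bsc(eps):
    """Transition matrix of a binary symmetric channel with crossover eps."""
    return np.array([[1 - eps, eps], [eps, 1 - eps]])

# First hop (assumed): X1 ~ Bern(1/2) broadcast over independent BSC(0.1) and BSC(0.2).
p_x1 = np.array([0.5, 0.5])
I_12 = mutual_information(p_x1, bsc(0.1))   # I(X1; Y2)
I_13 = mutual_information(p_x1, bsc(0.2))   # I(X1; Y3)

# Second hop (assumed): noiseless adder MAC Y4 = X2 + X3 with (X2, X3) i.i.d. Bern(1/2).
# The pair (X2, X3) is treated as a single input with values (0,0), (0,1), (1,0), (1,1).
p_x23 = np.full(4, 0.25)
p_y4_given_x23 = np.array([[1, 0, 0],                 # X2 + X3 = 0
                           [0, 1, 0],                 # X2 + X3 = 1
                           [0, 1, 0],                 # X2 + X3 = 1
                           [0, 0, 1]], dtype=float)   # X2 + X3 = 2
I_234 = mutual_information(p_x23, p_y4_given_x23)     # I(X2, X3; Y4)

print(f"I(X1;Y2)={I_12:.3f}, I(X1;Y3)={I_13:.3f}, I(X2,X3;Y4)={I_234:.3f}")
print(f"two-hop rate for these fixed inputs: {min(I_12, I_13, I_234):.3f} bits")
```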