Bounds on the Effective-length of Optimal Codes for Interference Channel with Feedback


Mohsen Heidari, EECS Department, University of Michigan, Ann Arbor, USA. Email: mohsenhd@umich.edu
Farhad Shirani, ECE Department, New York University, New York, New York, 11201. Email: fsc265@nyu.edu
S. Sandeep Pradhan, EECS Department, University of Michigan, Ann Arbor, USA. Email: pradhanv@umich.edu

arXiv:1801.05294v1 [cs.IT] 16 Jan 2018

Abstract -- In this paper, we investigate the necessity of finite blocklength codes in distributed transmission of independent message sets over channels with feedback. Previously, it was shown that finite effective length codes are necessary in distributed transmission and compression of sources. We provide two examples of three-user interference channels with feedback where codes with asymptotically large effective lengths are suboptimal. As a result, we conclude that coded transmission using finite effective length codes is necessary to achieve optimality. We argue that the sub-optimal performance of large effective length codes is due to their inefficiency in preserving the correlation between the inputs to the distributed terminals in the communication system. This correlation is made available by the presence of feedback at the terminals and is used as a means for coordination between the terminals when using finite effective length coding strategies.

I. INTRODUCTION

Most of the coding strategies developed in information theory are based on random code ensembles which are constructed using independent and identically distributed (IID) sequences of random variables [1]-[4]. The codes associated with different terminals in the network are mutually independent. Moreover, the blocklengths associated with these codes are asymptotically large. This allows the application of the laws of large numbers and concentration of measure theorems when analyzing the performance of coding strategies, and it leads to characterizations of their achievable regions in terms of information quantities that are functionals of the underlying distribution used to construct the codes. These characterizations are often called single-letter characterizations. Although the original problem is to optimize the performance of codes with asymptotically large blocklengths, the solution is characterized by a functional (such as mutual information) of just one realization of the source or the channel under consideration.

It is well known that unstructured random codes with asymptotically large blocklength can be used to achieve optimality in terms of achievable rates in point-to-point communications. In fact, it can be shown that large blocklength codes are necessary to approach optimal performance. At a high level, this is due to the fact that the efficiency of fundamental tasks of communication, such as covering and packing, increases as the input dimension is increased [5]. In network communication, one needs to (a) remove redundancy among correlated information sources in a distributed manner in source coding problems [2], [4], and (b) induce redundancy among distributed terminals to facilitate cooperation among them [1], [3]. For example, in network source coding problems such as distributed source coding and multiple description coding, the objective is to exploit the statistical correlation of the distributed information sources. Similarly, in network channel coding problems, such as the interference channel and the broadcast channel, correlation of information among different terminals is induced for better cooperation among them.
At a high level, in addition to the basic objectives of efficient packing and covering at every terminal, network coding strategies need to exploit statistical correlation among distributed information sources or induce statistical correlation among the information accessed by the terminals in the network. Witsenhausen [6] and Gacs-Korner [7] made the observation that distributed processing of pairs of sequences of random variables leads to outputs which are less correlated than the original input sequences. In the network communications context, this implies that the outputs of encoding functions at different terminals in a network are less correlated with each other than the original input sequences. In [8], [9], we built upon these observations and showed that the correlation between the outputs of pairs of encoding functions operating over correlated sequences is inversely proportional to the effective length of the encoding functions. Based on these results, it can be concluded that while random unstructured coding strategies with asymptotically large blocklengths are efficient in performing the tasks of covering and packing, they are inefficient in facilitating coordination between different terminals. Using these results, we showed that finite effective length codes are necessary to achieve optimality in various setups involving the transmission of correlated sources. In particular, we showed that the effective length of optimality-achieving codes is bounded from above in the distributed source coding problem as well as in the problem of transmission of correlated sources over the multiple access channel (MAC) and the interference channel (IC) [8], [10].

So far, all of the results showing the necessity of finite effective length codes pertain to situations involving the distributed transmission of sources over channels and distributed compression of sources. However, the question of whether such codes are necessary in multi-terminal channel coding has remained open. The reason is that the application of the results in [8], [9] requires the presence of correlated inputs at different terminals of the network. In the case of distributed processing of sources, such correlation is readily available in the form of the distributed source, whereas in distributed transmission of independent messages it is unclear how such a correlation can be created and exploited. In this work, we argue that in channel coding with feedback, correlation is induced because of the feedback link. More precisely, the feedback sequence at one terminal is correlated with the message set at the other terminal. In order to exploit this correlation efficiently, finite effective length codes are necessary.

The contributions of this paper can be summarized as follows. We provide two examples of interference channels with feedback where finite effective length codes are necessary to approach optimality. For each of these examples, we provide an outer bound on the achievable region as a function of the effective length of the encoding functions used at the transmitters. Furthermore, we use finite effective length codes to prove the achievability of certain rate vectors which lie outside of the outer bound when the effective length is large. The combination of these two results shows that in these examples any coding strategy which uses encoding functions with asymptotically large effective lengths is sub-optimal.

The rest of the paper is organized as follows: In Section II we introduce the problem formulation. Section III provides the prior results which are used in this paper. Section IV explains our main results. Finally, Section V concludes the paper.

II. DEFINITIONS AND MODEL

A. Notations

Random variables are denoted using capital letters such as X, Y. The random vector (X_1, X_2, ..., X_n) is represented by X^n. Similarly, we use underlined letters to denote vectors of numbers and functions. For shorthand, vectors are sometimes represented using underlined letters without any superscript, such as X, f, and a. Calligraphic letters such as C and M are used to represent sets.

B. Model

The problem of the Interference Channel with Feedback (IC-FB) is studied in [11] and [12]. A three-user interference channel with generalized feedback (IC-FB) is characterized by three input alphabets (X_1, X_2, X_3), three output alphabets (Y_1, Y_2, Y_3), three feedback alphabets (Z_1, Z_2, Z_3), and transition probability distributions (Q_{Y|X}, P_{Z|Y}). We assume that all the alphabets are finite and that the channel is memoryless. Let x_i^n, y_i^n, z_i^n, i ∈ [1,3], be the channel inputs, outputs and channel feedback after n uses of the channel, respectively. The memoryless property implies that:

p(y_{j,n}, z_{j,n}, j ∈ [1,3] \mid y_i^{n-1}, z_i^{n-1}, x_i^n, i ∈ [1,3]) = Q_{Y|X}(y_{1,n}, y_{2,n}, y_{3,n} \mid x_{1,n}, x_{2,n}, x_{3,n}) \, P_{Z|Y}(z_{1,n}, z_{2,n}, z_{3,n} \mid y_{1,n}, y_{2,n}, y_{3,n}).

[Fig. 1. An instance of the three-user IC with generalized feedback. Here transmitters 1 and 3 receive noisy feedback, whereas transmitter 2 does not receive feedback.]

In the three-user IC-FB, there are three transmitters and three receivers. The i-th transmitter, i ∈ [1,3], intends to transmit the message index W_i to the i-th receiver. It is also assumed that the feedback Z_i, i ∈ [1,3], is causally available at transmitter i with one unit of delay. An example of such a setup is depicted in Figure 1. In this figure, Z_2 is trivial, and P_{Z_1,Z_3|Y_1,Y_2,Y_3} = P_{Z_1|Y_1} P_{Z_3|Y_3} (i.e., the second transmitter does not receive any feedback). Let M_1, M_2, M_3 and N be arbitrary positive integers.

Definition 1. An (M_1, M_2, M_3, N) feedback-block-code for the three-user IC-FB consists of:
- three sets of messages M_i = {1, 2, ..., M_i}, i ∈ [1,3];
- three sequences of encoding functions f_{i,n} : M_i × Z_i^{n-1} → X_i, 1 ≤ n ≤ N, i ∈ [1,3];
- three decoding functions g_i : Y_i^N → M_i, i ∈ [1,3].

The message for transmitter i is denoted by a random variable W_i. It is assumed that the messages W_i are mutually independent and uniformly distributed on M_i, i ∈ [1,3]. The output of the i-th transmitter at the n-th use of the channel is denoted by X_{i,n} = f_{i,n}(W_i, Z_i^{n-1}). The rate-triple of an (M_1, M_2, M_3, N) code is defined as R_i = (\log M_i)/N, i ∈ [1,3]. Let Ŵ_i, i ∈ [1,3], be the decoded message at receiver i. Then, the probability of error is defined as P_e := P((W_1, W_2, W_3) ≠ (Ŵ_1, Ŵ_2, Ŵ_3)).
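To make the causal structure of Definition 1 concrete, the following is a minimal simulation sketch of one block of a feedback-block-code. It is illustrative only: the encoder, decoder and channel maps (encoders, decoders, channel) are hypothetical placeholders and not the codes constructed in this paper; the only features taken from the definition are the one-unit feedback delay X_{i,n} = f_{i,n}(W_i, Z_i^{n-1}) and the rate computation R_i = (log M_i)/N.

```python
import math
import random

def run_block(encoders, decoders, channel, M, N, seed=0):
    """Simulate one block of an (M1, M2, M3, N) feedback-block-code (Definition 1).

    encoders[i](w, z_hist, n) plays the role of f_{i,n}(W_i, Z_i^{n-1});
    channel(x1, x2, x3) returns ((y1, y2, y3), (z1, z2, z3)) for one memoryless use;
    decoders[i](y_hist) plays the role of g_i(Y_i^N).
    """
    rng = random.Random(seed)
    W = [rng.randrange(M[i]) for i in range(3)]        # independent, uniform messages
    z_hist = [[], [], []]                              # causal feedback Z_i^{n-1}
    y_hist = [[], [], []]
    for n in range(N):
        x = [encoders[i](W[i], z_hist[i], n) for i in range(3)]
        y, z = channel(*x)
        for i in range(3):
            y_hist[i].append(y[i])
            z_hist[i].append(z[i])                     # available from time n+1 onward
    W_hat = [decoders[i](y_hist[i]) for i in range(3)]
    rates = [math.log2(M[i]) / N for i in range(3)]    # R_i = log M_i / N
    return W, W_hat, rates

# Hypothetical placeholder maps: binary "encoders" that ignore the feedback,
# majority-vote "decoders", and a noiseless channel with trivial feedback at user 2.
enc = [lambda w, z, n: w & 1] * 3
dec = [lambda ys: max(set(ys), key=ys.count)] * 3
chn = lambda x1, x2, x3: ((x1, x2, x3), (x1, 0, x3))
print(run_block(enc, dec, chn, M=[2, 2, 2], N=8))
```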

For this problem, at time n, each transmitter can choose an encoding function randomly, using a probability measure defined over the set of all encoding functions f_i^N described in Definition 1. The following defines a randomized coding strategy.

Definition 2. An (M_1, M_2, M_3, N)-randomized coding strategy is characterized by a probability measure P^N on the set of all functions (f_i^N), i ∈ [1,3], described in Definition 1.

Next, we define the achievable region for the three-user IC-FB.

Definition 3. For ε > 0, a rate-triple (R_1, R_2, R_3) is said to be ε-achievable by a feedback-block-code with parameters (M_1, M_2, M_3, N) if the following conditions are satisfied: P_e ≤ ε and (1/N) \log M_i ≥ R_i - ε, i ∈ [1,3].

Definition 4. For ε > 0, a rate-triple (R_1, R_2, R_3) is said to be ε-achievable by an (M_1, M_2, M_3, N)-randomized coding strategy with probability measure P if, with probability one with respect to P, there exists a feedback-block-code for which (R_1, R_2, R_3) is ε-achievable.

Definition 5. For ε > 0, a rate-triple (R_1, R_2, R_3) is said to be ε-achievable if there exists an (M_1, M_2, M_3, N) feedback-block-code (randomized coding strategy) for which (R_1, R_2, R_3) is ε-achievable.

Definition 6. A rate-triple (R_1, R_2, R_3) is said to be achievable if it is ε-achievable for any ε > 0. Given an IC-FB, the set of all achievable rate-triples is called the feedback-capacity region.

III. BACKGROUND AND PRIOR RESULTS

In this section, we summarize the results in [8] on the correlation between the outputs of Boolean functions of pairs of sequences of random variables. These results are used in the next section to prove the necessity of finite effective length codes.

Definition 7. (X, Y) is called a pair of DMSs if we have
P_{X^n,Y^n}(x^n, y^n) = \prod_{i ∈ [1,n]} P_{X_i,Y_i}(x_i, y_i), for all n ∈ N, x^n ∈ X^n, y^n ∈ Y^n,
where P_{X_i,Y_i} = P_{X,Y}, i ∈ [1,n], for some joint distribution P_{X,Y}.

Definition 8. A Binary-Block-Encoder (BBE) is characterized by the triple (e, X, n), where e is a mapping e : X^n → {0,1}^n, X is a finite set, and n is an integer.

Definition 9. For a BBE (e, X, n) and DMS X, let P(e_i(X^n) = 1) = q_i. For each Boolean function e_i, i ∈ [1,n], the real-valued function corresponding to e_i is defined as follows:
\tilde{e}_i(X^n) = 1 - q_i if e_i(X^n) = 1, and \tilde{e}_i(X^n) = -q_i otherwise.    (1)

Definition 10. For a BBE (e, X, n), define the decomposition \tilde{e} = \sum_{\mathbf{i} ∈ \{0,1\}^n} \tilde{e}_{\mathbf{i}}, where \tilde{e}_{\mathbf{i}} = E_{X^n \setminus X_{\mathbf{i}}}\{\tilde{e} \mid X_{\mathbf{i}}\} - \sum_{\mathbf{j} < \mathbf{i}} \tilde{e}_{\mathbf{j}}, with X_{\mathbf{i}} := (X_j : i_j = 1) and \mathbf{j} < \mathbf{i} the componentwise partial order. Then, \tilde{e}_{\mathbf{i}} is the component of \tilde{e} which is only a function of {X_j : i_j = 1}. The collection {\tilde{e}_{\mathbf{i}} : \sum_{j ∈ [1,n]} i_j = k} is called the set of k-letter components of \tilde{e}.

Definition 11. For a function e : X^n → {0,1} with real decomposition vector (\tilde{e}_{\mathbf{i}})_{\mathbf{i} ∈ \{0,1\}^n}, the dependency spectrum is defined as the vector (P_{\mathbf{i}})_{\mathbf{i} ∈ \{0,1\}^n} of variances, where P_{\mathbf{i}} = Var(\tilde{e}_{\mathbf{i}}), \mathbf{i} ∈ \{0,1\}^n. The effective length is defined as the expected value
L = \frac{1}{\sum_{\mathbf{i}} P_{\mathbf{i}}} \sum_{\mathbf{i} ∈ \{0,1\}^n} w_H(\mathbf{i}) P_{\mathbf{i}},
where w_H(·) is the Hamming weight.

Lemma 1. Let ψ := \sup E[e(X) f(Y)], where the supremum is taken over all single-letter functions e : X → R and f : Y → R such that e(X) and f(Y) have unit variance and zero mean. The following bounds hold:
\frac{1}{2} - \sum_{\mathbf{i}} P_{\mathbf{i}} Q_{\mathbf{i}} - 2 \sum_{\mathbf{i}} C_{\mathbf{i}} P_{\mathbf{i}}^{1/2} Q_{\mathbf{i}}^{1/2} \leq P(e(X^n) ≠ f(Y^n)) \leq \frac{1}{2} - \sum_{\mathbf{i}} P_{\mathbf{i}} Q_{\mathbf{i}} + 2 \sum_{\mathbf{i}} C_{\mathbf{i}} P_{\mathbf{i}}^{1/2} Q_{\mathbf{i}}^{1/2},
where 1) C_{\mathbf{i}} := ψ^{N_{\mathbf{i}}}, 2) P_{\mathbf{i}} is the variance of \tilde{e}_{\mathbf{i}}, 3) \tilde{e} is the real function corresponding to e, 4) Q_{\mathbf{i}} is the variance of \tilde{f}_{\mathbf{i}}, and 5) N_{\mathbf{i}} := w_H(\mathbf{i}).

Remark 1. The value C_{\mathbf{i}} is decreasing in N_{\mathbf{i}}. So, P(e(X^n) ≠ f(Y^n)) is maximized when most of the variance P_{\mathbf{i}} is distributed on the components \tilde{e}_{\mathbf{i}} which have lower N_{\mathbf{i}} (i.e., operate on smaller blocks). This implies that encoding functions with smaller effective lengths can have higher correlation between their outputs.
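To make Definitions 9-11 and Remark 1 concrete, the following is a minimal numerical sketch of the dependency spectrum and the effective length. It assumes IID uniform binary inputs, for which the conditional-expectation decomposition of Definition 10 reduces to the standard Boolean Fourier expansion; the two example functions are illustrative and are not taken from the paper.

```python
import itertools
import numpy as np

def real_version(e, n):
    """Centered real-valued version of a Boolean function e: {0,1}^n -> {0,1}
    under the uniform (Bernoulli(1/2)) input distribution (Definition 9)."""
    xs = list(itertools.product([0, 1], repeat=n))
    vals = np.array([e(x) for x in xs], dtype=float)
    q = vals.mean()                      # q = P(e(X^n) = 1)
    return xs, vals - q                  # e_tilde = e - q

def dependency_spectrum(e, n):
    """Variances P_i of the components e_tilde_i (Definitions 10-11).
    For IID uniform bits, the decomposition coincides with the Boolean Fourier
    expansion, so P_i is the squared Fourier coefficient of the index set i."""
    xs, et = real_version(e, n)
    P = {}
    for S in itertools.product([0, 1], repeat=n):      # index vector i in {0,1}^n
        if sum(S) == 0:
            continue
        chi = np.array([(-1) ** sum(x[k] for k in range(n) if S[k]) for x in xs])
        fhat = (et * chi).mean()                       # Fourier coefficient
        P[S] = fhat ** 2                               # variance of that component
    return P

def effective_length(P):
    """Expected Hamming weight of the index, weighted by the normalized variances."""
    tot = sum(P.values())
    return sum(sum(S) * v for S, v in P.items()) / tot

n = 3
single_letter = lambda x: x[0]                 # depends on one coordinate only
parity = lambda x: x[0] ^ x[1] ^ x[2]          # depends on all coordinates
print(effective_length(dependency_spectrum(single_letter, n)))  # -> 1.0
print(effective_length(dependency_spectrum(parity, n)))         # -> 3.0
```

A map that copies a single input bit has effective length 1, while the three-bit parity, whose variance sits entirely on the full-weight component, has effective length 3; by Remark 1, it is the low-weight components that allow the outputs of two distributed encoders to remain correlated.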
IV. MAIN RESULTS

In this section, we introduce two examples of three-user IC-FBs where finite effective length codes are necessary to approach optimality.

Example 1. Consider the setup shown in Figure 2. Here, (X_11, X_12), X_2, and (X_32, X_33) are the outputs of Encoders 1, 2 and 3, respectively. The channel outputs Y_1, (Y_2', Y_2), and Y_3 are received at decoders 1, 2 and 3, respectively. The channel corresponding to the transition probability P_{Y_2'|X_12 X_32} is described by the following relation:
Y_2' = X_12 + N_δ + (X_12 + X_32) · E,
where + denotes modulo-2 addition, and N_δ and E are independent Bernoulli random variables with P(N_δ = 1) = δ and P(E = 1) = 1/2. Also, the random variables N_ε and N_p in Figure 2 are Bernoulli random variables with P(N_ε = 1) = ε and P(N_p = 1) = p, respectively. The variables N_δ, E, N_ε and N_p are mutually independent. In this setup, feedback is only available at encoders 1 and 3. The feedback at the first transmitter is Z_1 = Y_1 with probability one, and the feedback at the third transmitter is Z_3 = Y_3 with probability one. In other words, the two transmitters receive noiseless feedback.

The following theorem provides an outer bound on the achievable rates after n channel uses. The bound is given as a function of the average probability of agreement between the encoder outputs X_12 and X_32.
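The role of agreement between X_12 and X_32 in Example 1 can be checked directly. The sketch below simulates the relation Y_2' = X_12 + N_δ + (X_12 + X_32)·E given above: when the two encoder outputs agree, Y_2' behaves as the output of a BSC(δ) driven by X_12; when they disagree, Y_2' is an unbiased coin flip carrying no information. The value δ = 0.1 and the Monte Carlo estimates are illustrative only.

```python
import random

def helper_channel(x12, x32, delta=0.1):
    """One use of the channel P_{Y2'|X12,X32} of Example 1:
    Y2' = X12 xor N_delta xor (X12 xor X32)*E, with N_delta ~ Bern(delta), E ~ Bern(1/2)."""
    n_delta = int(random.random() < delta)
    e = random.randint(0, 1)
    return x12 ^ n_delta ^ ((x12 ^ x32) & e)

random.seed(0)
trials = 100000
# Agreement (X12 = X32): output differs from X12 with probability ~delta (a BSC(delta)).
agree_flips = sum(helper_channel(1, 1) != 1 for _ in range(trials)) / trials
# Disagreement (X12 != X32): output is Bern(1/2) regardless of the inputs.
disagree_flips = sum(helper_channel(1, 0) != 1 for _ in range(trials)) / trials
print(agree_flips, disagree_flips)   # ~0.1 and ~0.5
```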

[Fig. 2. The diagram of the IC-FB given in Example 1. In this setup, Z_1 is the feedback at Transmitter 1, and Z_3 is the feedback at Transmitter 3.]

Theorem 1. Any (M_1, M_2, M_3, n)-randomized coding strategy for the channel in Example 1 achieves a rate vector (R_1, R_2, R_3) satisfying the following inequalities:
R_1 \leq 1 - h_b(p),
R_2 \leq 1 - \Big( h_b(p) - (1 - h_b(δ)) \frac{1}{n} \sum_{i=1}^{n} P(X_{12,i} = X_{32,i}) \Big)^+,
R_3 \leq 1 - h_b(p * ε),
where h_b(·) is the binary entropy function, (x)^+ := \max(x, 0), and p * ε := p(1-ε) + ε(1-p) denotes binary convolution.

Proof. The proof is given in Appendix A.

Corollary 1. Define the set R as the union of all rate-triples (R_1, R_2, R_3) such that
R_1 \leq 1 - h_b(p),
R_2 \leq 1 - ( h_b(p) - (1 - h_b(δ)) )^+,
R_3 \leq 1 - h_b(p * ε).
Then, the feedback-capacity region of the channel in Example 1 is contained in R.

Corollary 2. Suppose p and δ are such that h_b(p) \leq 1 - h_b(δ), and ε = 0. Then the feedback-capacity region of Example 1 is characterized by
R_1 \leq 1 - h_b(p), R_2 \leq 1, R_3 \leq 1 - h_b(p).

Proof. The converse follows from Theorem 1. For the achievability, we use standard Shannon random codes at encoders 1 and 3. Then, the rates R_1 \leq 1 - h_b(p) and R_3 \leq 1 - h_b(p) are achievable. Because of the feedback links Z_1 and Z_3, the noise N_p is available at encoders 1 and 3. Transmitters 1 and 3 send N_p to receiver 2, and we require N_p to be decoded at receiver 2 losslessly. Consider a good source-channel code for transmission of N_p over a binary symmetric channel with noise bias δ. We use this codebook both at encoder 1 and at encoder 3. Since the source N_p and the codebook are available at both encoders, X_12 = X_32 with probability one. As a result, the channel P_{Y_2'|X_12 X_32} becomes a binary symmetric channel with bias δ. Therefore, since h_b(p) \leq 1 - h_b(δ), the sequence N_p is reconstructed at receiver 2 without any noise. By subtracting N_p from Y_2, the channel from X_2 to Y_2 becomes a noiseless channel. Thus, R_2 = 1 is achievable.

Lemma 2. Let C_ε denote the feedback-capacity region of the IC-FB in Example 1. For any (R_1, R_2, R_3) ∈ C_0, there exists a continuous function ζ(ε) such that for sufficiently small ε > 0 the rate-triple (R_1 - ζ(ε), R_2 - ζ(ε), R_3 - ζ(ε)) ∈ C_ε, where ζ(ε) → 0 as ε → 0.

Proof. The proof is given in Appendix B.

Theorem 2. There exist γ > 0 and ε > 0 such that for any coding strategy achieving the rate-triple (1 - h_b(p), 1 - γ, 1 - h_b(p)), the effective lengths of the encoding functions producing X_12 and X_32 are bounded from above by a constant. Furthermore, the effective length is greater than 1 (i.e., uncoded transmission is not optimal).

Proof outline. From Theorem 1, the following upper bound holds for R_2:
R_2 \leq 1 - h_b(p) \Big( 1 - \frac{1}{n} \sum_{i=1}^{n} P(X_{12,i} = X_{32,i}) \Big).
Therefore, it is required that
\frac{1}{N} \sum_{i=1}^{N} P(X_{12,i} = X_{32,i}) \approx 1.
This implies that P(X_{12,n} = X_{32,n}) \approx 1 for all n \leq N. However, by Lemma 1, this requires that the effective length be bounded from above. If the effective length is equal to 1, then for each n \leq N essentially all of the variance of the encoding function lies on its single-letter components; this implies that P(f_{1,n}(Z_1^{n-1}) = Z_{1,n-1}) \approx 1. Thus, P(Y_{2,n}' = N_p + N_δ) \approx 1. However,
\frac{1}{N} H(N_{p,N_0}^{N} \mid Y_{2,N_0}'^{N}) \geq \Big( 1 - \frac{N_0}{N} \Big) \big( 2 h_b(p) - h_b(p * p) \big) \approx 2 h_b(p) - h_b(p * p) > 0.
As a result, it is not possible to reconstruct N_p at the decoder losslessly. More precisely,
R_2 \leq 1 + h_b(p * p) - 2 h_b(p) < 1.
This contradicts R_2 \approx 1.
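The final inequality in the proof outline can be checked numerically. The sketch below evaluates 1 + h_b(p*p) - 2h_b(p), the stated upper bound on R_2 under (nearly) uncoded forwarding, for the illustrative value p = 0.1 (this choice of p is ours); since h_b(p*p) < 2h_b(p) for p ∈ (0, 1/2) (the entropy of a modulo-2 sum of two independent Bernoulli(p) variables is less than the sum of their entropies), the bound stays strictly below 1, which is the contradiction with R_2 ≈ 1.

```python
from math import log2

def hb(x):
    """Binary entropy function h_b."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

p = 0.1                      # illustrative noise bias
conv = 2 * p * (1 - p)       # p * p, the binary convolution of p with itself
bound = 1 + hb(conv) - 2 * hb(p)
print(round(hb(p), 3), round(hb(conv), 3), round(bound, 3))   # 0.469, 0.68, ~0.742
```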
A. The Second Example

In this subsection, we provide another example to further illustrate the necessity of coding strategies with finite effective length for communication over the IC-FB.

Example 2. Consider the IC shown in Figure 3. The outputs of encoder 1 are denoted by (X_11, X_12), the output of encoder 2 is X_2, and the outputs of encoder 3 are (X_32, X_33, X_33'). In this setup, Z_1 and Z_3 represent the feedback available at encoder 1 and encoder 3, respectively. All the input alphabets of this channel are binary. All the output alphabets are binary, except Y_1, which is ternary. In this setup, N_1, N_3, N_δ, N_ε and E are mutually independent Bernoulli random variables with parameters p_1, p_3, δ, ε, and 1/2, respectively. Finally, it is assumed that p_1, p_3, δ, ε < 1/2.

[Fig. 3. The diagram of the IC-FB in Example 2. In this setup, Z_1, the feedback at Transmitter 1, is a noisy version of Y_1.]

We first study the case in which ε = 0. The following lemma provides an achievable rate-triple for this example.

Lemma 3. For ε = 0 in the setup given in Example 2, the rate-triple (\log 3 - 1, 1 - h_b(d), 1 - h_b(δ)) is achievable, where
d = h_b^{-1}\big( ( h_b(p_1 * δ) + h_b(p_3) - 1 )^+ \big).

Proof outline. The bounds on R_1 and R_3 follow from standard arguments as in point-to-point channel coding (Fano's inequality). Next, we show the bound on R_2. Upon receiving Z_1, the first encoder recovers E. The third encoder receives Z_3 and recovers (E, N_δ). Encoders 1 and 3 employ a source-channel coding scheme to encode the sources E and N_δ such that decoder 2 is able to reconstruct E + N_δ within a Hamming distortion d. This is a problem similar to the Common-Bit One-Help-One problem introduced in [13]. Using the results from [13] (Theorem 3), we can show that decoder 2 is able to reconstruct E + N_δ within a Hamming distortion d if the bounds
R_32 \geq h_b(β * δ) - h_b(d), and R_12 \geq 1 - h_b(β)
hold for some parameter 0 \leq β \leq 1/2. From standard channel coding arguments, the transmitted codewords from encoders 1 and 3 are decoded at receiver 2 with small probability of error if
R_12 \leq 1 - h_b(p_1) - ζ, R_32 \leq 1 - h_b(p_3) - ζ,
where ζ > 0 is a sufficiently small number. Finally, the proof follows by setting β = p_1 and d as in the statement of the lemma.

For the case when ε > 0, there is no common information between encoders 1 and 3. From the discontinuity argument as in [13], we can show that the minimum distortion level d is discontinuous as a function of ε. This implies that the achievable rates using single-letter coding schemes strictly decrease compared to the case when ε = 0. Hence, there exists a γ > 0 such that any rate-triple (R_1, R_2, R_3) with R_2 > 1 - h_b(d) - γ is not achievable using single-letter coding strategies, where d is as in Lemma 3. More precisely, the following lemma holds.

Lemma 4. There exist γ > 0 and ε > 0 such that for any coding strategy achieving the rate-triple (\log 3 - 1, 1 - h_b(d) - γ, 1 - h_b(δ)), the effective lengths of the encoding functions producing X_12 and X_32 are bounded from above by a constant. Furthermore, the effective length is greater than 1 (i.e., uncoded transmission is not optimal).

Proof. The proof of this lemma follows from an argument similar to that of Theorem 2.

V. CONCLUSION

We provided two examples of channel coding with feedback over interference networks where finite effective length coding is necessary to achieve optimal performance. We showed that in these examples, optimality-achieving coding strategies utilize the feedback available at different terminals to coordinate their outputs. We showed that coding strategies with asymptotically large effective lengths are inefficient in preserving the correlation among their outputs and are hence unable to coordinate their inputs to the channel effectively.

APPENDIX A
PROOF OF THEOREM 1

Proof. The bounds R_1 \leq 1 - h_b(p) and R_3 \leq 1 - h_b(p * ε) follow from standard arguments as in the point-to-point channel coding problem. Note that the feedback does not increase the rates R_1 and R_3, since these upper bounds correspond to the point-to-point capacities and feedback does not increase point-to-point capacity. To bound R_2, we use Fano's inequality. Therefore,

n R_2 \leq H(W_2)
\overset{(a)}{=} H(W_2 \mid Y_2'^n)
\overset{(b)}{\leq} I(W_2; Y_2^n \mid Y_2'^n) + n ε_n
= H(Y_2^n \mid Y_2'^n) - H(Y_2^n \mid W_2, Y_2'^n) + n ε_n
\overset{(c)}{=} H(Y_2^n \mid Y_2'^n) - H(Y_2^n \mid W_2, X_2^n, Y_2'^n) + n ε_n
\overset{(d)}{\leq} n - H(Y_2^n \mid W_2, X_2^n, Y_2'^n) + n ε_n
\overset{(e)}{=} n - H(N_p^n \mid Y_2'^n) + n ε_n,        (2)

where ε_n → 0 as the probability of error vanishes, (a), (c) and (e) follow from the fact that X_12^n, X_32^n, Y_2'^n are independent of W_2 and that X_2^n is a function of W_2 (since the second transmitter does not receive feedback), (b) follows from Fano's inequality, and (d) follows from the fact that Y_2 is binary.

Define the random variables Z_i, i ∈ [1,n], as the indicator functions of the events {X_{12,i} = X_{32,i}}. Then,

H(N_p^n \mid Y_2'^n) \geq H(N_p^n \mid Y_2'^n, Z^n) = \sum_{z ∈ \{0,1\}^n} P(Z^n = z) H(N_p^n \mid Y_2'^n, Z^n = z).

For the innermost term in the above inequality, we have:

H(N_p^n \mid Y_2'^n, Z^n = z)
\overset{(a)}{=} H(N_p^n \mid X_{12}^z + N_δ^z)
= H(N_p^n, X_{12}^z + N_δ^z) - H(X_{12}^z + N_δ^z)
\overset{(b)}{\geq} H(N_p^n, X_{12}^z + N_δ^z) - w_H(z)
\overset{(c)}{\geq} H(N_p^n, N_δ^z) - w_H(z)
= H(N_p^n) + H(N_δ^z) - w_H(z)
= n h_b(p) - w_H(z) (1 - h_b(δ)),        (3)

where X_{12}^z and N_δ^z denote the subvectors of X_{12}^n and N_δ^n at the positions i with z_i = 1, (a) follows from the definition of Y_2', (b) follows from the fact that the binary entropy is upper bounded by one, and finally, (c) follows from the fact that X_12 is independent of N_δ. Combining equations (2) and (3), we get:

H(N_p^n \mid Y_2'^n) \geq n \Big( h_b(p) - (1 - h_b(δ)) \frac{1}{n} E[w_H(Z^n)] \Big)^+,

which, together with (2) and E[w_H(Z^n)] = \sum_{i=1}^{n} P(X_{12,i} = X_{32,i}), yields the bound on R_2 in the theorem statement.

APPENDIX B
PROOF OF LEMMA 2

Proof. We can show that C_ε is the set of all rate-triples (R_1, R_2, R_3) for which there exists N ∈ N such that
R_1 \leq \frac{1}{N} I(X_1^N; Y_1^N),
R_3 \leq \frac{1}{N} I(X_{32}^N, X_{33}^N; Y_3^N),
R_2 \leq \frac{1}{N} I(X_2^N; Y_2^N \mid Y_2'^N),
where the joint distribution of the variables lies in some set P_ε. The proof of this statement follows from a converse and an achievability argument. For the converse, we use Fano's inequality as in Theorem 1. The achievability is straightforward and follows by employing a multi-letter random coding scheme.

For the case when ε = 0 and any achievable rate-triple (R_1, R_2, R_3), there exist N and γ > 0 such that
R_1 \leq 1 - h_b(p) - γ,        (4)
R_3 \leq 1 - h_b(p) - γ,        (5)
R_2 \leq \frac{1}{N} I(X_2^N; Y_2^N \mid Y_2'^N) - γ,        (6)
where the joint distribution of the random variables involved is denoted by P_0 ∈ P_0. Since the binary entropy function is continuous, there exists ζ(ε) such that 1 - h_b(p) \leq 1 - h_b(p * ε) + ζ(ε). Next, we show continuity for the right-hand side of (6). Fix N and P_ε ∈ P_ε, and consider the third inequality in the characterization of C_ε. Note that the only probability distribution depending on ε is P(y_{3,n} \mid x_{3,n}). Since this conditional probability is continuous in ε, then so is P_ε. Thus, for any fixed N, the bound on R_2 in C_ε is continuous as a function of ε. As a result, there exists a function ζ'(ε) such that the right-hand side of inequality (6) is upper bounded by \frac{1}{N} I(X_2^N; Y_2^N \mid Y_2'^N) + ζ'(ε) - γ for some joint distribution P_ε ∈ P_ε. As a result, the following bounds hold for (R_1, R_2, R_3):
R_1 \leq 1 - h_b(p) + ζ(ε) - γ,
R_3 \leq 1 - h_b(p * ε) + ζ(ε) - γ,
R_2 \leq \frac{1}{N} I(X_2^N; Y_2^N \mid Y_2'^N) + ζ(ε) - γ.
This implies that there exists ε > 0 sufficiently small such that (R_1 - ζ(ε), R_2 - ζ(ε), R_3 - ζ(ε)) ∈ C_ε. Thus, we establish the continuity of C_ε at ε = 0.

REFERENCES

[1] Te Han and K. Kobayashi. A new achievable rate region for the interference channel. IEEE Transactions on Information Theory, 27(1):49-60, Jan 1981.
[2] S. Y. Tung. Multiterminal Source Coding. PhD thesis, Cornell University, Ithaca, NY, 1978.
[3] K. Marton. A coding theorem for the discrete memoryless broadcast channel. IEEE Transactions on Information Theory, 25(3):306-311, May 1979.
[4] Zhen Zhang and T. Berger. New results in binary multiple descriptions. IEEE Transactions on Information Theory, 33(4):502-521, Jul 1987.
[5] I. Csiszár and J. Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press Inc. Ltd., 1981.
[6] H. S. Witsenhausen. On sequences of pairs of dependent random variables. SIAM Journal on Applied Mathematics, 28(1):100-113, 1975.
[7] P. Gács and J. Körner. Common information is far less than mutual information. Problems of Control and Information Theory, 2(2):119-162, 1972.
[8] Farhad Shirani and S. Sandeep Pradhan. On the sub-optimality of single-letter coding in multi-terminal communications. arXiv preprint arXiv:1702.01376, 2017.
[9] Farhad Shirani Chaharsooghi and S. Sandeep Pradhan. On the correlation between Boolean functions of sequences of random variables. In 2017 IEEE International Symposium on Information Theory (ISIT), pages 1301-1305, 2017.
[10] F. Shirani and S. S. Pradhan. Finite block-length gains in distributed source coding. In 2014 IEEE International Symposium on Information Theory, pages 1702-1706, June 2014.
[11] S. Yang and D. Tuninetti. Interference channel with generalized feedback (a.k.a. with source cooperation): Part I: Achievable region. IEEE Transactions on Information Theory, 57(5):2686-2710, May 2011.
[12] G. Kramer. Feedback strategies for white Gaussian interference networks. IEEE Transactions on Information Theory, 48(6):1423-1438, Jun. 2002.
[13] A. B. Wagner, B. G. Kelly and Y. Altug. Distributed rate-distortion with common components. IEEE Transactions on Information Theory, 57(7):4035-4057, July 2011.