Graph Coloring and Conditional Graph Entropy
Vishal Doshi, Devavrat Shah, Muriel Médard, Sidharth Jaggi
Laboratory for Information and Decision Systems
Massachusetts Institute of Technology, Cambridge, MA 02139
{vdoshi, devavrat, medard, …}@mit.edu

Abstract — We consider the remote computation of a function of two sources where one is receiver side information. Specifically, given side information Y, we wish to compute f(X, Y) based on information transmitted by X over a noiseless channel. The goal is to characterize the minimal rate at which X must transmit information to enable the computation of the function f. Recently, Orlitsky and Roche (2001) established that the conditional graph entropy of the characteristic graph of the function is the solution to this problem. Their achievability scheme does not separate functional coding from the well-understood distributed source coding problem. In this paper, we seek a separation between functional coding and coding for correlation. In other words, we want to preprocess X (with respect to f), and then transmit the preprocessed data using a standard Slepian-Wolf coding scheme at the Orlitsky-Roche rate. We establish that a minimum (conditional) entropy coloring of the product of characteristic graphs is asymptotically equal to the conditional graph entropy. This can be seen as a generalization of a result of Körner (1973), which shows that the minimum entropy coloring of product graphs is asymptotically equal to the graph entropy. Thus, graph coloring provides a modular technique to achieve coding gains when the receiver is interested in decoding some function of the source.

I. INTRODUCTION

Consider a network of sensors relaying correlated measurements to a central receiver with measurements of its own. Usually, the receiver is interested in computing a function of the received data. Hence, to the receiver, the bulk of the data received from the various sensors is irrelevant.
Contemporary data compression techniques (see [1], [2]) compress the data utilizing only the correlation of the measurements and do not take the receiver's function of interest into account. To increase the efficiency of such a system, we seek better compression rates by exploiting the end-user's intentions (the computation of the function f of the data received). Consider the following example: suppose there is a source (uniformly) producing an n-bit integer. The receiver also produces an n-bit integer and is only interested in the parity of the sum. Assuming independence of the integer data, contemporary techniques would have the source transmit all n bits, when it is clear that only the final bit is necessary. The use of such compression techniques enables possibly large gains over contemporary methods, depending upon the complexity of the function.

As a first step towards the implementation of such methods, we consider the case (as in the example) where there is a single source, denoted by the random variable X with support 𝒳. The information available at the receiver is denoted by the random variable Y with support 𝒴. Both X and Y are discrete memoryless sources. The receiver wishes to compute a function f : 𝒳 × 𝒴 → Z, where Z is some finite set. The goal here is to minimize the rate at which X transmits information to the receiver so as to enable computation of f at the receiver. Figure 1 illustrates the question of interest.

Fig. 1. X is encoded such that f(X, Y) is recovered based on the coded information and the side information Y.

A. Setup and Related Work

This question has been very well studied in different contexts in the literature [3], [4], [5]. Among them, Orlitsky and Roche [5] recently characterized the minimum rate requirement as the conditional graph entropy of the characteristic graph. We introduce the necessary definitions and notation to explain these results as well as to formally state our question of interest.
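To make the parity example concrete, here is a small illustrative sketch (ours, not from the paper): the parity of x + y depends only on the least-significant bits, so the source can transmit a single bit regardless of n.

```python
# Parity-of-sum example: the receiver holds y and wants (x + y) mod 2.
# Only the least-significant bit of x matters, so one transmitted bit
# suffices instead of all n bits of x.

def encode(x: int) -> int:
    """Source sends only the parity (last bit) of its n-bit integer."""
    return x & 1

def decode(bit: int, y: int) -> int:
    """Receiver combines the received bit with its own side information."""
    return (bit + y) % 2

# Check against the direct computation over a range of integers.
for x in range(16):
    for y in range(16):
        assert decode(encode(x), y) == (x + y) % 2
print("parity recovered from a single transmitted bit")
```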
Assume that the random variables X and Y have some joint distribution p(x, y). The n-sequences of random variables, denoted X and Y, are such that (X, Y) = {(X_i, Y_i)}, i = 1, ..., n, are i.i.d., with each (X_i, Y_i) drawn according to p(x, y). The receiver is interested in computing a function f : 𝒳 × 𝒴 → Z, or f : 𝒳^n × 𝒴^n → Z^n, its obvious extension. The primary goal is to determine the minimal rate at which X must transmit data to the receiver so that the receiver can compute the function f with vanishing probability of error.

Wyner and Ziv [6] considered the side information rate distortion problem. Yamamoto [4] extended their rate distortion function to the case of functional compression. Their functions relied on auxiliary random variables. In a sense, this paper makes those variables explicit for the zero-distortion case. Further, we describe them in such a way that our scheme builds upon a Slepian-Wolf coding scheme [7] layered with a preprocessing of the source data.

Define the characteristic graph of X with respect to Y, f, and p(x, y) as G = (V, E) with V = 𝒳 and E defined as follows: an edge (x_1, x_2) ∈ 𝒳² is in E if there exists a y ∈ 𝒴
such that p(x_1, y), p(x_2, y) > 0 and f(x_1, y) ≠ f(x_2, y). Effectively, G is the confusability graph from the perspective of the receiver with side information. This was first defined by Witsenhausen [8]. The graph entropy of a graph G with some distribution on its vertices was introduced by Körner [3] and is defined as

H_G(X) = min_{W ∈ Γ(G), X ∈ W} I(W; X),

where Γ(G) is the set of all independent sets of G. Here, X ∈ W means that the joint distribution p(w, x) on Γ(G) × 𝒳 is such that p(w, x) > 0 implies x ∈ w. To motivate the importance of graph entropy, consider the following: given a function g : 𝒳 → Z, define the deterministic random variable Y on 𝒴 = {1} and set f(x, 1) = g(x) for all x. Then, the graph entropy H_G(X) is the minimal rate required to distinguish functional values of g [8].

Orlitsky and Roche [5] established that the conditional graph entropy, an extension of graph entropy, is the minimum rate at which X must transmit for the receiver to compute the function f(X, Y) with small probability of error. The conditional graph entropy is defined as

H_G(X|Y) = min_{W ∈ Γ(G), X ∈ W, W − X − Y} I(W; X|Y).

A natural extension of this problem is the computation of the rate-distortion region. Yamamoto [4] characterized the rate-distortion function for the question posed here by extending the Wyner-Ziv side information result [6]. This was recently built upon by Feng, Effros and Savari [9].

Our work in this paper is closer (in spirit) to the result of Körner [3]. In essence, Körner showed that achieving the graph entropy is the same as coloring a large graph and sending the colors. Consider the OR-product graph of G, denoted G^n = (V_n, E_n), where V_n = V^n = 𝒳^n and two vertices (x_1, x_2) are in E_n if any component (x_{1i}, x_{2i}) ∈ E. By coloring, we mean standard graph vertex coloring, i.e., a function c : V → ℕ of a graph G = (V, E) such that (x_1, x_2) ∈ E implies c(x_1) ≠ c(x_2). The entropy of a coloring is the entropy of the induced distribution on colors, p(c(x)) = p(c⁻¹(c(x))), where c⁻¹(c(x)) = {x̃ : c(x̃) = c(x)}.
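As an illustration of the characteristic graph definition (our sketch, not part of the original paper), the following code builds G for the example from [5] that is revisited later in the paper: 𝒳 = 𝒴 = {1, 2, 3}, p(x, y) = 1/6 for x ≠ y, and f(x, y) = 1 if x > y, else 0. The resulting graph has the single edge (1, 3).

```python
from itertools import combinations

X_ALPHA = Y_ALPHA = [1, 2, 3]
p = {(x, y): (1/6 if x != y else 0.0) for x in X_ALPHA for y in Y_ALPHA}
f = lambda x, y: 1 if x > y else 0

def characteristic_graph_edges(Xs, Ys, p, f):
    """Edge (x1, x2) is in E iff some y has p(x1, y) > 0, p(x2, y) > 0
    and f(x1, y) != f(x2, y), i.e. the receiver could confuse x1 and x2."""
    edges = set()
    for x1, x2 in combinations(Xs, 2):
        if any(p[(x1, y)] > 0 and p[(x2, y)] > 0 and f(x1, y) != f(x2, y)
               for y in Ys):
            edges.add((x1, x2))
    return edges

print(characteristic_graph_edges(X_ALPHA, Y_ALPHA, p, f))  # {(1, 3)}
```

Only 1 and 3 are confusable: for the sole y with both p(1, y), p(3, y) > 0 (namely y = 2), f(1, 2) = 0 while f(3, 2) = 1.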
Next, we define ε-colorings of any graph G (for any ε > 0). Let A ⊆ 𝒳 × 𝒴 be any subset such that p(A) ≥ 1 − ε. Define p̂(x, y) = p(x, y) · 1_{(x,y) ∈ A} (where 1_Q is the indicator function for condition Q). While this is not a true probability distribution, our definition of the characteristic graph of X with respect to Y, f, and p̂ remains valid. Denote this characteristic graph by Ĝ. Note that the edge set of Ĝ is a subset of the edge set of G. Finally, we say c is an ε-coloring of G if it is a valid coloring of a graph Ĝ defined with respect to some such high-probability set A.

The chromatic entropy of a graph is

H^χ_G(X) = inf_{ε > 0} {H(c(X)) : c is an ε-coloring of G}.

(A subset of vertices of a graph is an independent set if no two nodes in the subset are adjacent to each other in G.)

Fig. 2. Coloring-based coding allows a separation between functional coding and distributed source coding, as shown above.

We define its natural extension, the conditional chromatic entropy, as follows:

H^χ_G(X|Y) = inf_{ε > 0} {H(c(X)|Y) : c is an ε-coloring of G}.

Now we state Körner's result [3]:

lim_{n → ∞} (1/n) H^χ_{G^n}(X) = H_G(X).

The result requires ε-colorings, and not simple colorings, because, as seen in [10], the ratio of the above quantities can be arbitrarily large for simple colorings. The implication of this result is that for large enough n, one can color a high-probability subgraph of the product graph G^n to achieve near-optimal functional source coding when the function depends on only one source. This effectively layers the solution into a graph coloring problem followed by standard coding techniques to transmit the color sequence.

The Orlitsky-Roche solution is an end-to-end solution; that is, it combines functional and distributed source coding. Their achievability scheme involves coding over the space of independent sets. This is a larger class than the space of colors, because each coloring gives rise to disjoint independent sets.
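Continuing the small example (our sketch, under the same assumptions: 𝒳 = 𝒴 = {1, 2, 3}, p uniform on x ≠ y, and a characteristic graph with the single edge (1, 3)), the conditional entropy H(c(X)|Y) of a valid coloring can be computed directly from its induced color distribution. The coloring c(1) = c(2) = 1, c(3) = 2 yields 2/3 bit, the n = 1 value quoted for this example later in the paper.

```python
from math import log2

X_ALPHA = Y_ALPHA = [1, 2, 3]
p = {(x, y): (1/6 if x != y else 0.0) for x in X_ALPHA for y in Y_ALPHA}
c = {1: 1, 2: 1, 3: 2}   # valid: the only edge (1, 3) gets two colors

def cond_color_entropy(c, p):
    """H(c(X)|Y) = sum_y p(y) * H(c(X) | Y = y)."""
    h = 0.0
    for y in Y_ALPHA:
        p_y = sum(p[(x, y)] for x in X_ALPHA)
        if p_y == 0:
            continue
        # distribution of the color given Y = y
        q = {}
        for x in X_ALPHA:
            if p[(x, y)] > 0:
                q[c[x]] = q.get(c[x], 0.0) + p[(x, y)] / p_y
        h += p_y * -sum(v * log2(v) for v in q.values())
    return h

print(round(cond_color_entropy(c, p), 4))  # 0.6667, i.e. 2/3 bit
```

Given Y = 1 or Y = 2 the color is equiprobable (1 bit each); given Y = 3 it is deterministic (0 bits), so the average is 2/3.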
As in Körner's result, the coloring allows a separation between functional coding and distributed source coding. Hence, we want a result similar to Körner's so that the problem can be decoupled without losing any coding gains asymptotically.

B. Our Contribution

As the main result of this paper, we show that the minimum conditional entropy coloring of the OR-product graph is asymptotically the same as the conditional graph entropy.

Theorem 1:

lim_{n → ∞} (1/n) H^χ_{G^n}(X|Y) = H_G(X|Y).

This validates the idea of using graph coloring to achieve better compression ratios. The variable W and the distribution on it that achieve H_G(X|Y) constitute the auxiliary random variable from the Yamamoto result. In the entropy sense, the induced variable c(X) converges to W. This method thus leads to a natural layering of the problem, just as in Körner's case. Figure 2 illustrates the layering. Nevertheless, we need large enough n to achieve rates close to the conditional graph entropy. Consider the following
example from [5]. We have 𝒳 = 𝒴 = {1, 2, 3}, p(x, y) = 1/6 for all x ≠ y, and f(x, y) = 1 if x > y and 0 otherwise. Then, our graph G has only one edge: (1, 3). Because of uniformity, we know H(c(X)|Y) = 2/3 using the coloring c(1) = c(2) = 1 and c(3) = 2. But H_G(X|Y) ≈ 0.54 bits, as calculated in [5].

This is worrisome because minimum entropy graph coloring has been shown to be NP-complete [11]. Nevertheless, the problem is layered into two parts. The second part is distributed source coding, which is well understood: there exist methods to achieve the corner points of the Slepian-Wolf rate region (e.g. [1]), and there exist methods to achieve all other points on the rate region (e.g. [2]). Further, the first part, graph coloring, has a very rich literature (e.g. [10], [12]), and heuristics exist (e.g. [13]) for this hard problem.

II. PROOF OF MAIN RESULT

We borrow some techniques present in Körner [3] and Orlitsky and Roche [5]. First, we recall some useful known results. Then, we establish the equality claimed in Theorem 1.

A. Preliminaries

We begin by listing the results that we will use in our proof.

Definition (Typicality): We use the notion of ε-strong typicality defined in [14, p. 358]. For any n and any sequence x of length n, define the empirical frequency of the symbol x in x as

ν_x(x) = |{i : x_i = x}| / n.

Then a sequence x is ε-strongly typical, for ε > 0, if for all x with p(x) > 0, |ν_x(x) − p(x)| ≤ ε, and for all x with p(x) = 0, ν_x(x) = 0. The set of all such ε-strongly typical sequences, called the ε-typical set, will be denoted T^n_ε(X), or T^n_ε when the variables are clear from context. A similar definition naturally extends to the case of joint variables.

Lemma 1: Suppose (X, Y) is a sequence of n random variable pairs drawn i.i.d. according to the joint distribution p(x, y), which is the marginal of p(w, x, y). Let an n-sequence W be drawn i.i.d. according to its marginal p(w). Suppose the joint distribution p(w, x, y) forms a Markov chain W − X − Y (i.e.
p(w, x, y) = p(w|x) p(x, y)). Then, for all ε > 0, there is an ε₁ = kε, where k depends only on the distribution p(w, x, y), such that for sufficiently large n:

1) P[X ∉ T^n_ε] < ε₁, P[Y ∉ T^n_ε] < ε₁, and P[(X, Y) ∉ T^n_ε] < ε₁;
2) for all x ∈ T^n_ε, P[(x, W) ∈ T^n_ε] ≥ 2^{−n(I(W;X)+ε₁)};
3) for all y ∈ T^n_ε, P[(y, W) ∈ T^n_ε] ≤ 2^{−n(I(W;Y)−ε₁)};
4) for all (w, x) ∈ T^n_ε, P[(w, Y) ∈ T^n_ε | (x, Y) ∈ T^n_ε] ≥ 1 − ε₁.

Part 1 follows from Lemma 13.6.1, parts 2 and 3 follow from Lemma 13.6.2, and part 4 follows from Lemma 14.8.1, all from [14].

Fig. 3. We use Slepian-Wolf codes, with Y sent at full rate, to achieve the corner point R_X = H(c(X)|Y).

B. Lower Bound

To prove the lower bound, consider any n and the corresponding OR-product graph G^n. Let the coloring c of G^n be such that H^χ_{G^n}(X|Y) = H(c(X)|Y). Such a coloring exists because the set of possible colorings of a graph with |𝒳|^n vertices is finite. Using this coloring, we will show that there is a scheme that requires rate (1/n) H^χ_{G^n}(X|Y) and provides a way to compute the function at the receiver with small probability of error. This establishes the achievability of rate (1/n) H^χ_{G^n}(X|Y). By Orlitsky and Roche [5], no scheme can achieve rate smaller than H_G(X|Y).

Lemma 2: inf_n (1/n) H^χ_{G^n}(X|Y) ≥ H_G(X|Y).

Proof: Given n > 0, let the coloring c, as above, denote a coloring that achieves H^χ_{G^n}(X|Y) for the OR-product graph G^n. For every color σ and every y, define g(σ, y) = f(x, y), where x is any element of c⁻¹(σ) = {x : c(x) = σ} such that p(x, y) > 0. If no such x exists, g is undefined.

Suppose x is the information at the source and y is the information at the receiver. Without loss of generality, presume p(x, y) > 0. Suppose c(x) and y are available at the decoder. Then, we have an error if g(σ, y) ≠ f(x, y), where σ = c(x). Suppose x′ is such that c(x′) = c(x) and p(x′, y) > 0. Because x and x′ have the same color, we know (x, x′) ∉ E_n, and therefore for all y where p(x, y), p(x′, y) > 0, including ours, we must have f(x, y) = f(x′, y).
Because f has the same value for all x′ with c(x′) = c(x) and p(x′, y) > 0, we have f(x, y) = g(σ, y). Thus, we have shown that the color of X and the side information Y are sufficient to determine the function f(X, Y).

The Slepian-Wolf theorem [7] states that for two sources X and Y, there exists a sequence of codes with vanishing probability of error as n grows without bound for the rates R_1 ≥ H(X|Y), R_2 ≥ H(Y|X), and R_1 + R_2 ≥ H(X, Y). We apply that theorem to the source pair (c(X), Y). In our case, Y is side information; in other words, R_2 ≥ H(Y). See Figure 3 for an illustration. Thus, the following per-symbol rate is achievable for an encoding of X, where c is any ε-coloring of G^n (with
ε > 0):

R ≥ (1/n) H(c(X)|Y).

Thus, the minimal value of R is, by definition, (1/n) H^χ_{G^n}(X|Y). Putting the above discussion together, we obtain the desired result:

inf_n (1/n) H^χ_{G^n}(X|Y) ≥ H_G(X|Y).

C. Upper Bound

Next we show that, indeed, the information needed to recover c(X) given Y is at most H_G(X|Y).

Lemma 3: limsup_n (1/n) H^χ_{G^n}(X|Y) ≤ H_G(X|Y).

Proof: Suppose ε, δ > 0. Suppose n is large enough that the following hold: (1) Lemma 1 applies with some ε₁ < 1/2; (2) 2^{−nδ} < ε₁; and (3) n > 2 + 3/(2ε).

Let p(w, x, y) be the distribution that achieves H_G(X|Y) with the Markov chain W − X − Y. Let the marginals be denoted simply as p(w), p(x), p(y). For any integer M, we define an M-system (W_1, ..., W_M), where each W_i is an n-sequence drawn independently according to p(w) = ∏_{i=1}^n p(w_i).

Declare an error if (x, y) ∉ T^n_ε. (Thus, we are in fact looking at colorings over T^n_ε, i.e., ε₁-colorings.) This, by construction, happens with probability less than ε₁. Henceforth assume that (x, y) ∈ T^n_ε.

Declare an error if there is no i such that (W_i, x) ∈ T^n_ε. The probability of this event is

P[(W_i, x) ∉ T^n_ε for all i] =(a) ∏_{i=1}^M P[(W_i, x) ∉ T^n_ε] =(b) (1 − P[(W, x) ∈ T^n_ε])^M ≤(c) (1 − 2^{−n(I(W;X)+ε₁)})^M ≤(d) 2^{−M · 2^{−n(I(W;X)+ε₁)}},

where (a) and (b) follow because the W_i are i.i.d., (c) follows from Lemma 1, part 2, and (d) follows because for α ∈ [0, 1], (1 − α)^M ≤ 2^{−Mα}. Assuming M > 2^{n(I(W;X)+ε₁+δ)},

P[(W_i, x) ∉ T^n_ε for all i] ≤ 2^{−2^{nδ}} ≤ 2^{−nδ} < ε₁,

because n is large enough. Henceforth, fix an M-system (W_1, ..., W_M) for some M > 2^{n(I(W;X)+ε₁+δ)}. Further, assume there is some i such that (W_i, x) ∈ T^n_ε. For every x, let the smallest (or any) such i be denoted ĉ(x). Note that ĉ is a valid coloring of the graph G^n.

Next, for each y, define the following:

S(y) = {ĉ(x) : (x, y) ∈ T^n_ε},  Z(y) = {W_i : (W_i, y) ∈ T^n_ε},  s(y) = |S(y)|,  z(y) = |Z(y)|.

Using these definitions, s(y) = ∑_{i=1}^M 1_{i ∈ S(y)}, where we have used the fact that our coloring scheme ĉ is simply an assignment of the indices of the M-system. Thus, we know

E[s(Y)] = ∑_{i=1}^M P[i ∈ S(Y)].

Similarly, we get z(y) = ∑_{i=1}^M 1_{W_i ∈ Z(y)}.
Thus,

E[z(Y)] = ∑_{i=1}^M P[W_i ∈ Z(Y)] ≥ ∑_{i=1}^M P[W_i ∈ Z(Y) and i ∈ S(Y)] = ∑_{i=1}^M P[i ∈ S(Y)] P[W_i ∈ Z(Y) | i ∈ S(Y)].

We know that if i ∈ S(Y), there is some x such that ĉ(x) = i and (x, Y) ∈ T^n_ε. For each such x, we must have (by the definition of our coloring) (W_i, x) ∈ T^n_ε. For each such x with (W_i, x) ∈ T^n_ε,

P[W_i ∈ Z(Y) | i ∈ S(Y)] = P[(W_i, Y) ∈ T^n_ε | (x, Y) ∈ T^n_ε] ≥ 1 − ε₁

by Lemma 1, part 4. Thus, we have E[z(Y)] ≥ (1 − ε₁) E[s(Y)]. This and Jensen's inequality imply that

E[log s(Y)] ≤ log E[s(Y)] ≤ log ( E[z(Y)] / (1 − ε₁) ).

Finally, using a Taylor series expansion, we know that

log ( 1 / (1 − ε₁) ) ≤ 2ε₁ + ε₁²

when 0 < ε₁ < 1/2. Thus,

E[log s(Y)] ≤ log E[z(Y)] + 2ε₁ + ε₁².  (1)

Next, we compute

E[z(Y)] = ∑_{i=1}^M P[(W_i, Y) ∈ T^n_ε] = M · P[(W, Y) ∈ T^n_ε],

because the W_i are i.i.d. Therefore,

E[z(Y)] ≤ M · 2^{−n(I(W;Y)−ε₁)}  (2)

by Lemma 1, part 3. By the definition of S(y), we know that determining ĉ(X) given Y = y requires at most log s(y) bits. Therefore, we have

H(ĉ(X)|Y) ≤ E[log s(Y)].  (3)
Putting it all together, we have

H^χ_{G^n}(X|Y) ≤(a) H(ĉ(X)|Y) ≤(b) E[log s(Y)] ≤(c) log E[z(Y)] + 2ε₁ + ε₁² ≤(d) log ( M · 2^{−n(I(W;Y)−ε₁)} ) + 2ε₁ + ε₁² =(e) log ( 2^{n(I(W;X)−I(W;Y)+2ε₁+δ)} + 1 ) + 2ε₁ + ε₁² ≤(f) n(I(W;X) − I(W;Y) + 2ε₁ + δ) + 1 + 2ε₁ + ε₁²,

where (a) follows by the definition of the conditional chromatic entropy, (b) follows from inequality (3), (c) follows from inequality (1), (d) follows from inequality (2), (e) follows by setting M = 2^{n(I(W;X)+ε₁+δ)} + 1, and (f) follows because log(α + 1) ≤ log(α) + 1 for α ≥ 1.

We know that for the Markov chain W − X − Y,

I(W;X) − I(W;Y) = I(W;X|Y).

Thus, for our optimal distribution p(w, x, y), we have

H^χ_{G^n}(X|Y) ≤ n(H_G(X|Y) + 2ε₁ + δ) + 1 + 2ε₁ + ε₁².

Because n > 2 + 3/(2ε) and ε₁ < 1/2, the additive term satisfies (1 + 2ε₁ + ε₁²)/n < 2ε. Thus,

(1/n) H^χ_{G^n}(X|Y) ≤ H_G(X|Y) + 2ε₁ + 2ε + δ.

Since ε and δ were arbitrary, and ε₁ = kε vanishes as ε → 0, we conclude

limsup_n (1/n) H^χ_{G^n}(X|Y) ≤ H_G(X|Y).

Combining the lower bound, Lemma 2, and the upper bound, Lemma 3, we have our result:

lim_{n → ∞} (1/n) H^χ_{G^n}(X|Y) = H_G(X|Y).  (4)

III. CONCLUSION

We have validated the idea of using graph coloring to achieve the optimal rate. As a consequence, it is possible to decouple the side-information functional source coding problem into a problem of graph coloring followed by a coding scheme that removes the redundancy between the colors (like Slepian-Wolf). While the minimum entropy graph coloring problem is NP-complete, this decoupling allows for the decomposition of a hard problem into one for which many heuristics and a copious literature exist (graph coloring) and one that is solved (distributed source coding).

We plan to show that this decoupling is possible in the more general case where Y is a separate source (and not side information). In that case, we will wish to code both X and Y. This will generalize our current results, in which we assume the rate for Y is H(Y). Further, we plan to consider a relaxation of the fidelity criterion. We wish to see whether Yamamoto's rate-distortion function can be achieved with graph coloring, as well as to consider the case where Y is not side information. Thus, we hope to arrive at a unified solution that allows for the distributed compression of functions of correlated sources.
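The Markov-chain identity I(W;X) − I(W;Y) = I(W;X|Y) used at the end of the proof of Lemma 3 can be checked numerically on any small joint distribution of the form p(w|x) p(x, y). The sketch below (ours; the specific numbers are made up for illustration) computes both sides from first principles.

```python
from math import log2
from itertools import product

# A made-up joint p(w, x, y) = p(w|x) p(x, y), so that W - X - Y is Markov.
p_xy = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}
p_w_given_x = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p = {(w, x, y): p_w_given_x[x][w] * p_xy[(x, y)]
     for w, x, y in product([0, 1], repeat=3)}

def marg(joint, keep):
    """Marginalize a joint distribution onto the coordinates in `keep`."""
    out = {}
    for k, v in joint.items():
        kk = tuple(k[i] for i in keep)
        out[kk] = out.get(kk, 0.0) + v
    return out

def mutual_info(joint):
    """I(A; B) for a joint distribution keyed by pairs (a, b)."""
    pa, pb = marg(joint, [0]), marg(joint, [1])
    return sum(v * log2(v / (pa[(a,)] * pb[(b,)]))
               for (a, b), v in joint.items() if v > 0)

p_wx, p_wy, p_y = marg(p, [0, 1]), marg(p, [0, 2]), marg(p, [2])

# I(W; X | Y) computed directly: average of I(W; X) under p(w, x | y).
i_w_x_given_y = 0.0
for (y,), py in p_y.items():
    cond = {(w, x): p[(w, x, y)] / py for w, x in product([0, 1], repeat=2)}
    i_w_x_given_y += py * mutual_info(cond)

lhs = mutual_info(p_wx) - mutual_info(p_wy)
print(abs(lhs - i_w_x_given_y) < 1e-9)  # True: the identity holds
```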
REFERENCES

[1] S. S. Pradhan and K. Ramchandran, "Distributed source coding using syndromes (DISCUS): design and construction," IEEE Trans. Inform. Theory, vol. 49, no. 3, Mar. 2003.
[2] T. P. Coleman, A. H. Lee, M. Médard, and M. Effros, "Low-complexity approaches to Slepian-Wolf near-lossless distributed data compression," IEEE Trans. Inform. Theory, vol. 52, no. 8, Aug. 2006.
[3] J. Körner, "Coding of an information source having ambiguous alphabet and the entropy of graphs," in 6th Prague Conference on Information Theory, 1973.
[4] H. Yamamoto, "Wyner-Ziv theory for a general function of the correlated sources," IEEE Trans. Inform. Theory, vol. 28, no. 5, Sept. 1982.
[5] A. Orlitsky and J. R. Roche, "Coding for computing," IEEE Trans. Inform. Theory, vol. 47, no. 3, Mar. 2001.
[6] A. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inform. Theory, vol. 22, no. 1, Jan. 1976.
[7] D. Slepian and J. K. Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inform. Theory, vol. 19, no. 4, July 1973.
[8] H. S. Witsenhausen, "The zero-error side information problem and chromatic numbers," IEEE Trans. Inform. Theory, vol. 22, no. 5, Sept. 1976.
[9] H. Feng, M. Effros, and S. Savari, "Functional source coding for networks with receiver side information," in Allerton Conference on Communication, Control, and Computing, Sept. 2004.
[10] J. Cardinal, S. Fiorini, and G. Joret, "Minimum entropy coloring," in Lecture Notes in Computer Science, ser. International Symposium on Algorithms and Computation, X. Deng and D. Du, Eds. Springer-Verlag, 2005.
[11] J. Cardinal, S. Fiorini, and G. V. Assche, "On minimum entropy graph colorings," in ISIT 2004, June-July 2004, p. 43.
[12] C. McDiarmid, "Colourings of random graphs," in Graph Colourings, ser. Pitman Research Notes in Mathematics Series, R. Nelson and R. J. Wilson, Eds. Longman Scientific & Technical, 1990.
[13] G. Campers, O. Henkes, and J. P. Leclerq, "Graph coloring heuristics: A survey, some new propositions and computational experiences on random and Leighton's graphs," in Operations Research '87, G. K. Rand, Ed., 1988.
[14] T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: Wiley.
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 6, JUNE 2009 2537 On Complementary Graph Entropy Ertem Tuncel, Member, IEEE, Jayanth Nayak, Prashant Koulgi, Student Member, IEEE, Kenneth Rose, Fellow,
More informationSource and Channel Coding for Correlated Sources Over Multiuser Channels
Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which
More informationLecture 8: Shannon s Noise Models
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have
More informationDistributed Lossless Compression. Distributed lossless compression system
Lecture #3 Distributed Lossless Compression (Reading: NIT 10.1 10.5, 4.4) Distributed lossless source coding Lossless source coding via random binning Time Sharing Achievability proof of the Slepian Wolf
More informationMultimedia Communications. Mathematical Preliminaries for Lossless Compression
Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when
More informationECE Information theory Final
ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the
More information(Classical) Information Theory III: Noisy channel coding
(Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way
More informationSlepian-Wolf Code Design via Source-Channel Correspondence
Slepian-Wolf Code Design via Source-Channel Correspondence Jun Chen University of Illinois at Urbana-Champaign Urbana, IL 61801, USA Email: junchen@ifpuiucedu Dake He IBM T J Watson Research Center Yorktown
More informationThe Gallager Converse
The Gallager Converse Abbas El Gamal Director, Information Systems Laboratory Department of Electrical Engineering Stanford University Gallager s 75th Birthday 1 Information Theoretic Limits Establishing
More informationSHARED INFORMATION. Prakash Narayan with. Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe
SHARED INFORMATION Prakash Narayan with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe 2/40 Acknowledgement Praneeth Boda Himanshu Tyagi Shun Watanabe 3/40 Outline Two-terminal model: Mutual
More information6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011
6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011 On the Structure of Real-Time Encoding and Decoding Functions in a Multiterminal Communication System Ashutosh Nayyar, Student
More informationThe Communication Complexity of Correlation. Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan
The Communication Complexity of Correlation Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan Transmitting Correlated Variables (X, Y) pair of correlated random variables Transmitting
More informationAmobile satellite communication system, like Motorola s
I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationOn Optimum Conventional Quantization for Source Coding with Side Information at the Decoder
On Optimum Conventional Quantization for Source Coding with Side Information at the Decoder by Lin Zheng A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the
More informationVariable-Rate Universal Slepian-Wolf Coding with Feedback
Variable-Rate Universal Slepian-Wolf Coding with Feedback Shriram Sarvotham, Dror Baron, and Richard G. Baraniuk Dept. of Electrical and Computer Engineering Rice University, Houston, TX 77005 Abstract
More informationDigital Communications III (ECE 154C) Introduction to Coding and Information Theory
Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I
More informationThe Capacity Region of the Gaussian Cognitive Radio Channels at High SNR
The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract
More informationApproximately achieving the feedback interference channel capacity with point-to-point codes
Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used
More informationCommunication Cost of Distributed Computing
Communication Cost of Distributed Computing Abbas El Gamal Stanford University Brice Colloquium, 2009 A. El Gamal (Stanford University) Comm. Cost of Dist. Computing Brice Colloquium, 2009 1 / 41 Motivation
More informationIN this paper, we study the problem of universal lossless compression
4008 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 9, SEPTEMBER 2006 An Algorithm for Universal Lossless Compression With Side Information Haixiao Cai, Member, IEEE, Sanjeev R. Kulkarni, Fellow,
More informationChain Independence and Common Information
1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than
More informationEquivalence for Networks with Adversarial State
Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department
More informationNotes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel
Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic
More informationA Practical and Optimal Symmetric Slepian-Wolf Compression Strategy Using Syndrome Formers and Inverse Syndrome Formers
A Practical and Optimal Symmetric Slepian-Wolf Compression Strategy Using Syndrome Formers and Inverse Syndrome Formers Peiyu Tan and Jing Li (Tiffany) Electrical and Computer Engineering Dept, Lehigh
More informationQuantization for Distributed Estimation
0 IEEE International Conference on Internet of Things ithings 0), Green Computing and Communications GreenCom 0), and Cyber-Physical-Social Computing CPSCom 0) Quantization for Distributed Estimation uan-yu
More information5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006
5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 Source Coding With Limited-Look-Ahead Side Information at the Decoder Tsachy Weissman, Member, IEEE, Abbas El Gamal, Fellow,
More informationGeneralized Writing on Dirty Paper
Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland
More information4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information
4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk
More informationCommunicating the sum of sources in a 3-sources/3-terminals network
Communicating the sum of sources in a 3-sources/3-terminals network Michael Langberg Computer Science Division Open University of Israel Raanana 43107, Israel Email: mikel@openu.ac.il Aditya Ramamoorthy
More informationShannon s noisy-channel theorem
Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for
More informationInformation Theoretic Limits of Randomness Generation
Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication
More informationInformation Theory. Coding and Information Theory. Information Theory Textbooks. Entropy
Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is
More informationOptimality of Walrand-Varaiya Type Policies and. Approximation Results for Zero-Delay Coding of. Markov Sources. Richard G. Wood
Optimality of Walrand-Varaiya Type Policies and Approximation Results for Zero-Delay Coding of Markov Sources by Richard G. Wood A thesis submitted to the Department of Mathematics & Statistics in conformity
More informationOptimal matching in wireless sensor networks
Optimal matching in wireless sensor networks A. Roumy, D. Gesbert INRIA-IRISA, Rennes, France. Institute Eurecom, Sophia Antipolis, France. Abstract We investigate the design of a wireless sensor network
More informationAN INTRODUCTION TO SECRECY CAPACITY. 1. Overview
AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits
More information18.2 Continuous Alphabet (discrete-time, memoryless) Channel
0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not
More informationCapacity Region of the Permutation Channel
Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes
More informationIndex coding with side information
Index coding with side information Ehsan Ebrahimi Targhi University of Tartu Abstract. The Index Coding problem has attracted a considerable amount of attention in the recent years. The problem is motivated
More informationEfficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel
Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Ahmad Abu Al Haija ECE Department, McGill University, Montreal, QC, Canada Email: ahmad.abualhaija@mail.mcgill.ca
More informationarxiv: v1 [cs.it] 5 Sep 2008
1 arxiv:0809.1043v1 [cs.it] 5 Sep 2008 On Unique Decodability Marco Dalai, Riccardo Leonardi Abstract In this paper we propose a revisitation of the topic of unique decodability and of some fundamental
More informationSecond-Order Asymptotics in Information Theory
Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015
More information10-704: Information Processing and Learning Fall Lecture 9: Sept 28
10-704: Information Processing and Learning Fall 2016 Lecturer: Siheng Chen Lecture 9: Sept 28 Note: These notes are based on scribed notes from Spring15 offering of this course. LaTeX template courtesy
More informationUniversal Incremental Slepian-Wolf Coding
Proceedings of the 43rd annual Allerton Conference, Monticello, IL, September 2004 Universal Incremental Slepian-Wolf Coding Stark C. Draper University of California, Berkeley Berkeley, CA, 94720 USA sdraper@eecs.berkeley.edu
More informationIntroduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information.
L65 Dept. of Linguistics, Indiana University Fall 205 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission rate
More informationBounded Expected Delay in Arithmetic Coding
Bounded Expected Delay in Arithmetic Coding Ofer Shayevitz, Ram Zamir, and Meir Feder Tel Aviv University, Dept. of EE-Systems Tel Aviv 69978, Israel Email: {ofersha, zamir, meir }@eng.tau.ac.il arxiv:cs/0604106v1
More informationExercise 1. = P(y a 1)P(a 1 )
Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a
More informationDept. of Linguistics, Indiana University Fall 2015
L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 28 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission
More informationAn Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel
Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and
More informationOn bounded redundancy of universal codes
On bounded redundancy of universal codes Łukasz Dębowski Institute of omputer Science, Polish Academy of Sciences ul. Jana Kazimierza 5, 01-248 Warszawa, Poland Abstract onsider stationary ergodic measures
More informationPerformance of Polar Codes for Channel and Source Coding
Performance of Polar Codes for Channel and Source Coding Nadine Hussami AUB, Lebanon, Email: njh03@aub.edu.lb Satish Babu Korada and üdiger Urbanke EPFL, Switzerland, Email: {satish.korada,ruediger.urbanke}@epfl.ch
More informationLower Bounds on the Graphical Complexity of Finite-Length LDPC Codes
Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International
More information1 Introduction to information theory
1 Introduction to information theory 1.1 Introduction In this chapter we present some of the basic concepts of information theory. The situations we have in mind involve the exchange of information through
More informationOn Zero-Error Source Coding With Decoder Side Information
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 1, JANUARY 2003 99 On Zero-Error Source Coding With Decoder Side Information Prashant Koulgi, Student Member, IEEE, Ertem Tuncel, Student Member, IEEE,
More informationSubset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding
Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California - Santa Barbara {kumar,eakyol,rose}@ece.ucsb.edu
More informationLecture 15: Conditional and Joint Typicaility
EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of
More informationLecture 22: Final Review
Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information
More information