Rate region for a class of delay mitigating codes and P2P networks

Size: px
Start display at page:

Download "Rate region for a class of delay mitigating codes and P2P networks"

Transcription

1 Rate region for a class of delay mitigating codes and P2P networks Steven Weber, Congduan Li, John MacLaren Walsh Drexel University, Dept of ECE, Philadelphia, PA Abstract This paper identifies the relevance of a distributed source coding problem first formulated by Yeung and Zhang in 1999 to two applications in network design: i) the design of delay mitigating codes, and ii) the design of network coded P2P networks When transmitting time-sensitive frames from a source to a destination over a multipath network using a collection of coded packets, the decoding requirements determine which subsets of packets will be sufficient for decoding which frames The rate region of packet sizes consistent with these requirements is shown to be an instance of the aforementioned distributed source coding problem When encoding file chunks into packets in a peer to peer system, the peers wish to receive their chunks as soon as possible while uploading data at as low a rate as possible It is shown that the region of encoded packet sizes consistent with the decoding constraints is another instance of the aforementioned distributed source coding problem These rate regions are placed in the larger context of rate-delay tradeoffs in designing delay mitigating codes and efficient P2P systems I INTRODUCTION This paper identifies the relevance of a distributed source coding problem first formulated by Yeung and Zhang in 1999 [1] to two applications in network design: i) the design of delay mitigating codes, and ii) the design of network coded P2P networks The paper is organized as follows II discusses a broad array of related work in source and network coding (comprising much of the paper), focusing in particular on i) Yeung and Zhang s distributed source coding problem [1] in II-A, ii) multilevel diversity coding [2] in II-B, iii) the multi-source network coding problem (with independent sources) as formulated by Yeung in his book [3] (Chapter 21) in II-C, iv) the Han and Kobayashi multiterminal source coding problem [4] in II-D, v) multi-source network coding with correlated sources (eg, [5]) in II-E, and vi) categorizing these problems in II-F III discusses the design of delay mitigating codes we highlight the inherent rate delay tradeoff, and identify the rate region as an instance of [1] IV discusses the design of network coded P2P networks with decoding constraints we identify key design issues and tradeoffs, and identify the rate region as an instance of [1] There are no new theorems in this paper the contribution is the identification of the relevance of these various source and network coding problems for a structured class of rate delay codes and P2P network coded networks II RELATED WORK A Yeung and Zhang s distributed source coding problem Yeung and Zhang (1999) [1] consider the distributed source coding (YZDSC) problem in Fig 1 Their motivation is satellite communications where encoders E at satellites connect sources S with destinations D The problem consists of: 1) A set of independent sources (with index set S) with each source S j holding an iid sequence of discrete random variables (RVs) {X ji } with specified entropy H(X j ) The RV Y j is associated with source S j in the analysis 2) A set of encoders (with index set E) where encoder E l has access to a subset U l Sof the sources The source to encoder mapping is specified by the relation A S E The RV Z l is associated with encoder E l in the analysis Each encoder operates at rate R l in that H(Z l ) R l, and the rate vector is R =(R l,l E) 3) A set of decoders (with index set D) where decoder D m has access to a subset V m E of the encoded messages The encoder to decoder mapping is specified by the relation B E D Decoder D m is to decode a subset F m Sof the sources A rate vector R is admissible if for arbitrary error probability bound >0there exists a sufficiently large blocklength n such that a code exists with encoder rates n 1 log η l R l + for l S, and decoder Hamming error probabilities m, for m D Here η l is the number of codewords available to encoder l and m is the Hamming error probability for decoder m, cf [1] The rate region is the set of admissibile rate vectors: R = {R : R is admissible} Y j H(X j ) Z l S j S U l A S E R l E l E V m B E D D m D F m S Fig 1 Yeung and Zhang s distributed source coding problem [1] Yeung and Zhang express the rate region R in terms of the region of entropic vectors, Γ N, cf [3] (Chapters 13 16) The entropy vector h R 2N 1 for a finite set of N discrete RVs (W 1,,W N ) (with arbitrary joint distribution) has a component h G = H(W i,i G) for each nonempty G {1,,N} (note there are 2 N 1 such Gs) In turn, an arbitrary vector h R 2N 1 is said to be entropic if there exist a joint distribution on N discrete RVs with h as the corresponding entropy vector

2 Collect the RVs for sources and encoders as N = {(Y j,j S) (Z l,l E)}, N = N (1) Define the region R in as the set of rate vectors R such that there exists h Γ N satisfying h Yj,j S = j S h Yj (2) h Zl (Y j,j U l ) = 0, l E (3) h (Yj,j F m) (Z l,l V m) = 0, m D (4) h Yj > H(X j ), j S (5) h Zl R l, l E (6) where expressions involving conditional entropies are understood as the corresponding unconditioned entropy vector inequalities ([3] Chapter 13) These constraints admit natural interpretations: (2) requires the sources be independent; (3) requires each codeword is a function of the available inputs; (4) requires each decoder is able to recover its intended subset of sources; (5) requires each source exceed its rate constraint; and (6) requires each encoder to obey its rate constraint Yeung and Zhang prove R in is an inner bound They also prove an outer bound R out obtained from R in by i) relaxing the strict inequality in (5) to allow for equality (h Yj H(X j ) for j S), and ii) expanding Γ N to include its closure, Γ N Theorem 1 ([1]) R in R R out The outer bound was later shown to be tight for the more general problem discussed in II-C B Multilevel diversity coding (MDC) Multilevel diversity coding (MDC) is the problem of taking several encodings of a set of sources (each encoder with its own rate), and asking each of several decoders (each with access to a specific subset of the various encodings) to decode a specified subset of the sources Applications have been found in distributed storage, network reliability, secret sharing, and satellite communications MDC was first introduced in 1992 in the thesis of Roche [6] Early seminal papers include Yeung (1995) [7], Roche, Yeung, and Hau (1995) [8], [9], and Yeung and Zhang (1999) [2] The connection between MDC and the multiple descriptions problem is explored in [7], Fu and Yeung (2002) [10], as well as in the more recent work of Mohajer, Tian and Diggavi (2008) [11], [12] Fig 2 shows the symmetric MDC (SMDC) problem ([9], [2]) with K co-located iid sources Y 1,,Y K (each Y k representing an iid sequence {X ki }) that are prioritized in that Y k is deemed more important than Y k+1 for each k [K 1] These sources are made available to K encoders E 1,,E K, with rates R 1,,R K There are 2 K 1 decoders one for each nonempty subset of [K] The K k decoders corresponding to subsets of [K] of size k each seek to decode sources Y 1,,Y k, for each k [K] The source prioritization is reflected in the fact that source Y k is recoverable from any decoder of level k or higher Y 1 Y K iid S R 1 E 1 R k E l E K { K decoders at level 1 K k { R K { Y 1 Y 1 decoders at level k Y 1 Y k Y 1 Y k 1 decoder at level K Y 1 Y K Fig 2 Yeung & Zhang s symmetric multilevel diversity coding problem [2] It is shown in [2] that superposition coding suffices to achieve any point in the SMDC rate region Superposition coding in this context means each encoder E k with rate R k separately encodes each of the K sources with rates (rk 1,,rK k ) such that l K rl k = R k [7] ( IIB) The symmetry in SMDC refers to the fact that all decoders at level k recover the same subset of the sources, ie, Y 1,,Y k The optimality of superposition coding is dependent upon the symmetry of the problem A simple counter-example showing the inadequacy of superposition coding for general MDC is shown in [7] ( IIB), while [13], [14] identifies that superposition coding is optimal for 86 of the 100 MDC configurations with three encoders The asymmetric (AMDC) problem of encoding K sources with three encoders and 2 K 1 decoders, where the decoding capability depends upon the particular subset of encodings (ie, not just the cardinality of that set), is studied in [11], [12] They also find superposition coding is in general suboptimal (although linear network coding suffices) 1 The SMDC rate region is given below Theorem 2 ([2]) The rate region R for SMDC with K encoders is the set of R =(R 1,,R K ) such that K R k = rk, l k [K], (7) l=1 for some (rk l 0,l [K],k [K]) satisfying r D k H(X D ), D [K], D = (8) k D The (rk l ) are the superposition coding vectors (7) states the superposition coding vector must cover the rate constraint at each encoder, and (8) states the decoder associated with subset D should receive sufficient rate to decode X D Decoder D is able to decode X k for k< D due to the corresponding constraints for decoders D with D = k This result generalizes the explicit rate region for SMDC with 3 encoders given in [9] An explicit rate region for AMDC with 3 encoders is given in [12] The AMDC problem is an instance of the YZDSC problem in II-A; of course the same is also true of the SMDC problem 1 The problem definitions of AMDC appear to be slightly different in [10] (Fig 8) and [12] (Fig 1)

3 C Multi-source network coding A key structural property of the YZDSC problem in II-A is the absence of intermediate relay nodes between the encoders and decoders The multi-source network coding (MSNC) problem generalizes the YZDSC problem by allowing for an arbitrary topology of intermediate encoding nodes between the source encoders and decoders Notably, the MSNC problem retains the assumption of independent sources, at least as formulated by Yeung [3] (Chapter 21), whose presentation and notation we follow below An acyclic graph G =(V, E) has two disjoint subsets S, T representing sources and destinations The source RVs (Y s,s S) are independent with entropies H(Y s ) ω s for each s S Each destination t T requests a subset β(t) Sof the sources Each edge e E has an associated RV U e with an entropy H(U e ) R e The following constraints hold: H(Y s ) ω s, s S (9) H(Y S ) = s S H(Y s ), H(U Out(s) Y s ) = 0, s S H(U Out(i) U In(i) ) = 0, i V\(S T) H(U e ) R e, e E H(Y β(t) U In(t) ) = 0, t T These constraints model i) the source rates obey the constraints ω, ii) the sources are independent, iii) the messages sent on edges Out(s) (outgoing from source s) are functions of Y s, iv) the outgoing messages on edges Out(i) are functions of the incoming messages on edges In(i) at each intermediate node i, v) the edge message variables obey the edge rate constraints R, and vi) each destination t is able to recover its subset of source messages β(t) from the messages incoming to t Collect the RVs for the problem as the set N = {(Y s,s S) (U e,e E)} (10) with N = N, and let Γ N be the region of entropic vectors for these variables Yeung [3] (Theorem 215) gives the rate region R for the sources ω =(ω s,s S) as the intersection of Γ N with the constraints in (9) Theorem 3 ([3]) The rate region R for the MSNC problem is the set of ω =(ω s,s S) such that there exist RVs N in (10) in Γ N satisfying (9) Several technical caveats hold, see [3] for a more precise statement of the theorem The rate regions for both the YZDSC and MSNC problems are in terms of the region of entropic vectors Γ N (more precisely, its closure, Γ N ) Although Γ N is known to be a convex cone ([3] Theorem 155), it has not been characterized for N 4 RVs In fact for N 4 it is known that Γ N is curved and hence cannot be described by any finite set of linear inequalities [15] Consequently we must employ computable inner and outer bounds on Γ N The Shannon linear programming outer bound Γ N is the intersection of Shannon s classical information inequalities, which is equivalent to the nonnegativity of conditional mutual information I(X A ; X B X C ) 0, A, B, C N (11) for all (possibly empty and possibly overlapping) subsets A, B, C of N The important fact that the Shannon LP outer bound is loose is due to Zhang and Yeung [16] Ingleton s inequality [17], in its information theoretic form [16] (p 1444), states that for N =4RVs I(X 1 ; X 2 )+I(X 3 ; X 4 X 1 )+I(X 3 ; X 4 X 2 ) I(X 3 ; X 4 ) 0 (12) There are six nonredundant permutations of this inequality Zhang and Yeung prove that the intersection of the Shannon LP outer bound with Ingleton s inequality gives an inner bound on Γ N for N =4variables; see [18] for a discussion Finally, linear network coding is in general suboptimal in that there exist MSNC problems for which the rate region achievable by linear network coding is a loose inner bound [19] Y s s ω s S In(i) i e U e R e Out(i) t β(t) Fig 3 Yeung s formulation of the multi-source network coding problem [3] D Han and Kobayashi multiterminal source coding The YZDSC problem in II-A is similar to the Han and Kobayashi (1980) multiterminal source coding (HKMSC) problem [4], shown in Fig 4 A collection of correlated sources (Y e,e E) is made available to a collection of encoders E each source to exactly one encoder The encoders have rates R = (R e,e E), and each encoder sends its message to a subset of decoders D Each decoder D m receives a subset of encoded messages, and must decode a particular source Y k(m), where k(m) is the index of the encoder of one of the received messages This problem generalizes the YZDSC problem in that it relaxes the assumption of independent sources One may observe two structural differences between YZDSC and HKMSC: i) in YZDSC each encoder receives a subset of the source messages whereas in HKMSC each encoder receives a single source message, and ii) in YZDSC each decoder is responsible for a subset of source messages whereas in HKMSC each decoder must decode a single source message In both cases we can always construct an instance of HKMSC for any instance of YZDSC Namely, i) for every encoder E l in YZDSC, create an encoder E e in HKMSC with the source Y e =(Y j,j U l ) (recall Fig 1), and ii) for every source s F m to be recovered by decoder D m in YZDSC create a decoder (d, s) for Y s in HKMSC with the same set T

4 of incoming edges V m as D m 2 Thus we see YZDSC is in fact a generalization of HKMSC on account of allowing for correlation among the sources The main result in [4] is an achievable region (the Berger- Tung inner bound [20], [21]) that is tight for all known sub-cases for which the rate region is known (Theorem 2) Further, it is worth emphasizing that the achievable region is defined as the union of a collection of half-spaces, where each half-space is defined over all collections of message variables (the (U e,e E) in [4]) obeying certain distribution constraints Taking a union over all possible collections of random variables is equivalent to requiring membership in Γ N, the region of entropic vectors Y e R e E e E Y D k(m) m Fig 4 The Han and Kobayashi multiterminal source coding problem [4] E Multi-source network coding with correlated sources Finally, we briefly mention more recent work on multisource network coding with correlated sources (MSNCCS) Key references include Ho, Médard, Effros, Koetter (2004) [22], Ho et al (2006) [5], Barros and Servetto (2006) [23], Ramamoorthy, Jain, Chou and Effros (2006) [24], Han (2010) [25], and Ramamoorthy (2011) [26] As a representative significant result we mention [5] (cf [26]) Consider a capacitated directed acyclic graph G with a collection of correlated sources (Y s,s S) (with rates R =(R s,s S)) and assume a set of terminals T such that each terminal wishes to receive all sources This last assumption is critical for what follows Ignoring the network for a minute, the rate region R is the contra-polymatroid Slepian Wolf region [27], ie, those R obeying the following 2 S 1 linear inequalities: s S R s H(Y S Y c S ), S S (13) The capacity region is determined by the cutsets in G between each subset of sources and each terminal Namely, it is the set of R obeying the following 2 S 1 linear inequalities: R s min c G(S,T t ), S S, (14) t T s S where c G (S,T t ) is the min cut in G between sources S and terminal T t Theorem 4 ([5]) If there exists a rate vector R satisfying both the Slepian-Wolf constraints (13) and the cutset constraints 2 Some care must be taken in this construction, the details are omitted D (14) then random linear network coding suffices for the terminals to recover the sources F Classification of related work Here is a coarse classification of these problems Let intermediate denote that there are intermediate nodes that re-encode encoded messages, while no intermediate means encoded source messages are sent directly to decoders Let prioritized ind mean the sources are independent and prioritized into levels; independent refers to independent sources, and correlated refers to general correlated sources no intermediate intermediate prioritized ind SMDC ( II-B) independent YZDSC ( II-A) MSNC ( II-C) correlated HKMSC ( II-D) MSNCCS ( II-E) Note the rows are increasingly general from top to bottom and the columns are increasingly general from left to right Thus SMDC is an instance of YZDSC; YZDSC is an instance of both MSNC and HKMSC, and both MSNC and HKMSC are instances of MSNCCS Our focus in the remainder of this paper is on the YZDSC problem, highlighting structural reasons when restrictions to SMDC instances are of interest III DESIGN OF DELAY MITIGATING CODES AND RATE DELAY TRADEOFFS Consider the problem faced by a source wishing to transmit time-sensitive data to a destination over a multipath network The source consists of K temporally ordered independent B- bit frames (F k,k [K]), which are to be encoded into N packets (C n,n [N]) Each packet is to be sent over one of P paths between the source and destination peer At the sink, the desired frame playback schedule and the time at which frames may actually be decoded is quantified with a delay measure The key decision variables under control of the source are: the time t n R + at which the nth encoded packet C n is sent Collect these into a vector t the path p n P along which the nth encoded packet is sent Collect these into a vector p the subsets of received packets from which each frame may be recovered: i k 1, A = Fk can be determined from packets A 0, else (15) Collect these into a vector i =(i k A,k [K], A [N]) the sizes s =(s n,n [N]) of the encoded packets The sizes must lie in S(i), the region of packet sizes compatible with the decoding requirements i Given t, p, s, the network determines the series of random packet arrival times a = (a n,n [N]) according to the probability distribution p(a t, p, s) From a and i one can determine the first times d =(d k,k [K]) at which each source frame k and all frames prior to it are decodable at the destination We can compare these times with a desired playback schedule b =(b k,k [K]) of times the destination would like to decode these ordered frames While alternative

5 schedules may be interesting, in the framework of rate delay tradeoffs we will select b k = k R where R is the rate of playback in source frames per second One then selects a delay metric based on the difference between these desired playback times and when the associated frames are first available (d k b k ) + Viable options include the i) mean sum delay and ii) outage sum delay: D EΣ = E (d k b k ) + (16) k [K] D Σ = min D : P (d k b k ) + >D (17) k [K] and the iii) mean worst delay (DÊ), and iv) outage worst delay (Dˆ ), defined by replacing the sum with the max in (16) and (17), respectively These depend on decision variables t, p, i, s and frame rate R The rate delay tradeoff 3 in this context is the tradeoff between the frame rate R in frames per second (equivalently the data rate RB in bits per second) and the minimum delay: D o (R) = min D o(r, t, p, i, s), o {EΣ, Ê,Σ, ˆ} (18) t,p,i,s Here the minimum is take over all viable values of the control variables: t R N +, p [P ] N, i {0, 1} K2N, s S(i) When set up this way, this problem breaks up naturally into the following sub-problems Information theory part: find region S(i) for each i Coding part: given an i and s S(i), build a code with a low complexity encoder and decoder achieving the decoding requirements in i with packet lengths s Modeling and estimation part: determine a family of statistical models p(a t, p, s) appropriate for the networks of interest and provide estimators of their parameters Optimization part: given region S(i) from the information theory part, and model p(a t, p, s) from the modeling and estimation part, perform the optimization in (18) We now make a useful observation about S(i) Observation 1 The region S(i) is the rate region of a particular instance of the YZDSC problem in II-A In the specific case where K = N and i is such that frames 1,,k are recoverable from any subset of packets of size k, then the region S(i) is that of the SMDC problem in II-B Given i, we construct the corresponding instance of the YZDSC problem as follows There are K sources S 1,,S K and N encoders E 1,,E N All source variables Y 1,,Y K are available to all encoders, ie, A in Fig 1 is the complete bipartite graph connecting S to E Equivalently, there is a single source S with variables Y 1,,Y K connected to N encoders, as in Fig 2 There are 2 N 1 decoders, one for each nonempty subset of [N], and B in Fig 1 is configured accordingly Each decoder D m (with associated subset D m [N]) 3 A special issue on rate delay tradeoffs has recently appeared [28] seeks to recover the source variables (frames): {Y k : i k A =1, A D m } (19) The particular case where the rate region S(i) is that of the SMDC problem was addressed in our earlier work [29] We observe each i is an instance of YZDSC with a specific structure that conceivably may yield a more explicit representation of S(i) than that of Theorem 1 This is the subject of our current work It does not seem likely that superposition coding will achieve the entire rate region S(i) based on the examples demonstrating the suboptimality of superposition coding for non-symmetric MDC IV DESIGN OF NETWORK CODED P2P NETWORKS WITH DECODING CONSTRAINTS A peer to peer (P2P) file sharing system must be designed with several important metrics and constraints in mind, including: i) peers wish to obtain their desired chunks/files in a minimal amount of time, and ii) peers wish to minimize their required upload rate These two desires are in tension with one another Hence it is of interest to design P2P file sharing systems to optimally tradeoff between these two metrics [30] We consider the problem of optimally trading rate for delay in a class of P2P file sharing systems utilizing a restricted form of network coding The problem is formulated from a centralized omniscient standpoint, and it is shown that the overall problem of trading rate for delay decomposes naturally into several decoupled parts The solution of one of these parts is linked to an important result in distributed source coding As stated, the form of network coding we exploit for the P2P system is restricted in that we assume an overlay network directly connecting peers, ie, the encoded packets traversing the multi-hop paths underyling the overlay network are not encoded by intermediate routers, as in network coding Rather than being done for theoretical convenience, this model restriction is done with practical several system level issues in mind, in particular: Security: allowing for coded files/chunks to be combined/reencoded at intermediate network nodes, as in network coding, increases the risk of malicious or accidental file corruption Incentives: peers have a stronger incentive to obtain and then share chunks/files they can decode than to obtain and share chunks/files they cannot decode In particular, let us consider a model for a P2P file sharing system utilized for the dissemination of a certain collection of source file chunks F 1,,F K (these chunks may be chunks from several files) Peer p seeks file chunks with indices D(p) [K] by communicating with other peers over time Time is slotted, and at the beginning of any given time slot t, peer p has a subset of chunks H(p, t) D(p) it has decoded thus far During the time slot, it can encode the chunks it has into messages sent to other peers In particular, let the set of peers that peer p uploads messages to during time slot t be P(p) and the associated size of the message that p sends to p P(p, t) be R p,p (t) Naturally, the

6 uploaded amount of information between peer p and p can not exceed the amount of free capacity between these peers C p,p (t) during this time slot, ie R p,p (t) C p,p (t) The messages received during time slot t are then decoded, and peer p then can start the next time slot with a new, possible larger, set of decoded chunks H(p, t + 1) The delay p that a peer experiences is naturally the minimum t such that H(p, t) =D(p) The aggregate upload rate from peer p in time slot t is naturally U p (t) = p P(p,t) R p,p(t) The simplest model reflecting the desire to share the load of uploading files and minimizing the upload rate would choose the maximum (R = max p,t U p (t)) or total (R = p,t U p(t)) peer upload rate as the metric The delay experienced by different peers can similarly be aggregated into either a maximum or average delay One view of designing efficient P2P systems is then optimally trading these two aggregrate metrics R and against one another The centralized problem of trading rate for delay in peer to peer networks utilizing this class of coding schemes can be broken up into several parts: System monitoring part: monitor the network to determine D(p) the desired chunks for peer p and conservative estimates of C p,p (t) the free capacities between peers p, p in time t Information theory part: find the region R(t, {P(p, t), H(p, t), H(p, t + 1)}) of rates {R p,p (t)} such that peers have chunks {H(p, t)} and {H(p, t + 1)} at the beginning and end of the t time slot, respectively Coding part: given rates [R p,p (t)] design low complexity codes with these rates capable of satisfying the requirements reflected by {H(p, t)} and {H(p, t + 1)} and P Optimization part: given R(t, {P(p, t), H(p, t), H(p, t + 1)}) from the information theory part, and D(p) and C p,p (t) from the network monitoring part, select {P(p, t), H(p, t), H(p, t + 1)} and R p,p (t) Rto minimize the rate metric R subject to a bound on the delay metric We make an observation about the information theory part Observation 2 The region R(t, {P(p, t), H(p, t), H(p, t + 1)}) is an instance of the YZDSC problem ( II-A) In particular, i) add a source variable Y k for each chunk F k, ii) add an encoder for each peer p transmitting in t, iii) add an edge from each chunk to each peer consistent with H(p, t), iv) add a decoder for each peer p receiving in t, v) connect encoders with decoders specified by P(p, t), vi) require each decoder to recover source variables consistent with H(p, t+1) REFERENCES [1] R Yeung and Z Zhang, Distributed source coding for satellite communications, IEEE Transactions on Information Theory, vol 45, no 4, pp , May 1999 [2], On symmetrical multilevel diversity coding, IEEE Transactions on Information Theory, vol 45, no 2, pp , March 1999 [3] R W Yeung, Information Theory and Network Coding Springer, 2008 [4] T S Han and K Kobayashi, A unified achievable rate region for a general class of multiterminal source coding systems, IEEE Transactions on Information Theory, vol IT-26, no 3, pp , May 1980 [5] T Ho, M Médard, R Koetter, D R Karger, M Effros, J Shi, and B Leong, A random linear network coding approach to multicast, IEEE Transactions on Information Theory, vol 52, no 10, pp , October 2006 [6] J Roche, Distributed information storage, PhD dissertation, Stanford University, March 1992 [7] R Yeung, Multilevel diversity coding with distortion, IEEE Transactions on Information Theory, vol 41, no 2, pp , March 1995 [8] J Roche, R Yeung, and K P Hau, Multilevel diversity coding with symmetrical connectivity, in IEEE International Symposium on Information Theory (ISIT), September 1995, p 262 [9], Symmetrical multilevel diversity coding, IEEE Transactions on Information Theory, vol 43, no 3, pp , May 1997 [10] F-W Fu and R W Yeung, On the rate distortion region for multiple descriptions, IEEE Transactions on Information Theory, vol 48, no 7, pp , July 2002 [11] S Mohajer, C Tian, and S Diggavi, Asymmetric gaussian multiple descriptions and asymmetric multilevel diversity coding, in IEEE International Symposium on Information Theory (ISIT), July 2008, pp [12], Asymmetric multilevel diversity coding and asymmetric gaussian multiple descriptions, IEEE Transactions on Information Theory, vol 56, no 9, pp , September 2010 [13] K P Hau and R W Yeung, Multilevel diversity coding with three encoders, in IEEE International Symposium on Information Theory (ISIT), Ulm, Germany, June 1997, p 440 [14] K P Hau, Multilevel diversity coding with independent data streams, Master s thesis, The Chinese University of Hong Kong, 1995 [15] F Matúš, Infinitely many information inequalities, in IEEE International Symposium on Information Theory (ISIT), June 2007, pp [16] Z Zhang and R W Yeung, On characterization of entropy function via information inequalities, IEEE Transactions on Information Theory, vol 44, no 4, July 1998 [17] A W Ingleton, Representation of matroids, in Combinatorial mathematics and its applications, D Welsh, Ed London: Academic Press, 1971, pp [18] J M Walsh and S Weber, A recursive construction of the set of binary entropy vectors and related algorithmic inner bounds for the entropy region, IEEE Transactions on Information Theory, vol 57, no 10, pp , October 2011 [19] R Dougherty, C Freiling, and K Zeger, Insufficiency of linear coding in network information flow, IEEE Transactions on Information Theory, vol 51, no 8, pp , August 2005 [20] T Berger, Multiterminal source coding, Udine, Italy, 1977, lecture note presented at CISM [21] S Tung, Multiterminal rate-distortion theory, PhD dissertation, Cornell University, Ithaca, NY, 1977 [22] T Ho, M Médard, M Effros, and R Koetter, Network coding for correlated sources, in Conference on Information Science and Systems (CISS), 2004 [23] J Barros and S Servetto, Network information flow with correlated sources, IEEE Transactions on Information Theory, vol 52, no 1, pp , January 2006 [24] A Ramamoorthy, K Jain, A Chou, and M Effros, Separating distributed source coding from network coding, IEEE Transactions on Information Theory, vol 52, no 6, pp , June 2006 [25] T S Han, Polymatroids with network coding, in Information Theory and its Applications Workshop (ITA), San Diego, CA, 2010 [26] A Ramamoorthy, Minimum cost distributed source coding over a network, IEEE Transactions on Information Theory, vol 57, no 1, pp , January 2011 [27] D Slepian and J Wolf, Noiseless coding of correlated information sources, IEEE Transactions on Information Theory, vol IT-19, no 4, pp , July 1973 [28] J M Walsh, S Weber, J C de Oliveira, A Eryilmaz, and M Medard, Trading rate for delay at the application and transport layers (guest editorial), IEEE Journal on Selected Areas in Communications (JSAC), vol 29, no 5, pp , May 2011 [29] J M Walsh, S Weber, and C wa Maina, Optimal rate delay tradeoffs and delay mitigating codes for multipath routed and network coded networks, IEEE Transactions on Information Theory, vol 55, no 12, pp , December 2009 [30] B Li and D Niu, Random network coding in peer-to-peer networks: From theory to practice, Proceedings of the IEEE, vol 99, no 3, pp , March 2011

A computational approach for determining rate regions and codes using entropic vector bounds

A computational approach for determining rate regions and codes using entropic vector bounds 1 A computational approach for determining rate regions and codes using entropic vector bounds Congduan Li, John MacLaren Walsh, Steven Weber Drexel University, Dept. of ECE, Philadelphia, PA 19104, USA

More information

Network Coding on Directed Acyclic Graphs

Network Coding on Directed Acyclic Graphs Network Coding on Directed Acyclic Graphs John MacLaren Walsh, Ph.D. Multiterminal Information Theory, Spring Quarter, 0 Reference These notes are directly derived from Chapter of R. W. Yeung s Information

More information

Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions

Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions 2009 IEEE Information Theory Workshop Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions Asaf Cohen, Michelle Effros, Salman Avestimehr and Ralf Koetter Abstract In this

More information

Multiterminal Networks: Rate Regions, Codes, Computations, & Forbidden Minors

Multiterminal Networks: Rate Regions, Codes, Computations, & Forbidden Minors Multiterminal etworks: Rate Regions, Codes, Computations, & Forbidden Minors Ph D Thesis Proposal Congduan Li ASPITRG & MAL Drexel University congduanli@gmailcom October 5, 204 C Li (ASPITRG & MAL) Thesis

More information

Capacity Region of the Permutation Channel

Capacity Region of the Permutation Channel Capacity Region of the Permutation Channel John MacLaren Walsh and Steven Weber Abstract We discuss the capacity region of a degraded broadcast channel (DBC) formed from a channel that randomly permutes

More information

Amobile satellite communication system, like Motorola s

Amobile satellite communication system, like Motorola s I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired

More information

. Relationships Among Bounds for the Region of Entropic Vectors in Four Variables

. Relationships Among Bounds for the Region of Entropic Vectors in Four Variables . Relationships Among Bounds for the Region of Entropic Vectors in Four Variables John MacLaren Walsh and Steven Weber Department of Electrical and Computer Engineering, Drexel University, Philadelphia,

More information

Generalized Network Sharing Outer Bound and the Two-Unicast Problem

Generalized Network Sharing Outer Bound and the Two-Unicast Problem Generalized Network Sharing Outer Bound and the Two-Unicast Problem Sudeep U. Kamath, David N. C. Tse and Venkat Anantharam Wireless Foundations, Dept of EECS, University of California at Berkeley, Berkeley,

More information

On Multi-source Multi-Sink Hyperedge Networks: Enumeration, Rate Region. Computation, and Hierarchy. A Thesis. Submitted to the Faculty

On Multi-source Multi-Sink Hyperedge Networks: Enumeration, Rate Region. Computation, and Hierarchy. A Thesis. Submitted to the Faculty On Multi-source Multi-Sink Hyperedge Networks: Enumeration, Rate Region Computation, and Hierarchy A Thesis Submitted to the Faculty of Drexel University by Congduan Li in partial fulfillment of the requirements

More information

Communicating the sum of sources in a 3-sources/3-terminals network

Communicating the sum of sources in a 3-sources/3-terminals network Communicating the sum of sources in a 3-sources/3-terminals network Michael Langberg Computer Science Division Open University of Israel Raanana 43107, Israel Email: mikel@openu.ac.il Aditya Ramamoorthy

More information

Network Combination Operations Preserving the Sufficiency of Linear Network Codes

Network Combination Operations Preserving the Sufficiency of Linear Network Codes Network Combination Operations Preserving the Sufficiency of Linear Network Codes Congduan Li, Steven Weber, John MacLaren Walsh ECE Department, Drexel University Philadelphia, PA 904 Abstract Operations

More information

2014 IEEE International Symposium on Information Theory. Two-unicast is hard. David N.C. Tse

2014 IEEE International Symposium on Information Theory. Two-unicast is hard. David N.C. Tse Two-unicast is hard Sudeep Kamath ECE Department, University of California, San Diego, CA, USA sukamath@ucsd.edu David N.C. Tse EECS Department, University of California, Berkeley, CA, USA dtse@eecs.berkeley.edu

More information

Characterizing the Region of Entropic Vectors via Information Geometry

Characterizing the Region of Entropic Vectors via Information Geometry Characterizing the Region of Entropic Vectors via Information Geometry John MacLaren Walsh Department of Electrical and Computer Engineering Drexel University Philadelphia, PA jwalsh@ece.drexel.edu Thanks

More information

Symmetry in Network Coding

Symmetry in Network Coding Symmetry in Network Coding Formalization, Graph-theoretic Characterization, and Computation Jayant Apte John Walsh Department of Electrical and Computer Engineering Drexel University, Philadelphia ISIT,

More information

Bounding the Entropic Region via Information Geometry

Bounding the Entropic Region via Information Geometry Bounding the ntropic Region via Information Geometry Yunshu Liu John MacLaren Walsh Dept. of C, Drexel University, Philadelphia, PA 19104, USA yunshu.liu@drexel.edu jwalsh@coe.drexel.edu Abstract This

More information

Exploiting Symmetry in Computing Polyhedral Bounds on Network Coding Rate Regions

Exploiting Symmetry in Computing Polyhedral Bounds on Network Coding Rate Regions Exploiting Symmetry in Computing Polyhedral Bounds on Network Coding Rate Regions Jayant Apte John Walsh Department of Electrical and Computer Engineering Drexel University, Philadelphia NetCod, 205 NSF

More information

Non-isomorphic Distribution Supports for Calculating Entropic Vectors

Non-isomorphic Distribution Supports for Calculating Entropic Vectors Non-isomorphic Distribution Supports for Calculating Entropic Vectors Yunshu Liu & John MacLaren Walsh Adaptive Signal Processing and Information Theory Research Group Department of Electrical and Computer

More information

A new computational approach for determining rate regions and optimal codes for coded networks

A new computational approach for determining rate regions and optimal codes for coded networks A new computational approach for determining rate regions and optimal codes for coded networks Congduan Li, Jayant Apte, John MacLaren Walsh, Steven Weber Drexel University, Dept. of ECE, Philadelphia,

More information

The Capacity Region for Multi-source Multi-sink Network Coding

The Capacity Region for Multi-source Multi-sink Network Coding The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.

More information

Entropy Vectors and Network Information Theory

Entropy Vectors and Network Information Theory Entropy Vectors and Network Information Theory Sormeh Shadbakht and Babak Hassibi Department of Electrical Engineering Caltech Lee Center Workshop May 25, 2007 1 Sormeh Shadbakht and Babak Hassibi Entropy

More information

Analyzing Large Communication Networks

Analyzing Large Communication Networks Analyzing Large Communication Networks Shirin Jalali joint work with Michelle Effros and Tracey Ho Dec. 2015 1 The gap Fundamental questions: i. What is the best achievable performance? ii. How to communicate

More information

Fundamental rate delay tradeoffs in multipath routed and network coded networks

Fundamental rate delay tradeoffs in multipath routed and network coded networks Fundamental rate delay tradeoffs in multipath routed and network coded networks John Walsh and Steven Weber Drexel University, Dept of ECE Philadelphia, PA 94 {jwalsh,sweber}@ecedrexeledu IP networks subject

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

Symmetries in the Entropy Space

Symmetries in the Entropy Space Symmetries in the Entropy Space Jayant Apte, Qi Chen, John MacLaren Walsh Abstract This paper investigates when Shannon-type inequalities completely characterize the part of the closure of the entropy

More information

Robust Network Codes for Unicast Connections: A Case Study

Robust Network Codes for Unicast Connections: A Case Study Robust Network Codes for Unicast Connections: A Case Study Salim Y. El Rouayheb, Alex Sprintson, and Costas Georghiades Department of Electrical and Computer Engineering Texas A&M University College Station,

More information

A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality

A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality 0 IEEE Information Theory Workshop A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California

More information

Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function

Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function Dinesh Krithivasan and S. Sandeep Pradhan Department of Electrical Engineering and Computer Science,

More information

Matroid Bounds on the Region of Entropic Vectors

Matroid Bounds on the Region of Entropic Vectors Matroid Bounds on the Region of Entropic Vectors Congduan Li, John MacLaren Walsh, Steven Weber Drexel University, Dept of ECE, Philadelphia, PA 19104, USA congduanli@drexeledu, jwalsh@coedrexeledu, sweber@coedrexeledu

More information

Random Linear Intersession Network Coding With Selective Cancelling

Random Linear Intersession Network Coding With Selective Cancelling 2009 IEEE Information Theory Workshop Random Linear Intersession Network Coding With Selective Cancelling Chih-Chun Wang Center of Wireless Systems and Applications (CWSA) School of ECE, Purdue University

More information

Entropic Vectors: Polyhedral Computation & Information Geometry

Entropic Vectors: Polyhedral Computation & Information Geometry Entropic Vectors: Polyhedral Computation & Information Geometry John MacLaren Walsh Department of Electrical and Computer Engineering Drexel University Philadelphia, PA jwalsh@ece.drexel.edu Thanks to

More information

Characteristic-Dependent Linear Rank Inequalities and Network Coding Applications

Characteristic-Dependent Linear Rank Inequalities and Network Coding Applications haracteristic-ependent Linear Rank Inequalities and Network oding pplications Randall ougherty, Eric Freiling, and Kenneth Zeger bstract Two characteristic-dependent linear rank inequalities are given

More information

Extremal Entropy: Information Geometry, Numerical Entropy Mapping, and Machine Learning Application of Associated Conditional Independences.

Extremal Entropy: Information Geometry, Numerical Entropy Mapping, and Machine Learning Application of Associated Conditional Independences. Extremal Entropy: Information Geometry, Numerical Entropy Mapping, and Machine Learning Application of Associated Conditional Independences A Thesis Submitted to the Faculty of Drexel University by Yunshu

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

Alphabet Size Reduction for Secure Network Coding: A Graph Theoretic Approach

Alphabet Size Reduction for Secure Network Coding: A Graph Theoretic Approach ALPHABET SIZE REDUCTION FOR SECURE NETWORK CODING: A GRAPH THEORETIC APPROACH 1 Alphabet Size Reduction for Secure Network Coding: A Graph Theoretic Approach Xuan Guang, Member, IEEE, and Raymond W. Yeung,

More information

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:

More information

Sum-networks from undirected graphs: Construction and capacity analysis

Sum-networks from undirected graphs: Construction and capacity analysis Electrical and Computer Engineering Conference Papers Posters and Presentations Electrical and Computer Engineering 014 Sum-networks from undirected graphs: Construction and capacity analysis Ardhendu

More information

Reverse Edge Cut-Set Bounds for Secure Network Coding

Reverse Edge Cut-Set Bounds for Secure Network Coding Reverse Edge Cut-Set Bounds for Secure Network Coding Wentao Huang and Tracey Ho California Institute of Technology Michael Langberg University at Buffalo, SUNY Joerg Kliewer New Jersey Institute of Technology

More information

On Scalable Source Coding for Multiple Decoders with Side Information

On Scalable Source Coding for Multiple Decoders with Side Information On Scalable Source Coding for Multiple Decoders with Side Information Chao Tian School of Computer and Communication Sciences Laboratory for Information and Communication Systems (LICOS), EPFL, Lausanne,

More information

An Equivalence between Network Coding and Index Coding

An Equivalence between Network Coding and Index Coding An Equivalence between Network Coding and Index Coding Michelle Effros, Fellow, IEEE, Salim El Rouayheb, Member, IEEE, Michael Langberg, Member, IEEE 1 Abstract We show that the network coding and index

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

Linear Codes, Target Function Classes, and Network Computing Capacity

Linear Codes, Target Function Classes, and Network Computing Capacity Linear Codes, Target Function Classes, and Network Computing Capacity Rathinakumar Appuswamy, Massimo Franceschetti, Nikhil Karamchandani, and Kenneth Zeger IEEE Transactions on Information Theory Submitted:

More information

An Ins t Ins an t t an Primer

An Ins t Ins an t t an Primer An Instant Primer Links from Course Web Page Network Coding: An Instant Primer Fragouli, Boudec, and Widmer. Network Coding an Introduction Koetter and Medard On Randomized Network Coding Ho, Medard, Shi,

More information

Quasi-linear Network Coding

Quasi-linear Network Coding Quasi-linear Network Coding Moshe Schwartz Electrical and Computer Engineering Ben-Gurion University of the Negev Beer Sheva 8410501, Israel schwartz@ee.bgu.ac.il Abstract We present a heuristic for designing

More information

Beyond the Butterfly A Graph-Theoretic Characterization of the Feasibility of Network Coding with Two Simple Unicast Sessions

Beyond the Butterfly A Graph-Theoretic Characterization of the Feasibility of Network Coding with Two Simple Unicast Sessions Beyond the Butterfly A Graph-Theoretic Characterization of the Feasibility of Network Coding with Two Simple Unicast Sessions Chih-Chun Wang Center for Wireless Systems and Applications (CWSA) School of

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

Networks and Matroids

Networks and Matroids Adaptive Signal Processing and Information Theory Research Group ECE Department, Drexel University June 1, 2012 Outline 1 Mappings and Definitions Source Mapping: S : V 2 µ, µ is message set. Message Assignment:

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

Reliable Computation over Multiple-Access Channels

Reliable Computation over Multiple-Access Channels Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,

More information

Network Routing Capacity

Network Routing Capacity 1 Network Routing Capacity Jillian Cannons (University of California, San Diego) Randy Dougherty (Center for Communications Research, La Jolla) Chris Freiling (California State University, San Bernardino)

More information

Lecture 7. Union bound for reducing M-ary to binary hypothesis testing

Lecture 7. Union bound for reducing M-ary to binary hypothesis testing Lecture 7 Agenda for the lecture M-ary hypothesis testing and the MAP rule Union bound for reducing M-ary to binary hypothesis testing Introduction of the channel coding problem 7.1 M-ary hypothesis testing

More information

Lossy Distributed Source Coding

Lossy Distributed Source Coding Lossy Distributed Source Coding John MacLaren Walsh, Ph.D. Multiterminal Information Theory, Spring Quarter, 202 Lossy Distributed Source Coding Problem X X 2 S {,...,2 R } S 2 {,...,2 R2 } Ẑ Ẑ 2 E d(z,n,

More information

The Sensor Reachback Problem

The Sensor Reachback Problem Submitted to the IEEE Trans. on Information Theory, November 2003. 1 The Sensor Reachback Problem João Barros Sergio D. Servetto Abstract We consider the problem of reachback communication in sensor networks.

More information

Distributed Reed-Solomon Codes

Distributed Reed-Solomon Codes Distributed Reed-Solomon Codes Farzad Parvaresh f.parvaresh@eng.ui.ac.ir University of Isfahan Institute for Network Coding CUHK, Hong Kong August 216 Research interests List-decoding of algebraic codes

More information

SHANNON S information measures refer to entropies, conditional

SHANNON S information measures refer to entropies, conditional 1924 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 43, NO. 6, NOVEMBER 1997 A Framework for Linear Information Inequalities Raymond W. Yeung, Senior Member, IEEE Abstract We present a framework for information

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

The Capacity of a Network

The Capacity of a Network The Capacity of a Network April Rasala Lehman MIT Collaborators: Nick Harvey and Robert Kleinberg MIT What is the Capacity of a Network? Source a Source b c d e Sink f Sink What is the Capacity of a Network?

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Low Complexity Encoding for Network Codes

Low Complexity Encoding for Network Codes Low Complexity Encoding for Network Codes Sidharth Jaggi 1 Laboratory of Information and Decision Sciences Massachusetts Institute of Technology Cambridge, MA 02139, USA Email: jaggi@mit.edu Yuval Cassuto

More information

EE5585 Data Compression May 2, Lecture 27

EE5585 Data Compression May 2, Lecture 27 EE5585 Data Compression May 2, 2013 Lecture 27 Instructor: Arya Mazumdar Scribe: Fangying Zhang Distributed Data Compression/Source Coding In the previous class we used a H-W table as a simple example,

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

Capacity Theorems for Distributed Index Coding

Capacity Theorems for Distributed Index Coding Capacity Theorems for Distributed Index Coding 1 Yucheng Liu, Parastoo Sadeghi, Fatemeh Arbabjolfaei, and Young-Han Kim Abstract arxiv:1801.09063v1 [cs.it] 27 Jan 2018 In index coding, a server broadcasts

More information

Symmetries in Entropy Space

Symmetries in Entropy Space 1 Symmetries in Entropy Space Jayant Apte, Qi Chen, John MacLaren Walsh Department of Electrical and Computer Engineering Drexel University Philadelphia, PA jwalsh@ece.drexel.edu Thanks to NSF CCF-1421828

More information

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,

More information

On queueing in coded networks queue size follows degrees of freedom

On queueing in coded networks queue size follows degrees of freedom On queueing in coded networks queue size follows degrees of freedom Jay Kumar Sundararajan, Devavrat Shah, Muriel Médard Laboratory for Information and Decision Systems, Massachusetts Institute of Technology,

More information

On Network Interference Management

On Network Interference Management On Network Interference Management Aleksandar Jovičić, Hua Wang and Pramod Viswanath March 3, 2008 Abstract We study two building-block models of interference-limited wireless networks, motivated by the

More information

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers

On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers On The Binary Lossless Many-Help-One Problem with Independently Degraded Helpers Albrecht Wolf, Diana Cristina González, Meik Dörpinghaus, José Cândido Silveira Santos Filho, and Gerhard Fettweis Vodafone

More information

Optimal matching in wireless sensor networks

Optimal matching in wireless sensor networks Optimal matching in wireless sensor networks A. Roumy, D. Gesbert INRIA-IRISA, Rennes, France. Institute Eurecom, Sophia Antipolis, France. Abstract We investigate the design of a wireless sensor network

More information

Distributed Lossy Interactive Function Computation

Distributed Lossy Interactive Function Computation Distributed Lossy Interactive Function Computation Solmaz Torabi & John MacLaren Walsh Dept. of Electrical and Computer Engineering Drexel University Philadelphia, PA 19104 Email: solmaz.t@drexel.edu &

More information

The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner, Member, IEEE

The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner, Member, IEEE 564 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 1, JANUARY 2010 The Gaussian Many-Help-One Distributed Source Coding Problem Saurabha Tavildar, Pramod Viswanath, Member, IEEE, and Aaron B. Wagner,

More information

IN FUTURE communication networks, say next-generation

IN FUTURE communication networks, say next-generation IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, VOL. 36, NO. 4, APRIL 2018 737 Fundamental Limits on a Class of Secure Asymmetric Multilevel Diversity Coding Systems Congduan Li, Member, IEEE, Xuan Guang,

More information

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits

More information

A Numerical Study on the Wiretap Network with a Simple Network Topology

A Numerical Study on the Wiretap Network with a Simple Network Topology 1 A Numerical Study on the Wiretap Network with a Simple Network Topology Fan Cheng, Member, IEEE and Vincent Y. F. Tan, Senior Member, IEEE arxiv:1505.02862v3 [cs.it] 15 Jan 2016 Fig. 1. Abstract In this

More information

Distributed Functional Compression through Graph Coloring

Distributed Functional Compression through Graph Coloring Distributed Functional Compression through Graph Coloring Vishal Doshi, Devavrat Shah, Muriel Médard, and Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology

More information

Multiple Access Network Information-flow And Correction codes

Multiple Access Network Information-flow And Correction codes Multiple Access Network Information-flow And Correction codes Hongyi Yao 1, Theodoros K. Dikaliotis, Sidharth Jaggi, Tracey Ho 1 Tsinghua University California Institute of Technology Chinese University

More information

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic

More information

Network Routing Capacity

Network Routing Capacity 1 Network Routing Capacity Jillian Cannons (University of California, San Diego) Randy Dougherty (Center for Communications Research, La Jolla) Chris Freiling (California State University, San Bernardino)

More information

Code Construction for Two-Source Interference Networks

Code Construction for Two-Source Interference Networks Code Construction for Two-Source Interference Networks Elona Erez and Meir Feder Dept. of Electrical Engineering-Systems, Tel Aviv University, Tel Aviv, 69978, Israel, E-mail:{elona, meir}@eng.tau.ac.il

More information

THE idea of network coding over error-free networks,

THE idea of network coding over error-free networks, Path Gain Algebraic Formulation for the Scalar Linear Network Coding Problem Abhay T. Subramanian and Andrew Thangaraj, Member, IEEE arxiv:080.58v [cs.it] 9 Jun 00 Abstract In the algebraic view, the solution

More information

Distributed Decoding of Convolutional Network Error Correction Codes

Distributed Decoding of Convolutional Network Error Correction Codes 1 Distributed Decoding of Convolutional Network Error Correction Codes Hengjie Yang and Wangmei Guo arxiv:1701.06283v2 [cs.it] 18 Feb 2017 Abstract A Viterbi-like decoding algorithm is proposed in this

More information

Simplified Composite Coding for Index Coding

Simplified Composite Coding for Index Coding Simplified Composite Coding for Index Coding Yucheng Liu, Parastoo Sadeghi Research School of Engineering Australian National University {yucheng.liu, parastoo.sadeghi}@anu.edu.au Fatemeh Arbabjolfaei,

More information

Source Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime

Source Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime Source Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime Solmaz Torabi Dept. of Electrical and Computer Engineering Drexel University st669@drexel.edu Advisor:

More information

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University

More information

Characterising Probability Distributions via Entropies

Characterising Probability Distributions via Entropies 1 Characterising Probability Distributions via Entropies Satyajit Thakor, Terence Chan and Alex Grant Indian Institute of Technology Mandi University of South Australia Myriota Pty Ltd arxiv:1602.03618v2

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

Network Coding for Computing

Network Coding for Computing Networ Coding for Computing Rathinaumar Appuswamy, Massimo Franceschetti, Nihil Karamchandani, and Kenneth Zeger Abstract The following networ computation problem is considered A set of source nodes in

More information

Variable Length Codes for Degraded Broadcast Channels

Variable Length Codes for Degraded Broadcast Channels Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates

More information

Graph-based codes for flash memory

Graph-based codes for flash memory 1/28 Graph-based codes for flash memory Discrete Mathematics Seminar September 3, 2013 Katie Haymaker Joint work with Professor Christine Kelley University of Nebraska-Lincoln 2/28 Outline 1 Background

More information

Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach

Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach Linear Exact Repair Rate Region of (k + 1, k, k) Distributed Storage Systems: A New Approach Mehran Elyasi Department of ECE University of Minnesota melyasi@umn.edu Soheil Mohajer Department of ECE University

More information

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12

Lecture 1: The Multiple Access Channel. Copyright G. Caire 12 Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user

More information

Design of linear Boolean network codes for combination networks

Design of linear Boolean network codes for combination networks Louisiana State University LSU Digital Commons LSU Master's Theses Graduate School 2006 Design of linear Boolean network codes for combination networks Shoupei Li Louisiana State University and Agricultural

More information

Energy State Amplification in an Energy Harvesting Communication System

Energy State Amplification in an Energy Harvesting Communication System Energy State Amplification in an Energy Harvesting Communication System Omur Ozel Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland College Park, MD 20742 omur@umd.edu

More information

Network Control: A Rate-Distortion Perspective

Network Control: A Rate-Distortion Perspective Network Control: A Rate-Distortion Perspective Jubin Jose and Sriram Vishwanath Dept. of Electrical and Computer Engineering The University of Texas at Austin {jubin, sriram}@austin.utexas.edu arxiv:8.44v2

More information

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014 Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2

More information

IN this paper, we show that the scalar Gaussian multiple-access

IN this paper, we show that the scalar Gaussian multiple-access 768 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 5, MAY 2004 On the Duality of Gaussian Multiple-Access and Broadcast Channels Nihar Jindal, Student Member, IEEE, Sriram Vishwanath, and Andrea

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Online Packet Routing on Linear Arrays and Rings

Online Packet Routing on Linear Arrays and Rings Proc. 28th ICALP, LNCS 2076, pp. 773-784, 2001 Online Packet Routing on Linear Arrays and Rings Jessen T. Havill Department of Mathematics and Computer Science Denison University Granville, OH 43023 USA

More information

Throughput-Delay Analysis of Random Linear Network Coding for Wireless Broadcasting

Throughput-Delay Analysis of Random Linear Network Coding for Wireless Broadcasting Throughput-Delay Analysis of Random Linear Network Coding for Wireless Broadcasting Swapna B.T., Atilla Eryilmaz, and Ness B. Shroff Departments of ECE and CSE The Ohio State University Columbus, OH 43210

More information

The Gallager Converse

The Gallager Converse The Gallager Converse Abbas El Gamal Director, Information Systems Laboratory Department of Electrical Engineering Stanford University Gallager s 75th Birthday 1 Information Theoretic Limits Establishing

More information

PERFECTLY secure key agreement has been studied recently

PERFECTLY secure key agreement has been studied recently IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 2, MARCH 1999 499 Unconditionally Secure Key Agreement the Intrinsic Conditional Information Ueli M. Maurer, Senior Member, IEEE, Stefan Wolf Abstract

More information

On Source-Channel Communication in Networks

On Source-Channel Communication in Networks On Source-Channel Communication in Networks Michael Gastpar Department of EECS University of California, Berkeley gastpar@eecs.berkeley.edu DIMACS: March 17, 2003. Outline 1. Source-Channel Communication

More information