The Capacity Region for Multi-source Multi-sink Network Coding

Xijin Yan (Dept. of Electrical Engineering - Systems, University of Southern California, Los Angeles, CA, U.S.A.; xyan@usc.edu), Raymond W. Yeung (Dept. of Information Engineering, The Chinese University of Hong Kong, N.T., Hong Kong; whyeung@ie.cuhk.edu.hk), and Zhen Zhang (Dept. of Electrical Engineering - Systems, University of Southern California, Los Angeles, CA, U.S.A.; zzhang@commsci1.usc.edu)

Abstract: The capacity problem for general acyclic multi-source multi-sink networks with arbitrary transmission requirements has been studied in [1], [2] and [3]. Specifically, inner and outer bounds on the capacity region were derived in terms of Γ*_N and cl(Γ*_N), the fundamental regions of the entropy function. In this paper, we show that by carefully bounding the constrained regions in the entropy space, we obtain an exact characterization of the capacity region, thus closing the existing gap between the above inner and outer bounds.

I. INTRODUCTION

Consider a multi-source multi-sink network in which two or more mutually independent information sources are generated at possibly different nodes, and each information source is multicast to a specific set of sink nodes. We assume the network is acyclic and the channels are free of error. Unlike the single-source multicast network coding problem, whose capacity region has an explicit max-flow min-cut characterization [4], the counterpart problem for multi-source multi-sink network coding with arbitrary transmission requirements is considerably more complex. Aside from a few explicit outer bounds discovered recently in [2], [5], [6] and [7], the tightest theoretical characterization obtained so far makes use of the tools developed in the theory of information inequalities [1]. Specifically, an inner bound and an outer bound were derived in terms of Γ*_N and cl(Γ*_N) respectively, which are fundamental regions of the entropy function.

In this paper, we determine the exact capacity region for general acyclic multi-source multi-sink networks using an entropy function characterization. In particular, we show that by carefully bounding the constrained regions in the entropy space, we obtain the exact characterization of the capacity region, thus closing the existing gap between the above inner and outer bounds.

The rest of the paper is organized as follows. In Section II, we present a formal problem formulation and introduce some preliminaries on strongly typical sequences and entropy functions. In Section III, we present the main result, i.e., an exact characterization of the capacity region. The proof of the main result is given in Section IV, and conclusions are drawn in Section V.

II. PRELIMINARIES

A. Network Model

Let G = (V, E) denote an acyclic multi-source multi-sink communication network, where V and E are the set of all nodes and the set of all channels, respectively. We assume each channel e ∈ E is error-free with a positive capacity constraint R_e, and that the nodes i ∈ V are ordered such that if there exists a channel from node i to node j, then node i precedes node j. We further define In(i) = {(j, i) ∈ E : j ∈ V} to be the set of channels directed into node i, and Out(i) = {(i, j) ∈ E : j ∈ V} to be the set of channels directed out of node i. Let S ⊂ V be the set of all source nodes and T ⊂ V be the set of all sink nodes. Without loss of generality, we assume G is structured so that each source node has no input channels and each sink node has no outgoing channels.
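This network model is concrete enough to code against. The following is a minimal Python sketch (all names are hypothetical, not from the paper) of an acyclic network with topologically ordered nodes and per-channel capacities, from which In(i) and Out(i) are derived; the classic butterfly network serves as the usage example.

```python
from collections import defaultdict

# A minimal sketch of the network model of Section II-A (hypothetical names).
# Nodes are assumed to be listed in topological order; each channel e = (i, j)
# carries a positive capacity R_e.
class AcyclicNetwork:
    def __init__(self, nodes, channels, sources, sinks):
        self.nodes = nodes                  # topologically ordered list of nodes
        self.capacity = dict(channels)      # {(i, j): R_e}
        self.sources = set(sources)         # S
        self.sinks = set(sinks)             # T
        self.ins = defaultdict(set)         # In(i): channels directed into i
        self.outs = defaultdict(set)        # Out(i): channels directed out of i
        for (i, j) in self.capacity:
            assert nodes.index(i) < nodes.index(j), "channels must respect the node order"
            self.outs[i].add((i, j))
            self.ins[j].add((i, j))

# The butterfly network: source s multicasts to sinks t1 and t2.
net = AcyclicNetwork(
    nodes=["s", "a", "b", "c", "d", "t1", "t2"],
    channels={("s", "a"): 1, ("s", "b"): 1, ("a", "c"): 1, ("b", "c"): 1,
              ("a", "t1"): 1, ("b", "t2"): 1, ("c", "d"): 1,
              ("d", "t1"): 1, ("d", "t2"): 1},
    sources=["s"], sinks=["t1", "t2"],
)
print(net.ins["t1"])   # {('a', 't1'), ('d', 't1')}
```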
We assume all sources are uniformly distributed and mutually independent, with finite alphabets X_s = {1, 2, ..., 2^{nτ_s}} for all s ∈ S, where τ_s is the source information rate at s ∈ S. A sink node t ∈ T requires the data from a set of sources β(t) ⊆ S to be decoded. In the case when β(t) = S for all t ∈ T, the given network can simply be treated as a single-source multicast network [4]. To allow for a general treatment of networks with arbitrary transmission requirements, we assume that β(t) can be any subset of S for all t ∈ T.

For clarity of notation, we sometimes use a superscript on a vector (e.g., x^n) to specify its dimension, which is to be distinguished from a superscript in parentheses (e.g., x^{(k)}) used to specify the index of a vector in a sequence (e.g., x^{(1)}, x^{(2)}, ...). The complement of a set A is denoted by A^c. For consistency, all logarithms are in base 2.

B. Capacity Region

Consider a block code of length n.

Definition 1: An (n, (η_e : e ∈ E), (τ_s : s ∈ S), (Δ_t : t ∈ T)) block code of length n on a given communication network is defined by:

1) for every source node s ∈ S and every channel e ∈ Out(s), a local encoding mapping

k_e : X_s → {0, 1, ..., η_e}; (1)

2) for every node i ∈ V \ (S ∪ T) and every channel e ∈ Out(i), a local encoding mapping

k_e : ∏_{d∈In(i)} {0, 1, ..., η_d} → {0, 1, ..., η_e}; (2)

3) for every sink node t ∈ T, a decoding mapping

g_t : ∏_{d∈In(t)} {0, 1, ..., η_d} → ∏_{s∈β(t)} X_s; (3)

4) for every sink node t ∈ T, a decoding error probability

Δ_t = Pr{g_t(X_S) ≠ X_{β(t)}}, (4)

where g_t(X_S) is the value of g_t as a function of X_S.

Definition 2: An information rate tuple ω = (ω_s : s ∈ S), ω ≥ 0 (componentwise), is achievable if for any ε > 0 there exists, for sufficiently large n, an (n, (η_e : e ∈ E), (τ_s : s ∈ S), (Δ_t : t ∈ T)) code such that

n^{-1} log η_e ≤ R_e + ε (5)

for all e ∈ E,

τ_s ≥ ω_s − ε (6)

for all s ∈ S, and

Δ_t ≤ ε (7)

for all t ∈ T.

Definition 3: The capacity region, denoted by R, is the set of all achievable information rate tuples ω.

C. Strongly Typical Sequences

Consider an information source {X_k : k ≥ 1} where the X_k are i.i.d. with probability distribution p(x). Let X denote the generic random variable with H(X) < ∞, and let S_X be the support of X.

Definition 4: The strongly δ-typical set T^n_{[X]δ} with respect to p(x) is the set of sequences x^n = (x_1, x_2, ..., x_n) ∈ X^n such that N(x; x^n) = 0 for x ∉ S_X, and

Σ_x | (1/n) N(x; x^n) − p(x) | ≤ δ,

where N(x; x^n) is the number of occurrences of x in x^n, and δ is an arbitrarily small positive real number.

Lemma 1 (Strong AEP): Let η be a small positive quantity such that η → 0 as δ → 0.

1) If x^n ∈ T^n_{[X]δ}, then

2^{−n(H(X)+η)} ≤ p(x^n) ≤ 2^{−n(H(X)−η)}. (8)

2) For n sufficiently large,

Pr{X^n ∈ T^n_{[X]δ}} > 1 − δ. (9)

3) For n sufficiently large,

(1 − δ) 2^{n(H(X)−η)} ≤ |T^n_{[X]δ}| ≤ 2^{n(H(X)+η)}. (10)

Now consider an i.i.d. bivariate information source {(X_k, Y_k) : k ≥ 1} with probability distribution p(x, y). Let (X, Y) denote the pair of generic random variables, with H(X, Y) < ∞.

Definition 5: The strongly jointly δ-typical set T^n_{[XY]δ} with respect to p(x, y) is the set of (x^n, y^n) ∈ X^n × Y^n such that N(x, y; x^n, y^n) = 0 for (x, y) ∉ S_{XY}, and

Σ_x Σ_y | (1/n) N(x, y; x^n, y^n) − p(x, y) | ≤ δ,

where N(x, y; x^n, y^n) is the number of occurrences of (x, y) in (x^n, y^n), and δ is an arbitrarily small positive real number.

Strong typicality satisfies the following properties.

Lemma 2 (Consistency): If (x^n, y^n) ∈ T^n_{[XY]δ}, then x^n ∈ T^n_{[X]δ} and y^n ∈ T^n_{[Y]δ}.

Lemma 3 (Preservation): Let Y = f(X). If x^n = (x_1, x_2, ..., x_n) ∈ T^n_{[X]δ}, then f(x^n) = (y_1, y_2, ..., y_n) ∈ T^n_{[Y]δ}, where y_i = f(x_i) for 1 ≤ i ≤ n.

Lemma 4 (Strong JAEP): Let (X^n, Y^n) = ((X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n)), where the (X_i, Y_i) are i.i.d. with generic pair of random variables (X, Y). Let λ be a small positive quantity such that λ → 0 as δ → 0.

1) If (x^n, y^n) ∈ T^n_{[XY]δ}, then

2^{−n(H(X,Y)+λ)} ≤ p(x^n, y^n) ≤ 2^{−n(H(X,Y)−λ)}. (11)

2) For n sufficiently large,

Pr{(X^n, Y^n) ∈ T^n_{[XY]δ}} > 1 − δ. (12)

3) For n sufficiently large,

(1 − δ) 2^{n(H(X,Y)−λ)} ≤ |T^n_{[XY]δ}| ≤ 2^{n(H(X,Y)+λ)}. (13)

Lemma 5: For any x^n ∈ T^n_{[X]δ}, define

T^n_{[Y|X]δ}(x^n) = {y^n ∈ T^n_{[Y]δ} : (x^n, y^n) ∈ T^n_{[XY]δ}}. (14)

If |T^n_{[Y|X]δ}(x^n)| ≥ 1, then

2^{n(H(Y|X)−γ)} ≤ |T^n_{[Y|X]δ}(x^n)| ≤ 2^{n(H(Y|X)+γ)}, (15)

where γ → 0 as n → ∞ and δ → 0.

The generalization to a multivariate distribution is straightforward. A more thorough introduction to strongly δ-typical sequences can be found in [2], Chapter 5.
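As a small illustration of Definition 4, the following Python sketch (a hypothetical helper, assuming a finite alphabet given as a pmf dictionary) tests strong δ-typicality by comparing empirical frequencies with p(x).

```python
from collections import Counter

def is_strongly_typical(xn, p, delta):
    """Test x^n against Definition 4: zero counts outside the support of p,
    and total deviation of the empirical distribution from p at most delta."""
    counts = Counter(xn)
    n = len(xn)
    if any(x not in p for x in counts):      # N(x; x^n) must be 0 off the support
        return False
    return sum(abs(counts.get(x, 0) / n - px) for x, px in p.items()) <= delta

p = {"a": 0.5, "b": 0.25, "c": 0.25}
print(is_strongly_typical("aabc" * 25, p, delta=0.05))   # True: empirical pmf equals p
print(is_strongly_typical("a" * 100, p, delta=0.05))     # False: far from p
```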
D. The Region Γ*_N

Let N be a nonempty set of random variables and let Q_N = 2^N \ {∅}, with cardinality |Q_N| = 2^{|N|} − 1. Let H_N be the |Q_N|-dimensional Euclidean space with coordinates labeled by h_A, A ∈ Q_N. A vector h = (h_A : A ∈ Q_N) in H_N is said to be an entropy function if there exists a joint distribution for all X ∈ N such that h_A = H(X : X ∈ A) for all A ∈ Q_N. We then define the region

Γ*_N = {h ∈ H_N : h is an entropy function}. (16)

Therefore, by the above definition, there exists a one-to-one mapping between each vector h in Γ*_N and some set of random variables whose joint entropies correspond to the elements of h.
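To make the coordinates of H_N concrete, the sketch below (hypothetical code, not from the paper) computes the entropy function h_A = H(X : X ∈ A) for every nonempty A ⊆ N from a toy joint distribution; the resulting vector is, by construction, a point of Γ*_N.

```python
from itertools import chain, combinations
from math import log2

def entropy_vector(joint, names):
    """Compute h_A = H(X : X in A) for every nonempty A of N, from a joint pmf
    given as {outcome_tuple: probability}; returns {frozenset(A): h_A}."""
    h = {}
    idx = {name: i for i, name in enumerate(names)}
    subsets = chain.from_iterable(combinations(names, r) for r in range(1, len(names) + 1))
    for A in subsets:
        marginal = {}
        for outcome, prob in joint.items():
            key = tuple(outcome[idx[v]] for v in A)   # project the outcome onto A
            marginal[key] = marginal.get(key, 0.0) + prob
        h[frozenset(A)] = -sum(p * log2(p) for p in marginal.values() if p > 0)
    return h

# X uniform on {0, 1} and Y = X: h_X = h_Y = h_XY = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(entropy_vector(joint, ["X", "Y"]))
```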

Since an arbitrary information inequality (equality) can be regarded as a half-space (hyperplane) in H_N, it cuts Γ*_N into a subregion that maps to all sets of random variables possessing this property.

Lemma 6 (Basic properties of Γ*_N):
1) Γ*_N contains the origin.
2) cl(Γ*_N), the closure of Γ*_N, is convex.
3) Γ*_N is in the nonnegative orthant of the space H_N, i.e., Γ*_N ⊆ {h ∈ H_N : h_A ≥ 0 for all A ∈ Q_N}.

III. MAIN RESULT

Consider the set of all information rate tuples ω such that there exist auxiliary random variables Y_s, s ∈ S, and U_e, e ∈ E, satisfying the following conditions:

H(Y_s) ≥ ω_s, s ∈ S (17)
H(Y_S) = Σ_{s∈S} H(Y_s) (18)
H(U_{Out(s)} | Y_s) = 0, s ∈ S (19)
H(U_{Out(i)} | U_{In(i)}) = 0, i ∈ V \ (S ∪ T) (20)
H(U_e) ≤ R_e, e ∈ E (21)
H(Y_{β(t)} | U_{In(t)}) = 0, t ∈ T, (22)

where U_e is an auxiliary random variable associated with the codeword sent on channel e, and Y_S, U_{In(i)}, etc. denote the sets {Y_s : s ∈ S}, {U_e : e ∈ In(i)}, and so on.

For a given acyclic multi-source multi-sink network G, let N = {Y_s : s ∈ S; U_e : e ∈ E} and define the following constrained regions in H_N:

C_1 = {h ∈ H_N : h_{Y_S} = Σ_{s∈S} h_{Y_s}} (23)
C_2 = {h ∈ H_N : h_{U_{Out(s)} | Y_s} = 0, s ∈ S} (24)
C_3 = {h ∈ H_N : h_{U_{Out(i)} | U_{In(i)}} = 0, i ∈ V \ (S ∪ T)} (25)
C_4 = {h ∈ H_N : h_{U_e} ≤ R_e, e ∈ E} (26)
C_5 = {h ∈ H_N : h_{Y_{β(t)} | U_{In(t)}} = 0, t ∈ T}, (27)

where we use the notations h_{A|A'}, h_{AA'} and h_A for H(A | A'), H(A, A') and H(A), respectively, for brevity. Clearly, (23) to (27) are the corresponding forms of (18) to (22) in H_N.

Let con(Γ*_N) be the convex hull of Γ*_N. By time sharing, for any h ∈ con(Λ), Λ ⊆ Γ*_N, there exists a random variable W and a set of jointly distributed random variables {X : X ∈ N} such that for any A ∈ Q_N, h_{A|W} = H(X : X ∈ A | W).

Let C_α = ∩_{i∈α} C_i, α ⊆ {1, 2, 3, 4, 5}. We have the following theorem.

Theorem 1: The capacity region of an arbitrary acyclic multi-source multi-sink network is characterized by

R = Λ(proj_{Y_S}(cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5)), (28)

where for any A ⊆ H_N, proj_{Y_S}(A) = {(h_{Y_s} : s ∈ S) : h ∈ A} is the projection of A on the coordinates h_{Y_s}, s ∈ S; Λ(A) = {h : 0 ≤ h ≤ h' componentwise for some h' ∈ A}; and cl(A) denotes the closure of the region A.
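Note that while Γ*_N itself has no known computable characterization, the constraint sets (23) to (27) are linear conditions on the coordinates of a given entropy vector, so membership in C_1, ..., C_5 is mechanically checkable. Below is a minimal sketch (hypothetical code, reusing the AcyclicNetwork class and the frozenset-indexed entropy vectors from the earlier sketches).

```python
def h_cond(h, A, B):
    """h_{A|B} = h_{A∪B} - h_B, with h_∅ = 0, for h indexed by frozensets."""
    A, B = frozenset(A), frozenset(B)
    return h[A | B] - (h[B] if B else 0.0)

def in_constraint_regions(h, net, beta, tol=1e-9):
    """Check an entropy vector h against (23)-(27) for network `net` with
    demand map `beta`; variables are named 'Y_s' and 'U_e' (hypothetical)."""
    Y = lambda s: f"Y_{s}"
    U = lambda e: f"U_{e}"
    all_Y = [Y(s) for s in net.sources]
    ok = abs(h[frozenset(all_Y)] - sum(h[frozenset([y])] for y in all_Y)) < tol  # C_1
    for s in net.sources:                                                        # C_2
        ok &= h_cond(h, [U(e) for e in net.outs[s]], [Y(s)]) < tol
    for i in net.nodes:                                                          # C_3
        if i not in net.sources and i not in net.sinks:
            ok &= h_cond(h, [U(e) for e in net.outs[i]],
                         [U(e) for e in net.ins[i]]) < tol
    for e, R_e in net.capacity.items():                                          # C_4
        ok &= h[frozenset([U(e)])] <= R_e + tol
    for t in net.sinks:                                                          # C_5
        ok &= h_cond(h, [Y(s) for s in beta[t]], [U(e) for e in net.ins[t]]) < tol
    return ok
```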
IV. PROOF OF THEOREM 1

A. Proof of Achievability

Let ω ∈ proj_{Y_S}(cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5). Then there exists an h ∈ cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5 such that ω = proj_{Y_S}(h). This implies that there exists a sequence h^{(k)} ∈ con(Γ*_N ∩ C_123) such that h = lim_{k→∞} h^{(k)}. Note that h^{(k)} might not be in Γ*_N, and thus is not necessarily an entropy function. However, this can be resolved by time sharing. Let W^{(k)} be a time-sharing random variable; then by the definition of Γ*_N, there exists a set of random variables N^{(k)} = {W^{(k)}; Y_s^{(k)} : s ∈ S; U_e^{(k)} : e ∈ E} whose entropy function corresponds to h^{(k)}, i.e.,

H(Y_s^{(k)} | W^{(k)}) = ω_s^{(k)}, s ∈ S (29)
H(Y_S^{(k)} | W^{(k)}) = Σ_{s∈S} H(Y_s^{(k)} | W^{(k)}) (30)
H(U_{Out(s)}^{(k)} | Y_s^{(k)}, W^{(k)}) = 0, s ∈ S (31)
H(U_{Out(i)}^{(k)} | U_{In(i)}^{(k)}, W^{(k)}) = 0, i ∈ V \ (S ∪ T), (32)

where lim_{k→∞} ω_s^{(k)} = ω_s for all s ∈ S. Since h ∈ C_4 ∩ C_5 and h = lim_{k→∞} h^{(k)}, it follows that N^{(k)} must also satisfy

H(U_e^{(k)} | W^{(k)}) ≤ R_e + ε_k, e ∈ E (33)
H(Y_{β(t)}^{(k)} | U_{In(t)}^{(k)}, W^{(k)}) = δ_k, t ∈ T, (34)

where ε_k → 0 and δ_k → 0 as k → ∞. By Carathéodory's theorem, we may assume that the support S_W of W^{(k)} has size at most 2^{|N|}. Here Y_s^{(k)} and U_e^{(k)} are random variables representing the information source X_s and the codeword sent on channel e. View W^{(k)} as a time-sharing random variable: to prove the result, we construct a code for each value w of W^{(k)}, and the time sharing of these codes gives the overall performance. By (30), the Y_s^{(k)} are conditionally independent given W^{(k)} = w, and we consider n i.i.d. copies of them having generic distributions {p(y_s | w) : s ∈ S}.

For sequences Y_s^n with block length n, the corresponding joint distribution is p(y_s^n | w) = p^n(y_s | w), where y_s^n = (y_s^{(1)}, ..., y_s^{(n)}). Since we consider only the random variables in N^{(k)} in what follows, we temporarily drop the index k for convenience. Similarly, keeping in mind the conditioning on W^{(k)} = w, we drop the conditioning index w unless otherwise specified.

Now we construct a random (n, (η_e : e ∈ E), (ω_s − ε : s ∈ S), (Δ_t : t ∈ T)) code by the following procedure.

1. Codebook
a) For each source s ∈ S, generate 2^{nτ_s} sequences of length n randomly and independently according to p^n(y_s | w). We call this codebook C_s, with indices {0, 1, ..., 2^{nτ_s} − 1} and cardinality M_s = 2^{nτ_s}, where we assume 2^{nτ_s} is an integer for convenience.
b) The random codebooks C_s for the sources s ∈ S are constructed independently.
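Codebook step 1 amounts to i.i.d. sampling from p(y_s | w). A sketch follows (hypothetical names and toy parameter values, assuming a pmf dictionary as before).

```python
import random

def generate_codebook(p_y_given_w, n, M, seed=None):
    """Draw M codewords of length n i.i.d. from the pmf p_y_given_w,
    mirroring codebook step 1(a); returns a list indexed 0..M-1."""
    rng = random.Random(seed)
    symbols = list(p_y_given_w)
    weights = [p_y_given_w[y] for y in symbols]
    return [tuple(rng.choices(symbols, weights=weights, k=n)) for _ in range(M)]

# One independent codebook per source, as in step 1(b).
p = {0: 0.5, 1: 0.5}          # toy p(y_s | w), identical for every source here
n, tau = 8, 0.5               # assumed block length and source rate
M = 2 ** int(n * tau)         # M_s = 2^{n tau_s}
codebooks = {s: generate_codebook(p, n, M, seed=i)
             for i, s in enumerate(["s1", "s2"])}
print(codebooks["s1"][0])     # the codeword for message index 0 at source s1
```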

2. Encoding
a) If the message at source node s is j, map it to the j-th codeword in C_s and call this sequence y_s^n(j).
b) By (31) and (32), for each channel e ∈ Out(i), i ∈ V \ (S ∪ T), there exists a deterministic function u_e such that U_e = u_e(U_d : d ∈ In(i)). Since the network is acyclic, we can show inductively that there exists another deterministic function ũ_e such that U_e = ũ_e(Y_s, s ∈ S). Let ζ_e = |T^n_{[U_e|W=w]δ}|; by Lemma 5, we have ζ_e ≤ 2^{n(H(U_e|W=w)+η)}, where η → 0 as δ → 0. Then from (33), by time sharing the code with respect to the probability distribution of W and letting δ be sufficiently small, the average rate over channel e satisfies

H(U_e | W) + η ≤ R_e + ε_k + η. (35)

Thus for the fixed value W = w, by choosing an integer η_e such that

2^{n(H(U_e|W=w)+η)} ≤ η_e ≤ 2^{n(R_e+ε_k+η)}, (36)

we can define a local encoding function k_e such that if U_{In(i)}^n ∈ T^n_{[U_{In(i)}|W=w]δ}, then U_e^n ∈ T^n_{[U_e|W=w]δ}, and we transmit the index of U_e^n in T^n_{[U_e|W=w]δ} as the codeword on channel e. Otherwise, we transmit a zero codeword on e.

3. Decoding
For t ∈ T, define the decoding function

g_t : ∏_{d∈In(t)} {0, 1, ..., η_d} → ∏_{s∈β(t)} C_s. (37)

Let C_{β(t)} = ∏_{s∈β(t)} C_s be the joint codebook for all the sources in β(t), with cardinality M_{β(t)} = ∏_{s∈β(t)} M_s. We use the following strong typicality decoding: if the received codeword is nonzero for all d ∈ In(t) and there exists a unique codeword y_{β(t)}^n in C_{β(t)} such that (U_{In(t)}^n, y_{β(t)}^n) ∈ T^n_{[U_{In(t)}Y_{β(t)}|W=w]δ}, then let g_t(U_{In(t)}^n) = y_{β(t)}^n. Otherwise, declare a decoding error.

For the above coding procedure, assume that y_S^n is the source message from all sources, and let the codewords of C_{β(t)} be ordered as {c_{β(t)}^n(i) : i = 1, ..., M_{β(t)}}, where the joint codeword c_{β(t)}^n(i) is the concatenation of |β(t)| codewords from the C_s, s ∈ β(t). For notational convenience, we omit the dimension index on c_{β(t)}^n(i) and write c_i instead. Thus

Pr{error | W = w}
= Pr{error, y_S^n ∉ T^n_{[Y_S|W=w]δ}} + Pr{error, y_S^n ∈ T^n_{[Y_S|W=w]δ}}
≤ Pr{y_S^n ∉ T^n_{[Y_S|W=w]δ}} + Pr{error | y_S^n ∈ T^n_{[Y_S|W=w]δ}}
≤ λ_n + Pr{error | y_S^n ∈ T^n_{[Y_S|W=w]δ}}, (38)

where λ_n → 0 as n → ∞. Since y_S^n ∈ T^n_{[Y_S|W=w]δ} implies U_{In(t)}^n ∈ T^n_{[U_{In(t)}|W=w]δ}, a decoding error then occurs if and only if there is more than one codeword in C_{β(t)} satisfying the strong joint typicality condition. Let c_i ∈ C_{β(t)} be such a codeword, i.e., (c_i, U_{In(t)}^n) ∈ T^n_{[Y_{β(t)}U_{In(t)}|W=w]δ} and c_i ≠ y_{β(t)}^n. Call this event E_i. Then we have

Pr{error | y_S^n ∈ T^n_{[Y_S|W=w]δ}}
= Pr{∪_{c_i ≠ y_{β(t)}^n} E_i | y_S^n ∈ T^n_{[Y_S|W=w]δ}}
= 1 − Pr{∩_{c_i ≠ y_{β(t)}^n} E_i^c | y_S^n ∈ T^n_{[Y_S|W=w]δ}}
(a) = 1 − (1 − Pr{E_1 | y_S^n ∈ T^n_{[Y_S|W=w]δ}})^{M_{β(t)}−1}
(b) ≤ 1 − (1 − δ' 2^{−n(H(Y_{β(t)}|W=w)−δ_k−3η)})^{M_{β(t)}}
(c) ≤ M_{β(t)} δ' 2^{−n(H(Y_{β(t)}|W=w)−δ_k−3η)}
(d) = M_{β(t)} δ' 2^{−n(Σ_{s∈β(t)} H(Y_s|W=w)−δ_k−3η)}, (39)

where δ' = 1/(1−δ) → 1 as δ → 0. The noted steps are explained as follows:
(a) follows since the c_i ∈ C_{β(t)} are i.i.d. uniformly distributed (we abuse notation here by assuming c_1 ≠ y_{β(t)}^n), and the total number of such codewords is M_{β(t)} − 1;
(b) follows from Lemmas 4 and 5 and the conditioning on W = w; we omit the details for brevity;
(c) follows from the fact that (1 + a)^n ≥ 1 + na;
(d) follows by applying (30).

Now, combining (38) and (39), we have

Pr{error | W = w} ≤ λ_n + M_{β(t)} δ' 2^{−n(Σ_{s∈β(t)} H(Y_s|W=w)−δ_k−3η)} = λ_n + δ' 2^{−n(µ−δ_k−3η)}

by choosing the codebook cardinalities M_s = 2^{nτ_s} = 2^{n(H(Y_s|W=w)−µ)} with µ > δ_k + 3η for all s ∈ S. Since the support size |S_W| of W is uniformly bounded, the time-shared code has a vanishing error probability, i.e.,

Pr{error} = E[Pr{error | W = w}] ≤ |S_W| (λ_n + δ' 2^{−n(µ−δ_k−3η)}) → 0 as n → ∞.
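Operationally, the strong typicality decoding of step 3 is an exhaustive search of the joint codebook for a unique jointly typical candidate. A rough sketch (the joint_typical predicate is an assumed stand-in for the test in Definition 5, and codebooks refers to the earlier codebook sketch):

```python
from itertools import product

def typicality_decode(received, joint_codebook, joint_typical):
    """Step-3 decoder sketch: return the unique joint codeword that is
    strongly jointly typical with the received channel sequences, or
    None (a decoding error) if there are zero or several candidates."""
    candidates = [c for c in joint_codebook if joint_typical(received, c)]
    return candidates[0] if len(candidates) == 1 else None

# The joint codebook C_{beta(t)} is the product of the per-source codebooks
# for the sources demanded at sink t (here the assumed demand is both sources).
joint_codebook = list(product(codebooks["s1"], codebooks["s2"]))
```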
Thus we have proved the existence of N^{(k)}, which asserts that proj_{Y_S}(h^{(k)}) ∈ R. Letting k → ∞, we obtain ω = proj_{Y_S}(h) ∈ R. Invoking the definition of achievability, we obtain Λ(ω) ⊆ R, completing the proof of achievability.

B. Proof of Converse

Let ω ∈ R. Then for 0 < ε_k → 0, we have a sequence of (n_k, (η_e^{(k)} : e ∈ E), (τ_s^{(k)} : s ∈ S), (Δ_t^{(k)} : t ∈ T)) codes satisfying

n_k^{-1} log η_e^{(k)} ≤ R_e + ε_k, e ∈ E (40)
τ_s^{(k)} ≥ ω_s − ε_k, s ∈ S (41)
Δ_t^{(k)} ≤ ε_k, t ∈ T. (42)

In the following discussion, we fix the value of k and drop k from all notations. Let X_s be uniformly distributed over X_s, and let {X_s : s ∈ S} be independent. Then

H(X_S) = Σ_{s∈S} H(X_s) (43)
H(U_{Out(s)} | X_s) = 0, s ∈ S (44)
H(U_{Out(i)} | U_{In(i)}) = 0, i ∈ V \ (S ∪ T) (45)
H(U_e) ≤ n(R_e + 2ε_k), e ∈ E (46)
H(X_{β(t)} | U_{In(t)}) ≤ n φ_t(n, ε_k), t ∈ T (47)
H(X_s) ≥ n(ω_s − ε_k), s ∈ S, (48)

where (43) follows from the independence of the X_s, s ∈ S; (44) and (45) follow from the definition of the code; (46) follows from (40); (47) can be proved as in [3], Section 6.3, where

φ_t(n, ε_k) = 1/n + 2ε_k Σ_{e∈In(t)} (R_e + ε_k); (49)

and (48) follows from (41) and the fact that X_s is uniformly distributed, i.e., for all s ∈ S,

H(X_s) = log |X_s| = log 2^{nτ_s} = nτ_s ≥ n(ω_s − ε_k).

By letting Y_s = X_s for all s ∈ S, we see that there exists a sequence h^{(k)} such that

h^{(k)}_{Y_s} ≥ n(ω_s − ε_k) (50)

for all s ∈ S, and

h^{(k)} ∈ Γ*_N ∩ C_123 ∩ C^n_{4ε_k} ∩ C^n_{5ε_k}, (51)

where

C^n_{4ε_k} = {h ∈ H_N : h_{U_e} ≤ n(R_e + 2ε_k), e ∈ E},
C^n_{5ε_k} = {h ∈ H_N : h_{Y_{β(t)} | U_{In(t)}} ≤ n φ_t(n, ε_k), t ∈ T}.

Dividing (50) by n, we obtain

n^{-1} h^{(k)}_{Y_s} ≥ ω_s − ε_k (52)

for all s ∈ S. Since con(Γ*_N ∩ C_123) is convex and contains the zero vector, we have

n^{-1} h^{(k)} ∈ con(Γ*_N ∩ C_123) ∩ C_{4ε_k} ∩ C_{5ε_k}, (53)

where

C_{4ε_k} = {h ∈ H_N : h_{U_e} ≤ R_e + 2ε_k, e ∈ E},
C_{5ε_k} = {h ∈ H_N : h_{Y_{β(t)} | U_{In(t)}} ≤ φ_t(n, ε_k), t ∈ T}.

Now define the set

B^{(n,k)} = {h ∈ cl(con(Γ*_N ∩ C_123)) ∩ C_{4ε_k} ∩ C_{5ε_k} : h_{Y_s} ≥ ω_s − ε_k for all s ∈ S}. (54)

Without loss of generality, let ε_k → 0 monotonically. Note from (49) that φ_t(n, ε_k) is monotonically decreasing in both n and k; thus B^{(n+1,k)} ⊆ B^{(n,k)} and B^{(n,k+1)} ⊆ B^{(n,k)}. For any fixed k and sufficiently large n, from (52) and (53), we see that B^{(n,k)} is nonempty. Since each B^{(n,k)} is compact (closed by definition, bounded by the constraint in C_{4ε_k}), lim_{n→∞} B^{(n,k)} is both compact and nonempty. By the same argument, lim_{k→∞} lim_{n→∞} B^{(n,k)} is also nonempty. Thus there exists some h* such that

h* ∈ cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5, (55)
h*_{Y_s} ≥ ω_s for all s ∈ S. (56)

Let r = proj_{Y_S}(h*). Then we have

r ∈ proj_{Y_S}(cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5), (57)
r ≥ ω (componentwise). (58)

By (57) and (58), we finally obtain

ω ∈ Λ(proj_{Y_S}(cl(con(Γ*_N ∩ C_123)) ∩ C_4 ∩ C_5)). (59)

This completes the proof.

V. CONCLUSION

In this work, we extended the previous work in [1], [2] and [3] on the capacity region of general acyclic multi-source multi-sink networks. Specifically, we closed the gap between the existing inner and outer bounds by refining the constrained regions in the entropy space. This leads to an exact characterization of the capacity region for general acyclic multi-source multi-sink networks with arbitrary transmission requirements, and thus completes the work along this line of research. However, how to explicitly evaluate the obtained capacity region remains an open problem in general.

ACKNOWLEDGMENT

The authors would like to give special thanks to Prof. Ning Cai and Dr. Lihua Song for their valuable comments.

REFERENCES

[1] L. Song and R. W. Yeung, "Zero-error network coding for acyclic networks," IEEE Trans. Inform. Theory, vol. 49, no. 12, Dec. 2003.
[2] R. W. Yeung, A First Course in Information Theory. New York: Kluwer Academic/Plenum Publishers, 2002.
[3] R. W. Yeung, N. Cai, S.-Y. R. Li, and Z. Zhang, "Network coding theory," Foundations and Trends in Communications and Information Theory, vol. 2, nos. 4-5, 2005.
[4] R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, "Network information flow," IEEE Trans. Inform. Theory, vol. 46, no. 4, July 2000.
[5] X. Yan, J. Yang, and Z. Zhang, "An outer bound for multisource multisink network coding with minimum cost consideration," IEEE Trans. Inform. Theory (joint issue with IEEE/ACM Trans. Networking), vol. 52, no. 6, June 2006.
[6] N. Harvey, R. Kleinberg, and A. Lehman, "On the capacity of information networks," IEEE Trans. Inform. Theory, vol. 52, no. 6, June 2006.
[7] G. Kramer and S. A. Savari, "Edge-cut bounds on network coding rates," Journal of Network and Systems Management, vol. 14, no. 1, Mar. 2006.
