Network Coding on Directed Acyclic Graphs

John MacLaren Walsh, Ph.D.
Multiterminal Information Theory, Spring Quarter

1 Reference

These notes are directly derived from a chapter of R. W. Yeung's Information Theory and Network Coding, Springer, 2008.

2 Motivating Example: Butterfly Network

[Figure 1: The butterfly example, with two sources s_1, s_2, a bottleneck link b, and two sinks t_1, t_2.]

We presented the butterfly example as a case where we could increase the capacity region of a network by incorporating coding between flows at intermediate network nodes. Summing the two source messages on link b allows both X_{s_1} and X_{s_2} to be multicast to both sinks t_1 and t_2.

3 Definition of a Network Code & Coding Capacity Region

In these lectures we model a network as a directed acyclic graph G = (V, E). There is a finite set of vertices or nodes V, and a collection of edges e ∈ E, which are ordered pairs of vertices e = (v_1, v_2) with v_1, v_2 ∈ V. We call vertex v_1 the tail of the edge e = (v_1, v_2) and vertex v_2 its head. A sequence of edges e_1, e_2, ..., e_k such that the head of edge e_n is the tail of the next edge e_{n+1} is called a directed path in the graph G. A directed path with the property that the tail of e_1 is the head of e_k is called a cycle, and the graph is called acyclic if it has no cycles. There is also a set of source nodes S ⊆ V and a set of sink nodes T ⊆ V with S ∩ T = ∅.

Each source node s is endowed with a source variable X_s uniformly distributed over the set 𝒳_s = {1, 2, ..., ⌈2^{Nτ_s}⌉}. The variables X_s, s ∈ S, are mutually independent, and represent messages to be sent over the network. Additionally, each edge e of the graph is associated with a capacity limitation R_e, indicating the number of bits per source time instant which can be sent over that edge. Each sink t ∈ T has a subset of source variables, those with indices in β(t) ⊆ S, that it wishes to determine.

In order to make this possible, each node in the network will encode all of the messages it hears on its incoming edges (i.e. all those edges that have it as a head) into the messages to be sent on its outgoing edges (i.e. all those edges that have it as a tail). The interior nodes do this using the functions

    k_e : ∏_{d ∈ In(i)} {0, 1, ..., η_d} → {0, 1, ..., η_e},   ∀ i ∈ V \ (S ∪ T), ∀ e ∈ Out(i)    (1)

where In(i) = {e ∈ E | e = (j, i) for some j ∈ V} is the set of edges having node i as their head and Out(i) = {e ∈ E | e = (i, j) for some j ∈ V} is the set of edges having node i as their tail.
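To make these graph-theoretic definitions concrete, the following sketch (an illustration, not part of the original notes) writes down one common labeling of the butterfly network as vertex and edge lists, computes In(i) and Out(i) exactly as defined above, and certifies acyclicity by producing an ordering in which every edge's tail appears before its head, the kind of ordering used for encoding below. The relay node names r1 and r2 are an assumption about the labeling of Figure 1.

```python
from collections import deque

# Butterfly network (assumed labeling): sources s1, s2; relay nodes r1, r2
# joined by the bottleneck edge b = (r1, r2); sinks t1, t2.
V = ["s1", "s2", "r1", "r2", "t1", "t2"]
E = [("s1", "r1"), ("s2", "r1"),     # edges feeding the bottleneck
     ("s1", "t1"), ("s2", "t2"),     # direct edges to the sinks
     ("r1", "r2"),                   # the bottleneck edge b
     ("r2", "t1"), ("r2", "t2")]     # copies of the coded message

def In(i):    # edges having node i as their head
    return [e for e in E if e[1] == i]

def Out(i):   # edges having node i as their tail
    return [e for e in E if e[0] == i]

def topological_order(V, E):
    """Kahn's algorithm: return an order in which every edge's tail precedes
    its head, or raise an error if the graph contains a cycle."""
    indeg = {v: 0 for v in V}
    for (_, head) in E:
        indeg[head] += 1
    queue = deque(v for v in V if indeg[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for (_, head) in Out(u):
            indeg[head] -= 1
            if indeg[head] == 0:
                queue.append(head)
    if len(order) != len(V):
        raise ValueError("graph has a cycle")
    return order

print(In("t1"))                  # [('s1', 't1'), ('r2', 't1')]
print(topological_order(V, E))   # ['s1', 's2', 'r1', 'r2', 't1', 't2']
```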

The source nodes encode their sources into messages through the functions

    k_e : 𝒳_s → {0, 1, ..., η_e},   ∀ s ∈ S, ∀ e ∈ Out(s)    (2)

The sink nodes reproduce the source messages from the encoded messages available locally to them through the functions

    g_t : ∏_{d ∈ In(t)} {0, 1, ..., η_d} → ∏_{s ∈ β(t)} 𝒳_s,   ∀ t ∈ T    (3)

The aggregate of these functions (1), (2), (3) is collectively known as a (N, (η_e : e ∈ E), (τ_s : s ∈ S)) network code.

For this to work, all of the messages must have arrived on a node's incoming edges before the messages on its outgoing edges are calculated. This is enabled through the assumption of an acyclic network: for any finite directed acyclic graph, it is possible to order the nodes in a sequence such that if e = (i, j) ∈ E, then node i appears before node j in the sequence. By selecting such an order in which to perform the encoding among the nodes, every node will have all of the messages from its incoming edges before it calculates the messages on its outgoing edges.

Let g_t(X_S) represent the composition of all of the functions from the sources to the sink t. We say that a collection of source rates ω_s, s ∈ S, is achievable if for arbitrarily small ε > 0 there exists a network code such that

    (1/N) log η_e ≤ R_e + ε,   ∀ e ∈ E    (4)
    τ_s ≥ ω_s − ε,   ∀ s ∈ S    (5)
    P[ g_t(X_S) ≠ X_{β(t)} ] ≤ ε,   ∀ t ∈ T    (6)

The set of all achievable rate vectors ω, denoted by R, is the network coding capacity region.

4 Network Coding Capacity Region

If a collection of rates ω were achievable with zero probability of error and a block length of N = 1 for a particular network code, then, after identifying the random variables Y_s, s ∈ S, with the sources X_s, s ∈ S, and the random variables U_e with the coded messages on the edges e, for this code we would have the inequalities

    H(Y_s) ≥ ω_s,   ∀ s ∈ S    (7)
    H(Y_S) = Σ_{s ∈ S} H(Y_s)    (8)
    H(U_{Out(s)} | Y_s) = 0,   ∀ s ∈ S    (9)
    H(U_{Out(i)} | U_{In(i)}) = 0,   ∀ i ∈ V \ (S ∪ T)    (10)
    H(U_e) ≤ R_e,   ∀ e ∈ E    (11)
    H(Y_{β(t)} | U_{In(t)}) = 0,   ∀ t ∈ T    (12)

Indeed, (7) reflects the fact that the sources must be uniform over a set of cardinality ⌈2^{τ_s}⌉ (here N = 1) with τ_s ≥ ω_s; (8) reflects the requirement that the sources are independent of one another; (9) reflects that the messages encoded by a source node are a function of the source available to it, i.e. (2); (10) reflects that the messages on the outgoing edges of a node are a function of the messages on its incoming edges, i.e. (1); (11) reflects the limitations (4) on edge capacity; and (12) indicates the zero probability of error reconstruction.

Of course, our notion of an achievable rate in the network coding capacity region R was the usual Shannon lossless notion, which allows a non-zero but arbitrarily small probability of error, an arbitrarily large block length N, and closure in rate space, as represented by the ε slack in (4)-(6). A small numerical illustration of the zero-error constraints (7)-(12) on the butterfly network is sketched below.
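As a concrete, purely illustrative check of these zero-error constraints, the following sketch instantiates the butterfly code of Section 2 with N = 1, τ_{s_1} = τ_{s_2} = 1 and R_e = 1 on every edge; the edge names (s1r1, b, r2t1, and so on) and the particular decoders are assumed labels for Figure 1, not notation from the notes. By enumerating the four equally likely source pairs it verifies the zero-error reconstruction behind (12), the edge entropy bounds (11), and the source independence (8).

```python
import itertools, math

# Edge labels (assumed): "s1r1" is the edge (s1, r1), "b" the bottleneck, etc.
edges = ["s1r1", "s2r1", "s1t1", "s2t2", "b", "r2t1", "r2t2"]

def edge_messages(x1, x2):
    """Every edge carries one bit; the bottleneck b carries the XOR."""
    xor = x1 ^ x2
    return {"s1r1": x1, "s2r1": x2, "s1t1": x1, "s2t2": x2,
            "b": xor, "r2t1": xor, "r2t2": xor}

def decode_t1(u):   # t1 hears edges s1t1 and r2t1 and wants (X_s1, X_s2)
    return u["s1t1"], u["s1t1"] ^ u["r2t1"]

def decode_t2(u):   # t2 hears edges s2t2 and r2t2 and wants (X_s1, X_s2)
    return u["s2t2"] ^ u["r2t2"], u["s2t2"]

def H(cells):
    """Entropy in bits of a list of equally likely outcome labels."""
    n = len(cells)
    return -sum((cells.count(c) / n) * math.log2(cells.count(c) / n)
                for c in set(cells))

outcomes = list(itertools.product([0, 1], repeat=2))   # uniform (X_s1, X_s2)
tables = [(x1, x2, edge_messages(x1, x2)) for (x1, x2) in outcomes]

# (12): both sinks reconstruct both sources with zero error.
assert all(decode_t1(u) == (x1, x2) and decode_t2(u) == (x1, x2)
           for (x1, x2, u) in tables)
# (11): every edge message has entropy at most R_e = 1 bit.
assert all(H([u[e] for (_, _, u) in tables]) <= 1.0 + 1e-12 for e in edges)
# (8): H(Y_S) equals the sum of the individual source entropies (2 bits).
print(H([(x1, x2) for (x1, x2, _) in tables]))   # 2.0
```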

Surprisingly, this lossless network coding capacity region can be written directly in terms of the constraints (7)-(12), with an expression we shall define presently. The first bit of notation is to stack subset entropies into a vector. That is, given a collection of M = |S| + |E| random variables, there are 2^M − 1 non-empty subsets of the random variables, and to each such subset we have an associated joint entropy. We stack the entropies of these subsets into a vector h of dimension 2^M − 1, and index this vector by the subset, so that, for instance, h_A represents the joint entropy of the random variables in A. (The ordering for the indexing can be done, for instance, by using the integer associated with the length-M binary string whose kth bit indicates whether or not k is in A.) We then consider each of the constraints (8), (9), (10), (11), (12) as linear constraints on this vector, defining the linear constraint sets

    L_1 = { h ∈ R^{2^M − 1} | h_{Y_S} = Σ_{s ∈ S} h_{Y_s} }    (13)
    L_2 = { h ∈ R^{2^M − 1} | h_{U_{Out(s)}, Y_s} − h_{Y_s} = 0 ∀ s ∈ S }    (14)
    L_3 = { h ∈ R^{2^M − 1} | h_{U_{Out(i)}, U_{In(i)}} − h_{U_{In(i)}} = 0 ∀ i ∈ V \ (S ∪ T) }    (15)
    L_4 = { h ∈ R^{2^M − 1} | h_{U_e} ≤ R_e ∀ e ∈ E }    (16)
    L_5 = { h ∈ R^{2^M − 1} | h_{Y_{β(t)}, U_{In(t)}} − h_{U_{In(t)}} = 0 ∀ t ∈ T }    (17)

Additionally, introduce the following notation, writing L_{123} := L_1 ∩ L_2 ∩ L_3 for brevity:

1. proj_{Y_S}(B) := { (h_{Y_s} : s ∈ S) | h ∈ B }
2. Λ(B) := { h ∈ R_+^{|S|} | h ≤ h' for some h' ∈ B }
3. conv(B), the convex hull of the set B
4. cl(B), the closure of the set B
5. Γ*_M := { h ∈ R_+^{2^M − 1} | ∃ finite discrete random variables (Z_1, Z_2, ..., Z_M) with h_A = H(Z_A) ∀ A ⊆ {1, ..., M} }

Yeung and his co-workers have shown that the network coding capacity region R is equal to

    R = Λ( proj_{Y_S}( cl(conv(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) )    (18)

4.1 Converse Sketch

For the converse, we must show that every achievable rate vector lies in the set on the right hand side of (18). Consider an achievable rate vector ω ∈ R and a monotone decreasing sequence ε_k → 0 as k → ∞. Then for every k, for every N sufficiently large, there exists a network code such that

    (1/N) log η_e ≤ R_e + ε_k,   ∀ e ∈ E    (19)
    τ_s ≥ ω_s − ε_k,   ∀ s ∈ S    (20)
    P[ g_t(X_S) ≠ X_{β(t)} ] ≤ ε_k,   ∀ t ∈ T    (21)

Let U_e be the message sent by this network code on edge e for each e ∈ E, and identify Y_s = X_s as the source variables. Because the source variables are independent and because the encodings on outgoing edges are functions of the messages on the incoming edges to a node, the constraints (8), (9), (10) hold among these finite discrete random variables Y_s, s ∈ S, U_e, e ∈ E. Now, Fano's inequality states that

    H(Y_{β(t)} | U_{In(t)}) ≤ 1 + P[ g_t(X_S) ≠ X_{β(t)} ] log|𝒴_{β(t)}| ≤ 1 + ε_k H(Y_{β(t)})    (22)

where the last step uses the fact that Y_{β(t)} is uniform over its alphabet, so that log|𝒴_{β(t)}| = H(Y_{β(t)}). We can upper bound the entropy H(Y_{β(t)}) using

    H(Y_{β(t)}) = I(Y_{β(t)} ; U_{In(t)}) + H(Y_{β(t)} | U_{In(t)})
                ≤ H(U_{In(t)}) + H(Y_{β(t)} | U_{In(t)})    (23)
                ≤ H(U_{In(t)}) + 1 + ε_k H(Y_{β(t)})
                ≤ Σ_{e ∈ In(t)} (log η_e + 1) + 1 + ε_k H(Y_{β(t)})
                ≤ Σ_{e ∈ In(t)} (N(R_e + ε_k) + 1) + 1 + ε_k H(Y_{β(t)})    (24)

Solving this for an inequality on H(Y_{β(t)}), we get

    H(Y_{β(t)}) ≤ ( N / (1 − ε_k) ) ( Σ_{e ∈ In(t)} (R_e + ε_k) + (|In(t)| + 1)/N )    (25)

which, when substituted back into Fano's inequality (22), gives

    H(Y_{β(t)} | U_{In(t)}) ≤ 1 + ( ε_k N / (1 − ε_k) ) ( Σ_{e ∈ In(t)} (R_e + ε_k) + (|In(t)| + 1)/N ) =: N φ_t(N, ε_k)    (26)

Here it is clear that the function φ_t(N, ε_k) is bounded, is monotone decreasing in both k and N, and approaches 0 as k, N → ∞. Moving next to the edge capacity constraints and the entropies of the sources, we observe that

    H(U_e) ≤ log(η_e + 1) ≤ N(R_e + ε_k) + 1,   H(Y_s) ≥ N(ω_s − ε_k)    (27)

If we define the half spaces, reminiscent of L_4 and L_5 in the more general case,

    L^N_{4,ε_k} = { h ∈ R^{2^M − 1} | h_{U_e} ≤ N(R_e + ε_k) + 1 ∀ e ∈ E }    (28)
    L^N_{5,ε_k} = { h ∈ R^{2^M − 1} | h_{Y_{β(t)}, U_{In(t)}} − h_{U_{In(t)}} ≤ N φ_t(N, ε_k) ∀ t ∈ T }    (29)

we observe that the vector h^{(k)} of subset entropies of this network code lies in the set

    h^{(k)} ∈ Γ*_M ∩ L_{123} ∩ L^N_{4,ε_k} ∩ L^N_{5,ε_k},   with h^{(k)}_{Y_s} ≥ N(ω_s − ε_k) ∀ s ∈ S    (30)

Now, since 0 ∈ Γ*_M ∩ L_{123}, and (1/N) h^{(k)} can be viewed as the convex combination (1/N) h^{(k)} + (1 − 1/N) · 0, we observe that

    (1/N) h^{(k)} ∈ conv(Γ*_M ∩ L_{123}) ∩ L_{4,ε_k} ∩ L_{5,ε_k},   with (1/N) h^{(k)}_{Y_s} ≥ ω_s − ε_k    (31)

where

    L_{4,ε_k} = { h ∈ R^{2^M − 1} | h_{U_e} ≤ R_e + ε_k + 1/N ∀ e ∈ E }    (32)
    L_{5,ε_k} = { h ∈ R^{2^M − 1} | h_{Y_{β(t)}, U_{In(t)}} − h_{U_{In(t)}} ≤ φ_t(N, ε_k) ∀ t ∈ T }    (33)

(Both the 1/N term in (32) and the N-dependence of φ_t in (33) are harmless, as both shrink as N grows.) Defining the constraint set in (31) to be B^{(N,k)}, we observe that the B^{(N,k)} are monotone decreasing, in that

    B^{(N+1,k)} ⊆ B^{(N,k)},   B^{(N,k+1)} ⊆ B^{(N,k)}    (34)

hence

    lim_{N,k → ∞} B^{(N,k)} = ∩_{N=1}^{∞} ∩_{k=1}^{∞} B^{(N,k)}    (35)

and the latter set, since it involves the intersection of all of the inequalities defining the L_{4,ε_k} and L_{5,ε_k}, yields

    lim_{N,k → ∞} (1/N) h^{(k)} ∈ cl(conv(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5,   lim_{N,k → ∞} (1/N) h^{(k)}_{Y_s} ≥ ω_s ∀ s ∈ S    (36)

Rearranging this fact, we have that if ω is achievable, then ω ∈ Λ( proj_{Y_S}( cl(conv(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) ), which is what we needed to prove.

4.2 Obtaining Inner and Outer Bounds

We discussed that the capacity region presented in (18) is implicit, in that we do not generally know all of the inequalities necessary to describe Γ*_M (in fact, even its closure cl(Γ*_M) is not polyhedral for M ≥ 4). It is possible to obtain inner and outer bounds on the capacity region by substituting inner and outer bounds for Γ*_M into (18). We discussed a polyhedral outer bound for Γ*_M known as the Shannon outer bound Γ_M. One way to write the Shannon outer bound is as the set of vectors obeying the properties that entropy is a non-decreasing and sub-modular set function:

    Γ_M := { h ∈ R^{2^M − 1} |  h_A ≤ h_{A'}   ∀ A ⊆ A' ⊆ {1, ..., M},    (37)
                                h_A + h_B ≥ h_{A ∪ B} + h_{A ∩ B}   ∀ A, B ⊆ {1, ..., M} }    (38)

Inner bound: TBD.
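As a computational aside (an illustration, not from the notes), the inequalities (37)-(38) defining the Shannon outer bound Γ_M can be enumerated and checked mechanically for small M. In the sketch below the random variables are indexed 0, ..., M−1 and a candidate vector h is stored as a dictionary keyed by subsets; monotonicity and sub-modularity are tested by brute force over all pairs of subsets. The example vector is the entropy vector of two independent fair bits together with their XOR, so no violations are reported.

```python
from itertools import combinations

def subsets(M):
    """All non-empty subsets of {0, ..., M-1}, as frozensets."""
    return [frozenset(c) for r in range(1, M + 1)
            for c in combinations(range(M), r)]

def shannon_violations(h, M, tol=1e-9):
    """List the (37)-(38) inequalities violated by a candidate vector h,
    given as {frozenset: value}; the empty set is treated as entropy 0."""
    def val(A):
        return 0.0 if not A else h[frozenset(A)]
    subs, bad = subsets(M), []
    for A in subs:
        for B in subs:
            if A < B and val(A) > val(B) + tol:                  # monotonicity (37)
                bad.append(("monotone", A, B))
            if val(A) + val(B) < val(A | B) + val(A & B) - tol:  # submodularity (38)
                bad.append(("submodular", A, B))
    return bad

# Entropy vector of Z_0, Z_1 independent fair bits and Z_2 = Z_0 xor Z_1:
# every singleton carries 1 bit, every larger subset 2 bits.
h = {A: (1.0 if len(A) == 1 else 2.0) for A in subsets(3)}
print(shannon_violations(h, 3))   # [] -- h lies in Gamma_3
```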

4.3 Achievability Sketch

We begin by proving an alternate form of the capacity region (18). Let D(·) be a set operator which scales all the points in a set by numbers between zero and one:

    D(A) := { α h | h ∈ A, α ∈ [0, 1] }    (39)

We will prove that the convex hull can be replaced by scalings in the capacity region expression, so that

    Λ( proj_{Y_S}( cl(conv(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) ) = Λ( proj_{Y_S}( cl(D(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) )    (40)

To do this, we will show that

    cl(D(Γ*_M ∩ L_{123})) = cl(conv(Γ*_M ∩ L_{123}))    (41)

Consider a point h ∈ cl(D(Γ*_M ∩ L_{123})). It is the limit of some sequence h_k ∈ D(Γ*_M ∩ L_{123}), where h_k = α_k ĥ_k for some ĥ_k ∈ Γ*_M ∩ L_{123} and α_k ∈ [0, 1]. Noting that 0 ∈ Γ*_M ∩ L_{123}, we can view α_k ĥ_k as the convex combination α_k ĥ_k + (1 − α_k) · 0 ∈ conv(Γ*_M ∩ L_{123}). This shows that cl(D(Γ*_M ∩ L_{123})) ⊆ cl(conv(Γ*_M ∩ L_{123})).

To prove the other containment, we show that cl(D(Γ*_M ∩ L_{123})) is a convex set containing Γ*_M ∩ L_{123}. (Since the convex hull conv(Γ*_M ∩ L_{123}) is defined as the smallest convex set containing Γ*_M ∩ L_{123}, the convexity of cl(D(Γ*_M ∩ L_{123})) will guarantee that it contains conv(Γ*_M ∩ L_{123}), and hence cl(conv(Γ*_M ∩ L_{123})) ⊆ cl(D(Γ*_M ∩ L_{123})).)

Consider two points h_1, h_2 ∈ cl(D(Γ*_M ∩ L_{123})), and select any λ ∈ [0, 1]. These points are limits of sequences h_1^{(k)} = α_1^{(k)} ĥ_1^{(k)} and h_2^{(k)} = α_2^{(k)} ĥ_2^{(k)} with α_1^{(k)}, α_2^{(k)} ∈ (0, 1] and ĥ_1^{(k)}, ĥ_2^{(k)} ∈ Γ*_M ∩ L_{123}. Select sequences of positive integers n_1^{(k)}, n_2^{(k)} ∈ N with n_1^{(k)}, n_2^{(k)} → ∞ as k → ∞ and with

    ( n_1^{(k)} α_2^{(k)} ) / ( n_2^{(k)} α_1^{(k)} ) → λ / (1 − λ)    (42)

Letting the collections of random variables (Z_1^1, ..., Z_M^1) and (Z_1^2, ..., Z_M^2) be random variables attaining ĥ_1^{(k)} and ĥ_2^{(k)} ∈ Γ*_M ∩ L_{123} respectively, we observe that the collection of random variables Z_1, ..., Z_M defined via

    Z_m = ( Z_{m,1}^1, ..., Z_{m,n_1^{(k)}}^1, Z_{m,1}^2, ..., Z_{m,n_2^{(k)}}^2 ),   m ∈ {1, ..., M}    (43)

that is, n_1^{(k)} i.i.d. copies of Z_m^1 followed by n_2^{(k)} i.i.d. copies of Z_m^2, with the two blocks independent of each other, has the entropy vector n_1^{(k)} ĥ_1^{(k)} + n_2^{(k)} ĥ_2^{(k)}, so that n_1^{(k)} ĥ_1^{(k)} + n_2^{(k)} ĥ_2^{(k)} ∈ Γ*_M ∩ L_{123}. Additionally, for all k sufficiently large we have

    ( α_1^{(k)} α_2^{(k)} ) / ( n_1^{(k)} α_2^{(k)} + n_2^{(k)} α_1^{(k)} ) ≤ 1    (44)

which then implies that, for all k sufficiently large,

    ( α_1^{(k)} α_2^{(k)} ) / ( n_1^{(k)} α_2^{(k)} + n_2^{(k)} α_1^{(k)} ) · ( n_1^{(k)} ĥ_1^{(k)} + n_2^{(k)} ĥ_2^{(k)} ) ∈ D(Γ*_M ∩ L_{123})    (45)

However, rearranging the terms inside the limit, we have

    lim_{k → ∞} ( α_1^{(k)} α_2^{(k)} ) / ( n_1^{(k)} α_2^{(k)} + n_2^{(k)} α_1^{(k)} ) · ( n_1^{(k)} ĥ_1^{(k)} + n_2^{(k)} ĥ_2^{(k)} )
      = lim_{k → ∞} [ ( n_1^{(k)} α_2^{(k)} ) / ( n_1^{(k)} α_2^{(k)} + n_2^{(k)} α_1^{(k)} ) · α_1^{(k)} ĥ_1^{(k)} + ( n_2^{(k)} α_1^{(k)} ) / ( n_1^{(k)} α_2^{(k)} + n_2^{(k)} α_1^{(k)} ) · α_2^{(k)} ĥ_2^{(k)} ]    (46)
      = λ h_1 + (1 − λ) h_2    (47)

so λ h_1 + (1 − λ) h_2 ∈ cl(D(Γ*_M ∩ L_{123})), proving that this set is convex. This establishes (41). A small numerical check of the additivity of entropy vectors under independent concatenation, which is the key step (43), is sketched below.
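The key fact behind (43), that the entropy vector of two mutually independent blocks of random variables is the sum of the blocks' entropy vectors, can be checked numerically. The sketch below (an illustration with arbitrarily chosen toy distributions, not part of the notes) computes the entropy vector of a joint pmf over M coordinates and verifies additivity for one independent concatenation; iterating the concatenation produces the combination n_1^{(k)} ĥ_1^{(k)} + n_2^{(k)} ĥ_2^{(k)} used above.

```python
import itertools, math

def entropy_vector(pmf, M):
    """Entropy in bits of every non-empty subset A of the M coordinates of a
    joint pmf given as {(z_1, ..., z_M): probability}."""
    h = {}
    for r in range(1, M + 1):
        for A in itertools.combinations(range(M), r):
            marg = {}
            for z, p in pmf.items():
                key = tuple(z[m] for m in A)
                marg[key] = marg.get(key, 0.0) + p
            h[A] = -sum(p * math.log2(p) for p in marg.values() if p > 0)
    return h

def independent_concatenation(pmf1, pmf2, M):
    """Joint pmf of Z_m = (Z_m^1, Z_m^2) when the two blocks are independent."""
    return {tuple((z1[m], z2[m]) for m in range(M)): p1 * p2
            for z1, p1 in pmf1.items() for z2, p2 in pmf2.items()}

# Two toy collections with M = 2 variables each (purely illustrative choices):
# block 1: a fair bit repeated twice; block 2: two independent fair bits.
pmf1 = {(0, 0): 0.5, (1, 1): 0.5}
pmf2 = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}

h1, h2 = entropy_vector(pmf1, 2), entropy_vector(pmf2, 2)
h12 = entropy_vector(independent_concatenation(pmf1, pmf2, 2), 2)
assert all(abs(h12[A] - (h1[A] + h2[A])) < 1e-9 for A in h12)  # vectors add
print(h1, h2, h12)   # e.g. h12[(0, 1)] = 1 + 2 = 3 bits
```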

We must now prove that any vector ω in the alternate rate region representation

    ω ∈ Λ( proj_{Y_S}( cl(D(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) )    (48)

is achievable. Since we can discard any excess rate we do not wish to use, this amounts to showing that any rate vector ω ∈ proj_{Y_S}( cl(D(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 ) is achievable. Let h ∈ cl(D(Γ*_M ∩ L_{123})) ∩ L_4 ∩ L_5 be a vector such that ω = proj_{Y_S}(h). Since h ∈ cl(D(Γ*_M ∩ L_{123})), it is the limit of some sequence h_k ∈ D(Γ*_M ∩ L_{123}) of the form h_k = α_k ĥ_k with ĥ_k ∈ Γ*_M ∩ L_{123}. Let (Y_s^k : s ∈ S), (U_e^k : e ∈ E) be the random variables associated with ĥ_k; since the entropies of these variables are in L_{123}, they obey

    H(Y_S^k) = Σ_{s ∈ S} H(Y_s^k)    (49)
    H(U_{Out(s)}^k | Y_s^k) = 0,   ∀ s ∈ S    (50)
    H(U_{Out(i)}^k | U_{In(i)}^k) = 0,   ∀ i ∈ V \ (S ∪ T)    (51)

The equalities (50) and (51) show that we may think of the random variable U_e^k = f_{s,e,k}(Y_s^k) as a deterministic function f_{s,e,k} of the source random variable Y_s^k for each e ∈ Out(s), s ∈ S, and of the random variable U_e^k = f_{e,k}(U_{In(i)}^k) as a deterministic function of the random variables U_{In(i)}^k for each e ∈ Out(i), i ∈ V \ (S ∪ T). Additionally, since the limit of the scaled entropies h_k = α_k ĥ_k is in L_4 ∩ L_5 and has proj_{Y_S}( lim_k h_k ) = ω, we have

    α_k H(Y_s^k) = ω_s^k,   ∀ s ∈ S    (52)
    α_k H(U_e^k) ≤ R_e + µ_k,   ∀ e ∈ E    (53)
    α_k H(Y_{β(t)}^k | U_{In(t)}^k) ≤ γ_k,   ∀ t ∈ T    (54)

where γ_k → 0 and µ_k → 0 as k → ∞, while ω_s^k → ω_s.

Let N̂_k = α_k N_k. For each source s, generate a 2^{N_k τ_s^k} × N̂_k dimensional matrix by sampling its elements i.i.d. according to the distribution p_{Y_s^k}, and let the jth row be denoted by Y_s^{N̂_k}(j), j ∈ {1, ..., 2^{N_k τ_s^k}}. For each edge, enumerate all of the length-N̂_k typical sequences in T_ε^{N̂_k}(U_e^k) as U_{e,k}^{N̂_k}(1), ..., U_{e,k}^{N̂_k}(η_e^k). Due to the bound on the cardinality of the typical set, for such an enumeration

    η_e^k ≤ 2^{N̂_k (H(U_e^k) + ε c_{e,k})} = 2^{α_k N_k (H(U_e^k) + ε c_{e,k})}    (55)

so that

    (1/N_k) log η_e^k ≤ α_k H(U_e^k) + ε c_{e,k} ≤ R_e + µ_k + ε c_{e,k}    (56)

The encoder at source node s selects at random one of the 2^{N_k τ_s^k} rows in its matrix, then calculates the deterministic functions f_{s,e,k}, e ∈ Out(s), operating elementwise on each of the N̂_k positions in the vector Y_s^{N̂_k}(j). Provided we select

    ε_{e,k} ≥ |𝒴_s^k| ε_{s,k},   ∀ e ∈ Out(s)    (57)

then if Y_s^{N̂_k}(j) ∈ T_{ε_{s,k}}^{N̂_k}(Y_s^k), the results of these deterministic functions will be in T_{ε_{e,k}}^{N̂_k}(U_e^k), and together the outgoing messages will all be jointly typical with the source sequence, i.e. in T_{ε_{Out(s),k}}^{N̂_k}(U_{Out(s)}^k, Y_s^k). The messages sent are the indices in {1, ..., η_e^k} of the typical sequences produced by the deterministic functions, or 0 if the input was not typical.

Via the Markov lemma, if we take N_k sufficiently large, then we observe that if Y_s^{N̂_k}(j) ∈ T_{ε_{s,k}}^{N̂_k}(Y_s^k) for each s ∈ S, then all of the messages outgoing from these sources are together jointly typical. Proceeding via the order defined by the directed acyclic graph for the operation of the encoders (so that all incoming messages are available before an outgoing message is calculated), we observe that provided all incoming messages are jointly typical, the outgoing messages (calculated with the deterministic functions f_{e,k} operating on the typical sequences associated with the incoming indices) will be jointly typical themselves, and jointly typical with everything computed so far. Thus, provided that each of the selected messages Y_s^{N̂_k}(j_s) is typical, they will be jointly typical with the sequences associated with each of the incoming messages at a sink.

The sink operates by looking in the rows of the codebooks for the sources in β(t) for a collection of codewords that are jointly typical together with the incoming messages U_{In(t)}^{N̂_k}. By the logic above, there will be at least one such collection (corresponding to the correct decoding) provided that the Y_s^{N̂_k}(j_s) are typical. If there is more than one such collection, then an error is declared. This error event is then contained in the union, over all non-empty subsets A of β(t), of the events E_A in which ( U_{In(t)}^{N̂_k}, Y_A^{N̂_k}(j'_A), Y_{A^c}^{N̂_k}(j_{A^c}) ) are jointly typical for some indices j'_s ≠ j_s for each s ∈ A.
These events, associated with the independent codewords Y_s^{N̂_k}(j'_s), s ∈ A, winding up in the jointly typical set with Y_s^{N̂_k}(j_s), s ∈ A^c, and U_{In(t)}^{N̂_k}, have probabilities bounded by

    P[E_A] ≤ 2^{ N_k Σ_{s ∈ A} τ_s^k − N̂_k ( I(Y_A^k ; Y_{A^c}^k, U_{In(t)}^k) − ε c ) }    (58)

These will go to zero exponentially as N_k → ∞ provided that we select τ_s^k slightly less than α_k H(Y_s^k) = ω_s^k, since by (49), (52), and (54) the mutual information in the exponent satisfies α_k I(Y_A^k ; Y_{A^c}^k, U_{In(t)}^k) ≥ Σ_{s ∈ A} ω_s^k − γ_k. This, together with (49), (50), (51), and (56), shows that the rates ω_s^k are achievable according to the definition (4), (5), (6) for sufficiently large k and N_k.
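As a closing numerical aside (not part of the original notes), the typical-set cardinality bound of the form used in (55) can be made tangible for a Bernoulli(p) source: count the length-n binary sequences whose empirical frequency of ones is within ε of p and compare the normalized log-count to H(U) + εc. The constant c = log_2((1−p)/p) used below is one convenient assumed choice, coming from the slope of the binary entropy function; polynomial prefactors are ignored since they wash out as n grows.

```python
import math

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def typical_count(p, n, eps):
    """Number of length-n binary sequences whose empirical frequency of ones
    lies within eps of p (strong typicality for a Bernoulli(p) source)."""
    return sum(math.comb(n, k) for k in range(n + 1) if abs(k / n - p) <= eps)

p, eps = 0.2, 0.05
c = math.log2((1 - p) / p)   # assumed constant in the H(U) + eps*c exponent
for n in (50, 200, 800):
    rate = math.log2(typical_count(p, n, eps)) / n
    # For these parameters the normalized log-cardinality stays below
    # H2(p) + eps*c and creeps up toward H2(p + eps) as n grows.
    print(n, round(rate, 4), round(H2(p) + eps * c, 4))
```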
