Polar Codes for Some Multi-terminal Communications Problems


Aria G. Sahebi and S. Sandeep Pradhan
Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI 48109, USA.
(This work was supported by NSF grant CCF.)

Abstract: It is shown that polar coding schemes achieve the known achievable rate regions for several multi-terminal communications problems, including lossy distributed source coding, multiple access channels and multiple description coding. The results are valid for arbitrary alphabet sizes (binary or non-binary) and arbitrary distributions (symmetric or asymmetric).

I. INTRODUCTION

Polar codes were recently proposed by Arikan [1] to achieve the symmetric capacity of binary-input channels. This result was later generalized to arbitrary discrete memoryless channels [2]-[5]. Polar coding schemes were also developed to achieve the symmetric rate-distortion function for arbitrary discrete memoryless sources [6]-[8]. Polar coding results for asymmetric cases are developed in [9]. Among the existing works on the application of polar codes to multi-terminal problems, we note [6], [10]-[12] for distributed source coding, [13], [14] for multiple access channels and [15] for broadcast channels. In [16], it is shown that nested polar codes can be used to achieve the Shannon capacity of arbitrary discrete memoryless channels and the Shannon rate-distortion function for discrete memoryless sources.

In this paper, we show that nested polar codes achieve the best known achievable rate regions for several multi-terminal communication systems. We present several examples, including the distributed source coding problem, multiple access channels, computation over MAC, broadcast channels and multiple description coding, to illustrate how these codes can be employed to obtain optimal performance in multi-terminal settings. The results of this paper are general regarding the size of the alphabets (binary or non-binary), using the approach of [4]; the special case of binary alphabets is discussed in [16] for the lossy source coding problem. In addition, the results are general regarding the distributions, i.e., we do not assume uniform distributions on the channel inputs or the source alphabets.

This paper is organized as follows. In Section II we state some preliminaries. In Section III, we consider the distributed source coding problem and show that polar codes achieve the Berger-Tung rate region. In Section IV, we consider a distributed source coding problem in which the decoder is interested in decoding the sum of auxiliary random variables and show that polar codes have optimal performance (this scheme has the Korner-Marton scheme as a special case). In Section V, we show that polar codes achieve the capacity region of multiple access channels. In Section VI, we show that polar codes have optimal performance for the problem of computation over a MAC where the decoder is interested in the sum of the inputs. In Section VII, we study the performance of polar codes for broadcast channels. In Section VIII, we show that polar codes are optimal for the multiple description problem. Finally, in Section IX, we briefly discuss other possible problems and extensions to the multiple-user (more than two) case.

II. PRELIMINARIES

1) Channel Parameters: For a channel (X, Y, W), assume the input alphabet X is equipped with the structure of an Abelian group (G, +) of size q. The symmetric capacity is defined as Ī(W) = I(X; Y), where the channel input X is uniformly distributed over the input alphabet and Y is the output of the channel. For d ∈ G, we define

  Z_d(W) = (1/q) Σ_{x ∈ G} Σ_{y ∈ Y} √( W(y|x) W(y|x+d) ),

and for a subgroup H of G we define Z_H(W) = Σ_{d ∉ H} Z_d(W).
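
These two parameters are easy to evaluate numerically for small channels. The following sketch (illustrative only; the transition matrix, the choice G = Z_q and all function names are assumptions made for this example, not part of the paper) computes Ī(W), Z_d(W) and Z_H(W) for a channel given as a q × |Y| matrix of conditional probabilities.

```python
import numpy as np

def symmetric_capacity(W):
    # W[x, y] = W(y | x); the input is uniform over G = Z_q.  Returns I(X;Y) in bits.
    q = W.shape[0]
    p_xy = W / q                              # joint pmf P(x, y) under the uniform input
    p_y = p_xy.sum(axis=0)                    # output marginal P(y)
    terms = np.zeros_like(p_xy)
    nz = p_xy > 0
    terms[nz] = p_xy[nz] * np.log2(W[nz] / np.broadcast_to(p_y, W.shape)[nz])
    return float(terms.sum())

def Z_d(W, d):
    # Z_d(W) = (1/q) * sum_{x,y} sqrt( W(y|x) * W(y|x+d) ), with addition mod q.
    q = W.shape[0]
    shifted = np.roll(W, -d, axis=0)          # row x of `shifted` is W(. | x + d mod q)
    return float(np.sqrt(W * shifted).sum() / q)

def Z_H(W, H):
    # Z_H(W) = sum of Z_d(W) over d outside the subgroup H.
    return sum(Z_d(W, d) for d in range(W.shape[0]) if d not in set(H))

# Illustrative ternary-input channel over Z_3.
W = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
print(symmetric_capacity(W), Z_d(W, 1), Z_H(W, H=[0]))
```
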
2) Binary Polar Codes: For any N = 2^n, a polar code of length N designed for the channel (Z_2, Y, W) is a linear (coset) code characterized by a generator matrix G_N and a set of indices A ⊆ {1, ..., N} of almost perfect channels. The set A is a function of the channel. The decoding algorithm for polar codes is a specific form of successive cancellation [1].

3) Polar Codes Over Abelian Groups: For any discrete memoryless channel, there always exists an Abelian group of the same size as the channel input alphabet. Polar codes for arbitrary discrete memoryless channels (over arbitrary Abelian groups) are introduced in [4]. For the various notations used in this paper, we refer the reader to [4] and [16].
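
For the binary case, the encoder structure can be made concrete as follows (a minimal sketch under standard assumptions; the index set A and the frozen values are arbitrary here, and the bit-reversal permutation is omitted for simplicity):

```python
import numpy as np

def polar_generator(n):
    # G_N as the n-fold Kronecker power of Arikan's kernel F = [[1, 0], [1, 1]] over GF(2).
    G = np.array([[1]], dtype=np.uint8)
    F = np.array([[1, 0], [1, 1]], dtype=np.uint8)
    for _ in range(n):
        G = np.kron(G, F)
    return G % 2

def polar_encode(info_bits, frozen_bits, A, n):
    # Coset encoding: information symbols go on the indices in A (the almost perfect
    # channels), frozen symbols go elsewhere, and the result is multiplied by G_N.
    N = 1 << n
    u = np.zeros(N, dtype=np.uint8)
    mask = np.zeros(N, dtype=bool)
    mask[np.asarray(A)] = True
    u[mask] = info_bits
    u[~mask] = frozen_bits
    return (u @ polar_generator(n)) % 2

# Toy usage with N = 8; in practice A would be chosen from the channel parameters.
print(polar_encode(info_bits=[1, 0, 1, 1], frozen_bits=[0, 0, 0, 0], A=[3, 5, 6, 7], n=3))
```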

2 Let, and, be the source and the reconstruction alphabets of the two terminals and assume and have the joint distribution p. Let d 1 : + and d 2 : + be the distortion measures for terminals and respectively. We denote this source by (,,,, p, d 1, d 2 ). Let and be auxiliary random variables taking values from and respectively such that, E{d 1 (, )} D 1 and E{d 2 (, )} D 2 for some distortion levels D 1, D 2 +. It is known by the Berger-ung coding scheme that the tuple ( 1, 2, D 1, D 2 ) is achievable if 1 I(; ) I(; ), 2 I( ; ) I(; ) and I(; ) + I( ; ) I(; ). In this section, we prove the following theorem: heorem III.1. For a source (,,,, p, d 1, d 2 ), assume and are finite. hen the Berger-ung rate region is achievable using nested polar codes. It suffices to show that the rates 1 = I(; ) I(; ) and 2 = I( ; ) achievable. Let G be an Abelian group of the size larger than or equal to the size of both and. ote that for the source, we can use a nested polar codes as introduced in [16] to achieve the rate I( ; ). Furthermore, we have access to the outcome v1 of 1 at the decoder with high probability. It remains to show that the rate 1 = I(; ) I(; ) is achievable when the sequence v1 with d 2 (y1, v1 ) D 2 is available at the decoder. Given the test channel p, define the artificial channels (G, G 2, ) and (G, G, ) such that for s, z G and x, (v, z s) = p (v, z s) and (x, z s) = p (x, z s). hese channels have been depicted in Figures 1 and 2. Let be a random variable uniformly distributed over p Fig. 1: est channel for the Fig. 2: est channel for the inner code (the channel coding outer code (the source coding component) component) p G which is independent from and. It is straightforward to show that in this case, is also uniformly distributed over G. imilarly to the point-to-point result [16], we can show that the symmetric capacities of the channels and are given by Ī() = q H( ) and Ī() = q H( ). We employ a nested polar code in which the inner code is a good channel code for the channel and the outer code is a good source code for. he rate of this code is equal to = Ī() Ī() = I(; ) I(; ). he rest of this section is devoted to some general definitions and lemmas which are used in the proofs. Lemma III.1. he channel is stochastically degraded with respect to the channel. Proof: In the Definition [16, Definition III.1], let the channel ( G, G 2, W ) be such that for v, z, z G and x, W (v, z x, z ) = p (v x)1 {z=z }. Let = 2 n for some positive integer n and let G be the corresponding generator matrix for polar codes. For i = 1,,, and for z1, a 1 G, v1 and x 1, let c, (zn 1, v 1, a i 1 s, (x 1, z n 1, a i 1 1 q 1 (z1, v1 a 1 G) a i+1 G i a i+1 G i 1 q 1 W s (x 1, z 1 a 1 G) Let the random vectors 1, 1, 1, 1 be distributed according to P and let 1 be a random variable uniformly distributed over G which is independent of 1, 1, 1, 1. Let 1 = 1 1 and A 1 = 1 G 1 (Here, G 1 is the inverse of the mapping G : G G ). In other words, the joint distribution of the random vectors is given by p A (a 1, s 1, u 1, v 1, x 1, z 1 ) = 1 q p (x 1, u 1, v 1 )1 {s 1 =a 1 G,un 1 =z 1 a 1 G} A. ketch of the proof he following theorems from [4] state the standard channel coding and source coding polarization phenomenons for the general case. heorem III.2. For any ɛ > 0 and 0 < β < 1 2, there exist a large = 2 n and a partition {A H H G} of [1, ] such (i) that for H G and i A H, Ī(W ) < ɛ and H ( c, ) < 2 β. Moreover, as ɛ 0 (and ), A H p H for some probabilities p H, H G adding up to one with H G p H = Ī(). 
The rest of this section is devoted to some general definitions and lemmas which are used in the proofs.

Lemma III.1. The channel W_c is stochastically degraded with respect to the channel W_s.

Proof: In [16, Definition III.1], let the intermediate channel (X × G, V × G, W') be such that for v ∈ V, z, z' ∈ G and x ∈ X, W'(v, z | x, z') = p_{V|X}(v | x) 1_{z = z'}. Since V -- X -- (U, Z) forms a Markov chain, concatenating W_s with W' yields W_c.

Let N = 2^n for some positive integer n and let G_N be the corresponding generator matrix for polar codes. For i = 1, ..., N, and for z_1^N, a_1^N ∈ G^N, v_1^N ∈ V^N and x_1^N ∈ X^N, let

  W_c^{(i)}(z_1^N, v_1^N, a_1^{i-1} | a_i) = Σ_{a_{i+1}^N ∈ G^{N-i}} (1/q^{N-1}) W_c^N(z_1^N, v_1^N | a_1^N G_N),

  W_s^{(i)}(x_1^N, z_1^N, a_1^{i-1} | a_i) = Σ_{a_{i+1}^N ∈ G^{N-i}} (1/q^{N-1}) W_s^N(x_1^N, z_1^N | a_1^N G_N).

Let the random vectors X_1^N, U_1^N, V_1^N, Y_1^N be distributed according to p^N and let Z_1^N be uniformly distributed over G^N and independent of (X_1^N, U_1^N, V_1^N, Y_1^N). Let S_1^N = U_1^N + Z_1^N and A_1^N = S_1^N G_N^{-1} (here, G_N^{-1} is the inverse of the mapping G_N : G^N → G^N). In other words, the joint distribution of the random vectors is given by

  p_{ASUVXZ}(a_1^N, s_1^N, u_1^N, v_1^N, x_1^N, z_1^N) = (1/q^N) p_{XUV}^N(x_1^N, u_1^N, v_1^N) 1_{s_1^N = a_1^N G_N, u_1^N = s_1^N − z_1^N}.

A. Sketch of the proof

The following theorems from [4] state the standard channel coding and source coding polarization phenomena for the general case.

Theorem III.2. For any ε > 0 and 0 < β < 1/2, there exist a large N = 2^n and a partition {A_H : H ≤ G} of [1, N] such that for H ≤ G and i ∈ A_H, |Ī(W_c^{(i)}) − log(|G|/|H|)| < ε and Z_H(W_c^{(i)}) < 2^{−N^β}. Moreover, as ε → 0 (and N → ∞), |A_H|/N → p_H for some probabilities p_H, H ≤ G, adding up to one, with Σ_{H ≤ G} p_H log(|G|/|H|) = Ī(W_c).

Theorem III.3. For any ε > 0 and 0 < β < 1/2, there exist a large N = 2^n and a partition {B_H : H ≤ G} of [1, N] such that for H ≤ G and i ∈ B_H, |Ī(W_s^{(i)}) − log(|G|/|H|)| < ε and Z_H(W_s^{(i)}) < 2^{−N^β}. Moreover, as ε → 0 (and N → ∞), |B_H|/N → q_H for some probabilities q_H, H ≤ G, adding up to one, with Σ_{H ≤ G} q_H log(|G|/|H|) = Ī(W_s).

For H ≤ G, define

  A_H = { i ∈ [1, N] : Z_H(W_c^{(i)}) < 2^{−N^β} and Z_K(W_c^{(i)}) ≥ 2^{−N^β} for all K ⊊ H },

  B_H = { i ∈ [1, N] : Z_H(W_s^{(i)}) < 1 − 2^{−N^β} and Z_K(W_s^{(i)}) ≥ 1 − 2^{−N^β} for all K ⊊ H }.

For H ≤ G and K ≤ G, define A_{H,K} = A_H ∩ B_K. Note that for large N, 2^{−N^β} < 1 − 2^{−N^β}. Together with Lemma III.1, this implies that for i ∈ A_H we have Z_H(W_s^{(i)}) < 1 − 2^{−N^β}, and hence i ∈ ∪_{K ≤ H} B_K. Therefore, for K ⊄ H we have A_{H,K} = ∅, which means that {A_{H,K} : K ≤ H ≤ G} forms a partition of [1, N]. Note also that as N increases, |A_H|/N → p_H and |B_H|/N → q_H.
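
Degradedness claims of the kind used in Lemma III.1 (and again in Sections IV, V and VII) can be sanity-checked numerically for small alphabets: W_c is stochastically degraded with respect to W_s if and only if W_c = W_s Q for some stochastic matrix Q, which is a linear feasibility problem. The sketch below is only an illustration of this check (the use of scipy's linprog, the BSC example and all names are choices made for the example, not anything taken from the paper).

```python
import numpy as np
from scipy.optimize import linprog

def is_degraded(Wc, Ws):
    # True iff Wc(y2|x) = sum_y1 Ws(y1|x) * Q(y2|y1) for some row-stochastic Q.
    # Wc has shape (nx, n2), Ws has shape (nx, n1); rows are conditional pmfs.
    nx, n2 = Wc.shape
    n1 = Ws.shape[1]
    nvar = n1 * n2                                   # Q flattened row-major: Q[y1, y2]
    A_eq = np.zeros((nx * n2 + n1, nvar))
    b_eq = np.zeros(nx * n2 + n1)
    for x in range(nx):                              # matching constraints
        for y2 in range(n2):
            row = x * n2 + y2
            for y1 in range(n1):
                A_eq[row, y1 * n2 + y2] = Ws[x, y1]
            b_eq[row] = Wc[x, y2]
    for y1 in range(n1):                             # each row of Q sums to one
        A_eq[nx * n2 + y1, y1 * n2:(y1 + 1) * n2] = 1.0
        b_eq[nx * n2 + y1] = 1.0
    res = linprog(c=np.zeros(nvar), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, 1)] * nvar, method="highs")
    return res.success

# Toy example: a BSC(0.3) is degraded with respect to a BSC(0.1).
bsc = lambda p: np.array([[1 - p, p], [p, 1 - p]])
print(is_degraded(Wc=bsc(0.3), Ws=bsc(0.1)))         # expected: True
```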

The encoding and decoding rules are as follows. Let z_1^N ∈ G^N be an outcome of the random vector Z_1^N known to both the encoder and the decoder. Given K ≤ H ≤ G, let H̃ be a transversal of H in G and let K̃_H be a transversal of K in H. Any element g of G can be represented as g = [g]_K + [g]_{K̃_H} + [g]_{H̃} for unique [g]_K ∈ K, [g]_{K̃_H} ∈ K̃_H and [g]_{H̃} ∈ H̃. Also note that K̃_H + H̃ is a transversal K̃ of K in G, so that g can be uniquely represented as g = [g]_K + [g]_{K̃} for some [g]_K ∈ K and [g]_{K̃} ∈ K̃, where [g]_{K̃} = [g]_{K̃_H} + [g]_{H̃}.

Given a source sequence x_1^N, the encoding rule is as follows. For i ∈ [1, N], if i ∈ A_{H,K} for some K ≤ H ≤ G, then [a_i]_K is uniformly distributed over K and is known to both the encoder and the decoder (and is independent of all other random variables). The component [a_i]_{K̃} is chosen randomly so that for g ∈ [a_i]_K + K̃,

  P(a_i = g) = p_{A_i | X_1^N Z_1^N A_1^{i-1}}(g | x_1^N, z_1^N, a_1^{i-1}) / p_{A_i | X_1^N Z_1^N A_1^{i-1}}([a_i]_K + K̃ | x_1^N, z_1^N, a_1^{i-1}).

Note that a_i can be decomposed as a_i = [a_i]_K + [a_i]_{K̃_H} + [a_i]_{H̃}, in which [a_i]_K is known to the decoder. The encoder sends the components [a_1^N]_{K̃_H} to the decoder, and the decoder uses the channel code to recover the components [a_1^N]_{H̃}.

The decoding rule is as follows. Given z_1^N, v_1^N, [a_1^N]_K and [a_1^N]_{K̃_H}, and for i ∈ A_{H,K}, let

  â_i = argmax_{g ∈ [a_i]_K + [a_i]_{K̃_H} + H̃} W_c^{(i)}(z_1^N, v_1^N, â_1^{i-1} | g).

Finally, the decoder outputs â_1^N G_N − z_1^N. Note that the rate of this code is equal to

  R = (1/N) Σ_{K ≤ H ≤ G} |A_{H,K}| log(|H|/|K|),

which approaches Ī(W_s) − Ī(W_c) = I(X; U) − I(U; V) as N grows.

IV. DISTRIBUTED SOURCE CODING: DECODING THE SUM OF VARIABLES

For a distributed source (X, Y, p_XY, d), let the auxiliary random variables U and V take values from a group G of size q. Assume that U and V satisfy the Markov chain U -- X -- Y -- V, and assume E{d(X, Y, g(U + V))} ≤ D for some function g. For W = U + V, we show that the rates

  R_1 = H(W) − H(U | X)  and  R_2 = H(W) − H(V | Y)

are achievable for decoding W at the decoder.

The source X employs a nested polar code whose inner code is a good channel code for the channel (G, G, W_{c,X}) and whose outer code is a good source code for the test channel (G, X × G, W_{s,X}), where for s, t, q', z ∈ G and x ∈ X,

  W_{c,X}(q' | s + t) = p_W(q' − s − t)  and  W_{s,X}(x, z | s) = p_{XZ|S}(x, z | s).

Similarly, the source Y employs a nested polar code whose inner code is a good channel code for the channel (G, G, W_{c,Y}) and whose outer code is a good source code for the test channel (G, Y × G, W_{s,Y}), where for s, t, q', r ∈ G and y ∈ Y,

  W_{c,Y}(q' | s + t) = p_W(q' − s − t)  and  W_{s,Y}(y, r | t) = p_{YR|T}(y, r | t).

Here, as in Section III, Z and R are uniform dithers with S = U + Z and T = V + R. These channels are depicted in Figures 3-6. [Fig. 3: Inner code (X). Fig. 4: Outer code (X). Fig. 5: Inner code (Y). Fig. 6: Outer code (Y).] We need to show that W_{c,X} is degraded with respect to W_{s,X} (and that W_{c,Y} is degraded with respect to W_{s,Y}). To show this, in the definition of degradedness [16, Definition III.1], we let the intermediate channel (X × G, G, W') be such that for q', z ∈ G and x ∈ X, W'(q' | x, z) = p(q' − z | x).
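
The reason the sum W = U + V can be recovered although neither U nor V is decoded individually is the same linearity that underlies the Korner-Marton scheme: when both encoders bin with the same group homomorphism, the decoder can add the two bin indices and obtain a bin index for the sum. The toy sketch below illustrates only this additivity property over Z_2 (the random parity-check matrix, the block length and the correlation model are arbitrary choices for the example, not the paper's construction).

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated binary sources: y differs from x in a few positions.
n = 16
x = rng.integers(0, 2, size=n)
y = (x + (rng.random(n) < 0.15).astype(int)) % 2

# Both encoders use the SAME linear map (here an arbitrary binary matrix H).
m = 8
H = rng.integers(0, 2, size=(m, n))
s_x = (H @ x) % 2                 # bin index sent by terminal X
s_y = (H @ y) % 2                 # bin index sent by terminal Y

# Adding the two bin indices gives the bin index of the sum x + y (mod 2),
# from which the (typically sparse) sum can be recovered by a syndrome decoder.
assert np.array_equal((s_x + s_y) % 2, (H @ ((x + y) % 2)) % 2)
print("bin index of the sum:", (s_x + s_y) % 2)
```
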
V. MULTIPLE ACCESS CHANNEL

Let the finite sets X and Y be the input alphabets of a two-user MAC, let Z be the output alphabet, and assume that the messages are independent. In order to show that nested polar codes achieve the capacity region of a MAC, it suffices to show that the rates R_1 = I(X; Y, Z) = H(X) − H(X | Y, Z) and R_2 = I(Y; Z) are achievable (to incorporate the time-sharing argument, see Section IX). It is known from the point-to-point result [16] that the terminal Y can communicate with the decoder at rate I(Y; Z), so that y_1^N is available at the decoder with high probability. It remains to show that the rate R_1 is achievable for the terminal X when y_1^N is available at the decoder. Let G be an Abelian group with |G| = |X|.

Let D be a random variable uniformly distributed over G, independent of (X, Y, Z), and let S = X + D. Define the artificial channels (G, G, W_s) and (G, Y × Z × G, W_c) such that for s, d ∈ G, y ∈ Y and z ∈ Z,

  W_s(d | s) = p_{D|S}(d | s)  and  W_c(y, z, d | s) = p_{YZD|S}(y, z, d | s).

These channels are depicted in Figures 7 and 8. [Fig. 7: Channel for the inner code. Fig. 8: Channel for the outer code.] Similarly to the previous cases, one can show that the symmetric capacities of the channels are equal to Ī(W_s) = log q − H(X) and Ī(W_c) = log q − H(X | Y, Z). We employ a nested polar code in which the inner code is a good source code for the test channel W_s and the outer code is a good channel code for W_c. The rate of this code is equal to

  R = Ī(W_c) − Ī(W_s) = I(X; Y, Z) = I(X; Z | Y).

Here, we only give a sketch of the proof. First note that the channel W_s is degraded with respect to W_c, so that the source code is contained in the channel code. For s_1^N, d_1^N, a_1^N ∈ G^N, y_1^N ∈ Y^N and z_1^N ∈ Z^N, let

  W_s^{(i)}(d_1^N, a_1^{i-1} | a_i) = Σ_{a_{i+1}^N ∈ G^{N-i}} (1/q^{N-1}) W_s^N(d_1^N | a_1^N G_N),

  W_c^{(i)}(y_1^N, z_1^N, d_1^N, a_1^{i-1} | a_i) = Σ_{a_{i+1}^N ∈ G^{N-i}} (1/q^{N-1}) W_c^N(y_1^N, z_1^N, d_1^N | a_1^N G_N).

Let the random vectors X_1^N, Y_1^N, Z_1^N be distributed according to p^N and let D_1^N be uniformly distributed over G^N and independent of them. Let S_1^N = X_1^N + D_1^N and A_1^N = S_1^N G_N^{-1}. The encoding and decoding rules are similar to those of the point-to-point channel coding result; i.e., at the encoder the conditional distribution p_{A_i | D_1^N A_1^{i-1}} is used for soft encoding, and at the decoder W_c^{(i)}(y_1^N, z_1^N, d_1^N, a_1^{i-1} | a_i) is used in the successive cancellation decoder to decode a_1^N. The final decoder output is equal to â_1^N G_N − d_1^N. Note that since y_1^N is known to the decoder with high probability, it can be used as part of the channel output for the terminal X.

VI. COMPUTATION OVER MAC

In this section, we consider a simple computation problem over a MAC with input alphabets X, Y and output alphabet Z. The two input terminals of the MAC, X and Y, are trying to communicate with a centralized decoder which is interested only in the sum of the inputs W = X + Y, where + is the operation of an Abelian group G with X = Y = G. We show that the rate

  R = min(H(X), H(Y)) − H(X + Y | Z)

is achievable using polar codes.

Let D_X and D_Y be independent uniform dithers over G and let U = X + D_X and V = Y + D_Y. The terminal X employs a nested polar code whose inner code is a good source code for the test channel (G, G, W_{s,X}) and whose outer code is a good channel code for the channel (G, Z × G, W_c), where for u, v, r ∈ G and z ∈ Z,

  W_{s,X}(r | u) = p_{D_X|U}(r | u)  and  W_c(z, r | u + v) = p_{Z, D_X+D_Y | U+V}(z, r | u + v).

Similarly, the terminal Y employs a nested polar code whose inner code is a good source code for the test channel (G, G, W_{s,Y}) with W_{s,Y}(t | v) = p_{D_Y|V}(t | v), and whose outer code is a good channel code for the same channel W_c. Note that the two terminals use the same channel code. These channels are depicted in Figures 9-12. [Fig. 9: Inner code (X). Fig. 10: Outer code (X). Fig. 11: Inner code (Y). Fig. 12: Outer code (Y).]

Similarly to the previous cases, one can show that the symmetric capacities of the channels are equal to Ī(W_{s,X}) = log q − H(X), Ī(W_{s,Y}) = log q − H(Y) and Ī(W_c) = log q − H(X + Y | Z). We employ nested polar codes in which the inner codes are good source codes for the test channels W_{s,X} and W_{s,Y}, and the common outer code is a good channel code for W_c. The rate of this code is equal to

  R = Ī(W_c) − max(Ī(W_{s,X}), Ī(W_{s,Y})) = min(H(X), H(Y)) − H(X + Y | Z).

It is worth noting that it can be shown that the intersection of the two source codes is contained in the common channel code.
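
The achievable rate above involves only entropies of the joint law of the inputs and the channel output. The short helper below (illustrative; the joint pmf, the modulo-2 example and the function names are assumptions made for the example) evaluates min(H(X), H(Y)) − H(X + Y | Z) for a pmf p(x, y, z) with x, y ∈ Z_q.

```python
import numpy as np

def H(p):
    # Entropy in bits of a pmf given as an array of any shape.
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def computation_rate(p_xyz, q):
    # min(H(X), H(Y)) - H(X+Y | Z) for p_xyz[x, y, z] with x, y in Z_q.
    p_x = p_xyz.sum(axis=(1, 2))
    p_y = p_xyz.sum(axis=(0, 2))
    p_z = p_xyz.sum(axis=(0, 1))
    p_wz = np.zeros((q, p_xyz.shape[2]))          # joint pmf of (W, Z), W = (X+Y) mod q
    for x in range(q):
        for y in range(q):
            p_wz[(x + y) % q, :] += p_xyz[x, y, :]
    return min(H(p_x), H(p_y)) - (H(p_wz) - H(p_z))

# Toy example over Z_2: X ~ Bern(0.5) and Y ~ Bern(0.3) independent,
# and the MAC output Z is the mod-2 sum observed through a BSC(0.1).
q, eps = 2, 0.1
p_x, p_y = np.array([0.5, 0.5]), np.array([0.7, 0.3])
p_xyz = np.zeros((q, q, q))
for x in range(q):
    for y in range(q):
        for z in range(q):
            p_xyz[x, y, z] = p_x[x] * p_y[y] * ((1 - eps) if z == (x + y) % q else eps)
print(round(computation_rate(p_xyz, q), 4))
```
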
VII. THE BROADCAST CHANNEL

In this section, we consider a two-user broadcast channel (X, Y_1, Y_2, W, w), where the input alphabet X = G for some arbitrary Abelian group G of size q and w is a cost function on the input. Let X be a random variable over X such that E{w(X)} ≤ D, and let Y_1, Y_2 be the corresponding channel outputs. Let U, V be random variables over G satisfying the Markov chain (U, V) -- X -- (Y_1, Y_2) and such that there exists a function g : G^2 → X with g(U, V) = X. We show that the rates

  R_1 = I(U; Y_1) − I(U; V) = H(U | V) − H(U | Y_1)  and  R_2 = I(V; Y_2)

are achievable, provided an additional Markov chain (stated below) holds in addition to the structure above needed for Marton's bound. Note that the terminal corresponding to V can use a point-to-point channel code to achieve the desired rate R_2. It remains to show that the rate R_1 is achievable when v_1^N is available at the encoder.

Let Z be a uniform dither over G, independent of (U, V, X, Y_1, Y_2), and let S = U + Z. Define the artificial channels (G, G × G, W_s) and (G, Y_1 × G, W_c) such that for s, z, v ∈ G and y_1 ∈ Y_1,

  W_s(v, z | s) = p_{VZ|S}(v, z | s)  and  W_c(y_1, z | s) = p_{Y_1 Z|S}(y_1, z | s).

These channels are depicted in Figures 13 and 14. [Fig. 13: Channel for the inner code. Fig. 14: Channel for the outer code.] Similarly to the previous cases, one can show that the symmetric capacities of the channels are equal to Ī(W_s) = log q − H(U | V) and Ī(W_c) = log q − H(U | Y_1). Note that to guarantee that W_s is degraded with respect to W_c, we need an additional condition on the auxiliary random variables; it suffices to assume that U -- Y_1 -- V forms a Markov chain. We employ a nested polar code in which the inner code is a good source code for the test channel W_s and the outer code is a good channel code for W_c. The rate of this code is equal to

  R = Ī(W_c) − Ī(W_s) = I(U; Y_1) − I(U; V).
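
The recurring claim that the symmetric capacity of a dithered artificial channel equals log q minus a conditional entropy (used in Sections III, V, VI and above) can be verified numerically for small examples. The sketch below is such an illustration for a channel of the form W_c(y_1, z | s) = p_{Y_1 Z|S}(y_1, z | s) with S = U + Z and a uniform dither Z; the joint pmf over Z_3 and all names are assumptions made for the example.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

q = 3
# Illustrative joint pmf of (U, Y1) over Z_3 x {0, 1, 2}; rows index u, columns y1.
p_uy = np.array([[0.20, 0.05, 0.05],
                 [0.05, 0.25, 0.05],
                 [0.02, 0.03, 0.30]])

# Dithered channel W_c(y1, z | s) = P(Y1 = y1, Z = z | S = s), with S = U + Z.
Wc = np.zeros((q, q, q))                       # indices: [s, y1, z]
for s in range(q):
    for z in range(q):
        Wc[s, :, z] = p_uy[(s - z) % q, :]

# Symmetric capacity I(S; Y1, Z) with S uniform, versus log q - H(U | Y1).
p_joint = (Wc / q).reshape(q, -1)              # P(s, (y1, z))
I_sym = entropy(p_joint.sum(axis=0)) + np.log2(q) - entropy(p_joint)
target = np.log2(q) - (entropy(p_uy) - entropy(p_uy.sum(axis=0)))
print(round(I_sym, 6), round(target, 6))       # the two numbers agree
```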

VIII. MULTIPLE DESCRIPTION CODING

Consider a multiple description problem in which a source X is to be reconstructed at three terminals U, V and W. There are two encoders and three decoders. Terminals U and V have access to the output of their corresponding encoders, and terminal W has access to the outputs of both encoders. The goal is to find all achievable tuples (R_1, R_2, D_1, D_2, D_3), where R_1 and R_2 are the rates of encoders 1 and 2 respectively, and D_1, D_2 and D_3 are the distortion levels corresponding to decoders U, V and W respectively; D_1, D_2 and D_3 are measured as the averages of the distortion measures d_1(X, U), d_2(X, V) and d_3(X, W) respectively. Let U, V and W be random variables such that E{d_1(X, U)} ≤ D_1, E{d_2(X, V)} ≤ D_2 and E{d_3(X, W)} ≤ D_3. We show that the tuple (R_1, R_2, D_1, D_2, D_3) is achievable if

  R_1 ≥ I(X; U),  R_2 ≥ I(X; V)  and  R_1 + R_2 ≥ I(X; U, V, W) + I(U; V).

It suffices to show that the rates R_1 = I(X; U, V, W) − I(X; V) + I(U; V) and R_2 = I(X; V) are achievable. The point-to-point source coding result implies that with R_2 = I(X; V) we can have v_1^N at the output of the second decoder with high probability. To achieve the rate R_1 when v_1^N is available, first note that R_1 can be split as R_1 = R_11 + R_12 with R_11 = H(U) − H(U | X, V) and R_12 = H(W | U, V) − H(W | X, U, V). We use a code of rate R_11 for sending U and another code of rate R_12 for sending W. The corresponding channels are depicted in Figures 15-18. [Fig. 15: Inner code (U). Fig. 16: Outer code (U). Fig. 17: Inner code (W). Fig. 18: Outer code (W).]

IX. OTHER PROBLEMS AND DISCUSSION

In this paper, we studied the main multi-terminal communication problems in their simplest forms (e.g., without time sharing). The approach of this paper can be extended to more general formulations and to other similar problems, and also to the multiple-user (more than two) case in a straightforward fashion. We briefly discuss examples of such extensions.

First, consider the Berger-Tung rate region for the distributed source coding problem and let Q be the time-sharing random variable. We show that the rates R_1 = I(X; U | V, Q) and R_2 = I(Y; V | Q) are achievable for this problem. To achieve these rates, note that R_1 = I(X, Q; U) − I(U; V, Q) and R_2 = I(Y, Q; V) − I(V; Q), and design an inner polar code of rate I(U; V, Q) and an outer polar code of rate I(X, Q; U) for the source X, and an inner polar code of rate I(V; Q) and an outer polar code of rate I(Y, Q; V) for the source Y, using suitably defined channels. Let us denote the channel depicted in Figure 1 symbolically by W(B | A) for generic random variables A and B. Then, these four channels are given by W(V, Q | U), W(X, Q | U), W(Q | V) and W(Y, Q | V), respectively.

Next, consider the multiple description coding problem with an additional auxiliary random variable. The corresponding rate pair (R_1, R_2) can again be achieved by decomposing the two rates into component rates R_11, R_12, R_13 and R_21, R_22, each of the form H(·) − H(· | ·), and designing a nested polar code for each component similarly to the other examples presented in the paper.

Finally, consider a 3-user MAC with inputs W, X and Y and output Z. We have seen in Section V that with rates R_X = I(X; Z | Y) and R_Y = I(Y; Z), we can have access to x_1^N and y_1^N at the decoder with high probability. The channels W(0 | W) and W(X, Y, Z | W) can then be used to design a nested polar code of rate R_W = I(W; Z | X, Y) for the terminal W.

REFERENCES

[1] E. Arikan, "Channel polarization: A method for constructing capacity-achieving codes for symmetric binary-input memoryless channels," IEEE Transactions on Information Theory, vol. 55, no. 7, 2009.
[2] E. Sasoglu, E. Telatar, and E. Arikan, "Polarization for arbitrary discrete memoryless channels," IEEE Information Theory Workshop, Lausanne, Switzerland, Dec. 2009.
[3] R. Mori and T. Tanaka, "Channel polarization on q-ary discrete memoryless channels by arbitrary kernels," in Proceedings of the IEEE International Symposium on Information Theory, Austin, TX, July 2010.
[4] A. Sahebi and S. Pradhan, "Multilevel channel polarization for arbitrary discrete memoryless channels," IEEE Transactions on Information Theory, vol. 59, no. 12, 2013.
[5] W. Park and A. Barg, "Polar codes for q-ary channels, q = 2^r," 2012, online.
[6] S. B. Korada and R. Urbanke, "Polar codes are optimal for lossy source coding," in IEEE Information Theory Workshop (ITW), 2009.
[7] M. Karzand and E. Telatar, "Polar codes for q-ary source coding," in Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2010.
[8] A. Sahebi and S. Pradhan, "Polar codes for sources with finite reconstruction alphabets," in Annual Allerton Conference on Communication, Control, and Computing, 2012.
[9] J. Honda and H. Yamamoto, "Polar coding without alphabet extension for asymmetric models," IEEE Transactions on Information Theory, vol. 59, no. 12, Dec. 2013.
[10] E. Arikan, "Source polarization," in Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2010.
[11] E. Arikan, "Polar coding for the Slepian-Wolf problem based on monotone chain rules," in Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2012.
[12] E. Abbe, "Randomness and dependencies extraction via polarization," in Information Theory and Applications Workshop (ITA), Feb. 2011.
[13] E. Sasoglu, E. Telatar, and E. Yeh, "Polar codes for the two-user binary-input multiple-access channel," in IEEE Information Theory Workshop (ITW), 2010.
[14] E. Abbe and E. Telatar, "Polar codes for the m-user MAC," 2010, online.
[15] N. Goela, E. Abbe, and M. Gastpar, "Polar codes for broadcast channels," 2013, online.
[16] A. Sahebi and S. Pradhan, "Nested polar codes achieve the Shannon rate-distortion function and the Shannon capacity," 2014, online.
