Problems We Can Solve With a Helper


ITW 2009, Volos, Greece, June 10-12, 2009

Problems We Can Solve With a Helper

Haim Permuter, Ben-Gurion University of the Negev; Yossef Steinberg, Technion - IIT, ysteinbe@ee.technion.ac.il; Tsachy Weissman, Technion/Stanford University, tsachy@stanford.edu

Abstract—In this work we study source coding problems where a helper provides rate-limited side information to the involved parties. We first consider the Wyner-Ziv problem, where in addition to the memoryless side information available to the decoder, a helper sends common, rate-limited side information to the encoder and decoder. A single-letter characterization of the achievable rates is derived, under certain Markov conditions on the source and side information. We then examine the problem of cascade rate distortion with a helper. Partial results are derived also for the case where the side information is not necessarily common, i.e., when the helper can send different streams of coded side information to the involved parties.

I. INTRODUCTION

In this work we study problems of lossy source coding with rate-limited side information. The first problem we study is depicted in Figure 1. It is an extension of the Wyner-Ziv problem [8], as described next. The source, helper, and side information sequences, X_i, Y_i, and Z_i, are independent and identically distributed (i.i.d.) sequences, with generic distribution (X_i, Y_i, Z_i) ~ P_{X,Y,Z}. We assume throughout that P_{X,Y,Z} = P_{Y|X} P_{Z|X} P_X. That is, the Markov relation Y - X - Z holds. The coding scheme proceeds as follows. At the first stage of the encoding scheme, the helper compresses the side information to rate R', and sends the stream T' of coded bits to the encoder and decoder. At the second stage, the encoder compresses the source X^n to rate R, based on the vector X^n and the stream of coded bits T' that arrived from the helper. Denote this stream by T.
The decoder then constructs the estimate X̂^n based on the codewords arrived from the helper and the encoder, T' and T respectively, and the memoryless side information Z^n. We are interested in the region of all achievable rate and distortion triples (R, R', D), where D is the distortion between the reproduction X̂^n and the source X^n. Several works on related problems have appeared in the literature. Wyner [6] studied a problem of network source coding with compressed side information that is provided only to the decoders. A special case of his model is the system in Figure 1, without the memoryless side information Z, and where the stream T' arrives only to the decoder. A full characterization of the achievable region can be concluded from the results of [6] for the special case where the source X has to be reconstructed losslessly. This problem was solved independently by Ahlswede and Körner in [1]. The extension of these results to the case of lossy reconstruction of X remains open. Kaspi [3] and Kaspi and Berger [4] derived an achievable region for a problem that contains the helper problem with degenerate Z as a special case. However, the converse part does not match. In [5], Vasudevan and Perron describe a general rate distortion problem with encoder breakdown. For some choice of distortions, which they term Setup B, their model reduces to our model with degenerate side information Z. In particular, Theorem 3 of [5] provides a single-letter characterization of the achievable rates and distortions for their Setup B. The second problem we study in this work is cascade source coding with rate-limited side information, as depicted in Figure 2.

(The work of Y. Steinberg was supported by THE ISRAEL SCIENCE FOUNDATION grant No. 280/07.)

Fig. 1. The rate distortion problem with a helper Y, and additional side information Z known only to the decoder. We assume that the side information Z and the helper Y are independent given the source X.
It is an extension of Yamamoto's cascade source coding model [9] to the case where a helper sends compressed side information to the encoder and the two decoders. The communication protocol is as described above for the Wyner-Ziv problem: first, the helper compresses the side information to a stream T' of bits at rate R'. The stream T' is sent to all the parties involved. Then the encoder compresses the source X^n, and sends the codeword T_1, at rate R_1, to the first decoder, Decoder 1. Upon receiving T' and T_1, this decoder produces an estimate X̂_1^n of X^n. In addition, it acts as an encoder for the next stage: based on T' and T_1, Decoder 1 produces a codeword T_2 at rate R_2, and sends it to the second decoder, who in turn produces the estimate X̂_2^n. Denote by D_1 (resp. D_2) the distortion between X̂_1^n (resp. X̂_2^n) and the source X^n. The problem we treat here in this context is the characterization of all quintuples (R', R_1, R_2, D_1, D_2).

Fig. 2. The cascade rate distortion problem with a helper Y.

This work is organized as follows. In Section II-A, we define the problem of rate distortion with helper and decoder side information, and derive the main result. The Gaussian example is given in Section II-B. Section II-C treats the case where independent rates are allowed, and gives partial results. The problem of cascade source coding with a helper is defined and treated in Section III.

II. RATE DISTORTION WITH HELPER AND SIDE INFORMATION

In this section, we consider rate distortion with a helper and additional side information Z, known only to the decoder, as shown in Fig. 1. We also assume that the source X, the helper Y, and the side information Z form the Markov chain Y - X - Z.

A. Definitions and main result

We are given a source X, distributed according to P_X on a finite set X. The reconstruction alphabet is denoted by X̂, and the distortion measure by d : X × X̂ → R_+. The distortion between vectors is defined as usual, as the normalized sum of single-letter distortions. The formal definition of the code is the following.

Definition 1: An (n, M, M', D) code for source X with helper and side information Z consists of two encoders f', f and a decoder g,

f' : Y^n → {1, 2, ..., M'}
f : X^n × {1, 2, ..., M'} → {1, 2, ..., M}
g : {1, 2, ..., M} × {1, 2, ..., M'} × Z^n → X̂^n    (1)

(1/n) E d(X^n, X̂^n) ≤ D.    (2)

A triple (R, R', D) is said to be achievable if for any δ > 0, ε > 0, and sufficiently large n, there exists an (n, 2^{n(R+δ)}, 2^{n(R'+δ)}, D + ε) code for the source X with helper and side information Z. For a given distortion level D, the operational achievable region, denoted R^O(D, Z), is the set of all rate pairs (R, R') such that the triple (R, R', D) is achievable.
Let R(D, Z) be the set of all rate pairs (R, R') that satisfy

R ≥ I(X; U | V, Z),    (3)
R' ≥ I(V; Y | Z),    (4)

for some joint distribution of the form

p(x, y, z, u, v) = p(x, y) p(z|x) p(v|y) p(u|x, v),    (5)

E d(X, X̂(U, V, Z)) ≤ D,    (6)

where U and V are auxiliary random variables, and the reconstruction variable X̂ is a deterministic function of the triple (U, V, Z). The next lemma states properties of R(D, Z).

Lemma 1:
1) The region R(D, Z) is convex.
2) To exhaust R(D, Z), it is enough to restrict the alphabets of V and U to satisfy |V| ≤ |Y| + 2 and |U| ≤ |X||Y|.    (7)

The proof is omitted. We now state the main result of this section.

Theorem 1:

R^O(D, Z) = R(D, Z),    (8)

provided the Markov chain Y - X - Z holds.

The proof of Theorem 1 is omitted, due to lack of space. However, we describe here a side result about the region R(D, Z), which is essential in the proof of the converse part. Let us define an additional region R̃(D, Z) as the set of all rate pairs (R, R') satisfying (3), (4), and (6), for some distribution of the form

p(x, y, z, u, v) = p(x, y) p(z|x) p(v|y) p(u|x, v, y),    (9)

where U, V are auxiliary random variables, and the reconstruction variable X̂ is a deterministic function of the triple (U, V, Z). Note that the only difference between R(D, Z) and R̃(D, Z) is in the joint distributions over which the region is computed: in the composition (5) the auxiliary random variable U is independent of Y when conditioned on (X, V), that is, the Markov chain U - (X, V) - Y is imposed, whereas in the definition of R̃(D, Z) this Markov structure is not imposed. In the proof of Theorem 1, we show that R(D, Z) is achievable and that R̃(D, Z) is an outer bound, and we conclude the proof by applying the following lemma, which states that the two regions are equal.

Lemma 2: R(D, Z) = R̃(D, Z).

Proof: First we notice that R(D, Z) ⊆ R̃(D, Z), since one can restrict the input distribution to the form p(x, y, z, u, v) = p(x, y) p(z|x) p(v|y) p(u|x, v). Now we prove that R̃(D, Z) ⊆ R(D, Z). Let (R, R') ∈ R̃(D, Z), and let

p(x, y, z, u, v) = p(x, y) p(z|x) p(v|y) p(u|x, v, y)    (10)

be a distribution that satisfies (3) and (4). It is shown next that there exists a distribution of the form (5) for which (3) and (4) hold.
Define

p̃(x, y, z, u, v) = p(x, y, z) p(v|y) p(u|x, v),    (11)

where p(u|x, v) is induced by p(x, y, z, u, v). We now show that the terms I(V; Y | Z), I(X; U | Z, V) and E d(X, X̂(U, V, Z)) are the same whether we evaluate them by the joint distribution p̃(x, y, z, u, v) of (11), or by p(x, y, z, u, v); hence (R, R') ∈ R(D, Z). In order to show that the terms above are the same, it is enough to show that the marginal distributions p̃(y, z, v) and p̃(x, z, u, v) induced by p̃(x, y, z, u, v) are equal to the marginal distributions p(y, z, v) and p(x, z, u, v) induced by p(x, y, z, u, v). Clearly p̃(y, v, z) = p(y, v, z). In the rest of the proof we show that p̃(x, z, u, v) = p(x, z, u, v). A distribution of the form p(x, y, z, u, v) as given in (10) implies that the Markov chain U - (X, V) - Z holds, since

p(z|x, u, v) = Σ_y p(z|x, u, v, y) p(y|x, u, v) = Σ_y p(z|x, v) p(y|x, u, v) = p(z|x, v).    (12)

Therefore p(u|x, v, z) = p(u|x, v). Now consider p(x, z, u, v) = p(x, z, v) p(u|x, v, z) = p(x, z, v) p(u|x, v), and since p̃(x, z, v) = p(x, z, v) and p̃(u|x, v) = p(u|x, v), we conclude that p̃(x, z, u, v) = p(x, z, u, v).

B. The Gaussian case

In this subsection we consider the Gaussian instance that corresponds to Theorem 1. Since (X, Y, Z) form the Markov chain Y - X - Z, we assume, without loss of generality, that X = Z + A and Y = Z + A + B, where the random variables A, B, Z are zero-mean Gaussian and independent of each other, with E[A²] = σ_A², E[B²] = σ_B², and E[Z²] = σ_Z². The Gaussian example with a helper and without side information Z was solved in [5]. Their result will be used in the sequel to establish our converse for the Gaussian model. The following theorem establishes the rate region of the Gaussian case.

Theorem 2: The achievable rate region for the Gaussian problem is

R ≥ (1/2) log [ (σ_A²/D) ( 1 − σ_A² (1 − 2^{−2R'}) / (σ_A² + σ_B²) ) ]
  = (1/2) log [ (σ_A²/D) · (σ_B² + σ_A² 2^{−2R'}) / (σ_A² + σ_B²) ].    (13)

It is interesting to note that the rate region does not depend on σ_Z². Furthermore, we show in the proof that for the Gaussian case the rate region is the same as when Z is known to the source X and the helper Y.

Proof of Theorem 2: Converse: Assume that both encoders observe Z^n.
Without loss of generality, the encoders can subtract Z from X and Y; hence the problem is equivalent to a new rate distortion problem with a helper, where the source is A and the helper is A + B. Now, using the result for the Gaussian case from [5], adapted to our notation, we obtain

R ≥ (1/2) log [ (σ_A²/D) ( 1 − σ_A² (1 − 2^{−2R'}) / (σ_A² + σ_B²) ) ].    (14)

Achievability: Before proving the direct part of Theorem 2, we establish the following lemma, which is proved in the Appendix.

Lemma 3 (Gaussian Wyner-Ziv rate-distortion problem with additional side information known to the encoder and decoder): Let (X, W, Z) be jointly Gaussian. Consider the Wyner-Ziv rate distortion problem where the source X is to be compressed with quadratic distortion measure, W is available at the encoder and decoder, and Z is available only at the decoder. The rate-distortion function for this problem is given by

R(D) = (1/2) log ( σ²_{X|W,Z} / D ),    (15)

where σ²_{X|W,Z} = E[(X − E[X|W, Z])²], i.e., the minimum mean square error of estimating X from (W, Z).

Let V = A + B + Z + Δ, where Δ ~ N(0, σ_Δ²) and Δ is independent of (A, B, Z). Clearly, we have V - Y - X - Z. Now, let us generate V at the source encoder and at the decoder using the achievability scheme of Wyner [7]. Since I(V; Z) ≤ I(V; X), a rate R' = I(V; Y) − I(V; Z) would suffice, and it may be expressed as follows:

R' = I(V; Y | Z) = h(V|Z) − h(V|Y) = (1/2) log ( (σ_A² + σ_B² + σ_Δ²) / σ_Δ² ),    (16)

and this implies that

σ_Δ² = (σ_A² + σ_B²) / (2^{2R'} − 1).    (17)

Now, we invoke Lemma 3, where V is the side information known both to the encoder and decoder; hence a rate that satisfies the following inequality achieves a distortion D:

R ≥ (1/2) log ( σ²_{X|V,Z} / D ) = (1/2) log [ (σ_A²/D) ( 1 − σ_A² / (σ_A² + σ_B² + σ_Δ²) ) ].    (18)

Finally, by replacing σ_Δ² with the identity in (17), we obtain (14).

C. The case of independent rates

In this subsection we treat the rate distortion scenario where side information from the helper is encoded using two different messages, possibly at different rates, one to the encoder and one to the decoder, as shown in Fig. 3.
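The equality of the two expressions in (13), and the fact that substituting (17) into (18) recovers the same bound, can be checked numerically. The sketch below is ours, not the paper's; the function and variable names are assumptions, and rates are in bits.

```python
import math

def helper_rate_bound(distortion, helper_rate, var_a, var_b):
    """Minimum source rate R of Theorem 2 for distortion D and helper rate R'."""
    shrink = 1.0 - var_a * (1.0 - 2.0 ** (-2.0 * helper_rate)) / (var_a + var_b)
    return 0.5 * math.log2(var_a * shrink / distortion)

def helper_rate_bound_via_lemma3(distortion, helper_rate, var_a, var_b):
    """Same bound, assembled from (17) and (18): pick the test-channel noise
    variance that spends exactly R' bits, then apply Lemma 3 with the
    resulting MMSE of estimating X from (V, Z)."""
    var_noise = (var_a + var_b) / (2.0 ** (2.0 * helper_rate) - 1.0)  # eq. (17)
    mmse = var_a * (1.0 - var_a / (var_a + var_b + var_noise))        # sigma^2_{X|V,Z}
    return 0.5 * math.log2(mmse / distortion)

# Both routes give the same answer, and sigma_Z^2 never enters,
# matching the remark after Theorem 2.
r1 = helper_rate_bound(0.2, 0.7, 1.0, 0.5)
r2 = helper_rate_bound_via_lemma3(0.2, 0.7, 1.0, 0.5)
assert abs(r1 - r2) < 1e-9
```

As R' → 0 the bound tends to (1/2) log(σ_A²/D), a useless helper, while as R' → ∞ it tends to the rate obtained when the helper observation A + B is fully available.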
The complete characterization of achievable rates for this scenario is still an open problem. However, the solution given in the previous sections, where there is one message known both to the encoder and decoder, provides insight that allows us to solve several cases of the problem shown here. We start with the definition of the general case. It follows quite closely Definition 1, except that there are three rates involved.

Fig. 3. The rate distortion problem with decoder side information, and independent helper rates. We assume the Markov relation Y - X - Z.

Definition 2: An (n, M, M_e, M_d, D) code for source X with side information Y and different helper messages to the encoder and decoder consists of three encoders and a decoder,

f_e : Y^n → {1, 2, ..., M_e}
f_d : Y^n → {1, 2, ..., M_d}
f : X^n × {1, 2, ..., M_e} → {1, 2, ..., M}    (19)
g : {1, 2, ..., M} × {1, 2, ..., M_d} × Z^n → X̂^n    (20)

(1/n) E d(X^n, X̂^n) ≤ D.    (21)

To avoid cumbersome statements, we will not repeat in the sequel the words "... different helper messages to the encoder and decoder," as this is the topic of this section, and should be clear from the context. The rate triple (R, R_e, R_d) of the (n, M, M_e, M_d, D) code is

R = (1/n) log M,  R_e = (1/n) log M_e,  R_d = (1/n) log M_d.    (22)

Definition 3: Given a distortion D, a rate triple (R, R_e, R_d) is said to be achievable if for any δ > 0 and sufficiently large n, there exists an (n, 2^{n(R+δ)}, 2^{n(R_e+δ)}, 2^{n(R_d+δ)}, D + δ) code for the source X with side information Y.

Definition 4: The operational achievable region R^O_g(D, Z) of rate distortion with a helper known at the encoder and decoder is the closure of the set of all achievable rate triples at distortion D. Denote by R^O_g(R_e, R_d, D, Z) the section of R^O_g(D, Z) at helper rates (R_e, R_d). That is,

R^O_g(R_e, R_d, D, Z) = {R : (R, R_e, R_d) is achievable with distortion D},    (23)

and similarly, denote by R(R', D, Z) the section of the region R(D, Z), defined via (3), (4), (5), and (6), at helper rate R'. Recall that, according to Theorem 1, R(R', D, Z) consists of all achievable source coding rates when the helper sends common messages to the source encoder and destination at rate R'. The main result of this section is the following.

Theorem 3: For any R_e ≥ R_d,

R^O_g(R_e, R_d, D, Z) = R(R_d, D, Z).    (24)

Theorem 3 has interesting implications for the coding strategy taken by the helper.
It says that no gain in performance can be achieved if the source encoder gets more help than the decoder at the destination, i.e., if R_e > R_d, and thus we may restrict R_e to be no higher than R_d. Moreover, in those cases where R_e = R_d, optimal performance is achieved when the helper sends to the encoder and decoder exactly the same message. The proof of this statement uses operational arguments, and is omitted here. The statement of Theorem 3 can be extended to rates R_e slightly lower than R_d. This extension is based on the simple observation that the source encoder knows X, which can serve as side information in decoding the message sent by the helper. Therefore, any message sent to the source encoder can undergo a stage of binning with respect to X. As an extreme example, consider the case where R_e ≥ H(Y|X). The source encoder can fully recover Y; hence there is no advantage in transmitting to the encoder at rates higher than H(Y|X). The decoder, on the other hand, can benefit from rates in the region H(Y|X) < R_d < H(Y|Z). This rate interval is not empty due to the Markov chain Y - X - Z. These observations are summarized in the next theorem.

Theorem 4:
1) Let (U, V) achieve a point (R, R') in R(D, Z), i.e.,

R = I(X; U | V, Z),
R' = I(Y; V | Z) = I(V; Y) − I(V; Z),    (25)
E d(X, X̂(U, V, Z)) ≤ D,    (26)
V - Y - X - Z.    (27)

Then (R, R_e, R') ∈ R^O_g(D, Z) for every R_e satisfying

R_e ≥ I(V; Y | Z) − I(V; X | Z) = I(V; Y) − I(V; X).    (28)

2) Let (R, R') be an outer point of R(D, Z). That is,

(R, R') ∉ R(D, Z).    (29)

Then (R, R_e, R') is an outer point of R^O_g(D, Z) for any R_e, i.e.,

(R, R_e, R') ∉ R^O_g(D, Z) for all R_e.    (30)

The proof of Part 1 is based on binning, as described above. In particular, observe that the rate R_e given in (28) is lower than R' of (25), due to the Markov chain V - Y - X - Z. Part 2 is a partial converse, and is a direct consequence of Theorem 3. The details, being straightforward, are omitted.
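For the Gaussian example of Section II-B, the gap between the common helper rate of (25) and the reduced, binned encoder rate of (28) has a closed form. The following sketch is our own illustration (names are assumed), with V = A + B + Z + Δ as in the achievability proof: conditioning V on Y leaves variance σ_Δ², on X leaves σ_B² + σ_Δ², and on Z leaves σ_A² + σ_B² + σ_Δ².

```python
import math

def helper_rates(var_a, var_b, var_noise):
    """Helper rates (in bits) for the Gaussian test channel V = A + B + Z + Delta:
    r_dec = I(V;Y) - I(V;Z), the common-message rate of eq. (25);
    r_enc = I(V;Y) - I(V;X), the binned rate toward the encoder of eq. (28)."""
    r_dec = 0.5 * math.log2((var_a + var_b + var_noise) / var_noise)
    r_enc = 0.5 * math.log2((var_b + var_noise) / var_noise)
    return r_dec, r_enc

r_dec, r_enc = helper_rates(1.0, 0.5, 0.25)
# The encoder already knows X, so it needs strictly less help than the decoder.
assert r_enc < r_dec
```

The gap r_dec − r_enc = (1/2) log((σ_A² + σ_B² + σ_Δ²)/(σ_B² + σ_Δ²)) quantifies the saving that binning against X provides in this example.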

III. CASCADE SOURCE CODING WITH A HELPER

In this section we study the problem of cascade source coding with distortion, where a helper sends a common message to the encoder and two decoders, as depicted in Figure 2. In general, we have two reconstruction alphabets X̂_i and two distortion measures, d_i : X × X̂_i → R_+, where reconstruction alphabet i and distortion measure i are used at decoder i, i = 1, 2.

Definition 5: An (n, M_1, M_2, M', D_1, D_2) cascade code for source X consists of three encoders f_1, f_2, and f' and two decoders g_1, g_2,

f' : Y^n → {1, 2, ..., M'}
f_1 : X^n × {1, 2, ..., M'} → {1, 2, ..., M_1}
f_2 : {1, 2, ..., M_1} × {1, 2, ..., M'} → {1, 2, ..., M_2}
g_1 : {1, 2, ..., M_1} × {1, 2, ..., M'} → X̂_1^n
g_2 : {1, 2, ..., M_2} × {1, 2, ..., M'} → X̂_2^n    (31)

(1/n) E d_i(X^n, X̂_i^n) ≤ D_i,  i = 1, 2.    (32)

The rates of the code are R' = (1/n) log M' and R_i = (1/n) log M_i, i = 1, 2. The achievable rates and the operational rate region R^O_c(D_1, D_2) at distortions (D_1, D_2) are defined as usual. The subscript c stands here for cascade coding. Define the region R_c(D_1, D_2) as the collection of all rate triples (R_1, R_2, R') such that

R' ≥ I(Y; V),    (33)
R_1 ≥ I(X; X̂_1, X̂_2 | V),    (34)
R_2 ≥ I(X; X̂_2 | V),    (35)

for some joint distribution p(x, y, v, x̂_1, x̂_2) satisfying

p(x, y, v, x̂_1, x̂_2) = p(x, y) p(v|y) p(x̂_1, x̂_2 | v, x),    (37)

D_i ≥ E d_i(X, X̂_i),  i = 1, 2.    (38)

As with the region R(D, Z), it can be shown also here that R_c(D_1, D_2) is convex, and the alphabet of V can be restricted to be of size at most |Y| + 4, where we use |Y| − 1 constraints to preserve the distribution of Y, plus 5 to preserve the mutual information functions in the definition of R_c(D_1, D_2) and the two distortions. In addition, the region is invariant to whether we use a distribution of the form (37), or allow for a dependence on Y in the last term, i.e.,

p(x, y, v, x̂_1, x̂_2) = p(x, y) p(v|y) p(x̂_1, x̂_2 | v, x, y).    (39)

The proof follows the proofs of parallel statements in Section II-A and is therefore omitted. The main result of this section is stated next.

Theorem 5: R^O_c(D_1, D_2) = R_c(D_1, D_2).

Due to lack of space, the proof is omitted.
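The region R_c(D_1, D_2) is defined by three mutual-information inequalities evaluated over product-form distributions, so for small alphabets its corner expressions can be computed by brute-force marginalization. The sketch below is our own: the doubly symmetric binary source, the BSC helper channel, and the deterministic reproductions are arbitrary assumptions for illustration, not choices made in the paper.

```python
import math
from collections import defaultdict

def cond_mi(joint, a, b, c):
    """I(A; B | C) in bits. joint maps outcome tuples to probabilities;
    a, b, c are coordinate-index lists selecting A, B, and C."""
    pab, pa, pb, pc = (defaultdict(float) for _ in range(4))
    for t, p in joint.items():
        ka, kb, kc = (tuple(t[i] for i in idx) for idx in (a, b, c))
        pab[(ka, kb, kc)] += p; pa[(ka, kc)] += p
        pb[(kb, kc)] += p; pc[kc] += p
    return sum(p * math.log2(p * pc[kc] / (pa[(ka, kc)] * pb[(kb, kc)]))
               for (ka, kb, kc), p in pab.items() if p > 0)

# Doubly symmetric binary source p(x, y), helper output v = y through a BSC(q),
# and one deterministic choice of reproductions: hat_x1 = x, hat_x2 = v.
p_xy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
q = 0.1
joint = defaultdict(float)  # coordinates: (x, y, v, hat_x1, hat_x2)
for (x, y), pxy in p_xy.items():
    for v in (0, 1):
        joint[(x, y, v, x, v)] += pxy * ((1 - q) if v == y else q)

r_helper = cond_mi(joint, [1], [2], [])  # I(Y; V), the helper-rate bound
r1 = cond_mi(joint, [0], [3, 4], [2])    # I(X; hat_X1, hat_X2 | V)
r2 = cond_mi(joint, [0], [4], [2])       # I(X; hat_X2 | V); 0 here since hat_x2 = v
```

Sweeping over test channels p(v|y) and p(x̂_1, x̂_2 | v, x) in this way traces out an inner approximation of R_c(D_1, D_2) for a discrete source.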
APPENDIX

Proof of Lemma 3: Since (W, X, Z) are jointly Gaussian, we have E[X | W, Z] = αW + βZ for some scalars α, β. Furthermore, we have

X = αW + βZ + N,    (40)

where N is a Gaussian random variable, independent of (W, Z), with zero mean and variance σ²_{X|W,Z}. Since W is known to the encoder and decoder, we can subtract αW from X, and then, using Wyner-Ziv coding for the Gaussian case [7], we obtain

R = (1/2) log ( σ²_{X|W,Z} / D ).    (41)

Obviously, one cannot achieve a smaller rate even if Z is known both to the encoder and decoder, and therefore this is the rate-distortion function.

REFERENCES

[1] R. Ahlswede and J. Körner, "Source coding with side information and a converse for the degraded broadcast channel," IEEE Trans. Inf. Theory, vol. IT-21, no. 6, pp. 629-637, November 1975.
[2] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic, New York, 1981.
[3] A. Kaspi, "Rate-distortion for correlated sources with partially separated encoders," Ph.D. dissertation, School of Electrical Engineering, Cornell University, Ithaca, NY, 1979.
[4] A. Kaspi and T. Berger, "Rate-distortion for correlated sources with partially separated encoders," IEEE Trans. Inf. Theory, vol. IT-28, no. 6, November 1982.
[5] D. Vasudevan and E. Perron, "Cooperative source coding with encoder breakdown," in Proc. IEEE Int. Symp. Information Theory, Nice, France, June 2007.
[6] A. D. Wyner, "On source coding with side information at the decoder," IEEE Trans. Inf. Theory, vol. IT-21, no. 3, pp. 294-300, May 1975.
[7] A. D. Wyner, "The rate-distortion function for source coding with side information at the decoder-II: General sources," Information and Control, vol. 38, pp. 60-80, 1978.
[8] A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, vol. IT-22, no. 1, pp. 1-10, January 1976.
[9] H. Yamamoto, "Source coding theory for cascade and branching communication systems," IEEE Trans. Inf. Theory, vol. IT-27, no. 3, May 1981.


More information

An Outer Bound for the Gaussian. Interference channel with a relay.

An Outer Bound for the Gaussian. Interference channel with a relay. An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

3238 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 60, NO. 6, JUNE 2014

3238 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 60, NO. 6, JUNE 2014 3238 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 60, NO. 6, JUNE 2014 The Lossy Common Information of Correlated Sources Kumar B. Viswanatha, Student Member, IEEE, Emrah Akyol, Member, IEEE, and Kenneth

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

Generalized Writing on Dirty Paper

Generalized Writing on Dirty Paper Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

A Formula for the Capacity of the General Gel fand-pinsker Channel

A Formula for the Capacity of the General Gel fand-pinsker Channel A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore

More information

Distributed Lossy Interactive Function Computation

Distributed Lossy Interactive Function Computation Distributed Lossy Interactive Function Computation Solmaz Torabi & John MacLaren Walsh Dept. of Electrical and Computer Engineering Drexel University Philadelphia, PA 19104 Email: solmaz.t@drexel.edu &

More information

On the Capacity of the Interference Channel with a Relay

On the Capacity of the Interference Channel with a Relay On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122 Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122 M Definitions and Problem Setup 2 X n Y n Encoder p(y x) Decoder ˆM Message Channel Estimate Definition 11. Discrete Memoryless Channel

More information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information

Duality Between Channel Capacity and Rate Distortion With Two-Sided State Information IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student

More information

Reliable Computation over Multiple-Access Channels

Reliable Computation over Multiple-Access Channels Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE

On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE 3284 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 7, JULY 2009 On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, Michelle Effros, Fellow, IEEE Abstract This paper considers

More information

On Multiple User Channels with State Information at the Transmitters

On Multiple User Channels with State Information at the Transmitters On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu

More information

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,

More information

On Function Computation with Privacy and Secrecy Constraints

On Function Computation with Privacy and Secrecy Constraints 1 On Function Computation with Privacy and Secrecy Constraints Wenwen Tu and Lifeng Lai Abstract In this paper, the problem of function computation with privacy and secrecy constraints is considered. The

More information

Wyner-Ziv Coding over Broadcast Channels: Digital Schemes

Wyner-Ziv Coding over Broadcast Channels: Digital Schemes Wyner-Ziv Coding over Broadcast Channels: Digital Schemes Jayanth Nayak, Ertem Tuncel, Deniz Gündüz 1 Abstract This paper addresses lossy transmission of a common source over a broadcast channel when there

More information

A Summary of Multiple Access Channels

A Summary of Multiple Access Channels A Summary of Multiple Access Channels Wenyi Zhang February 24, 2003 Abstract In this summary we attempt to present a brief overview of the classical results on the information-theoretic aspects of multiple

More information

Solutions to Homework Set #2 Broadcast channel, degraded message set, Csiszar Sum Equality

Solutions to Homework Set #2 Broadcast channel, degraded message set, Csiszar Sum Equality 1st Semester 2010/11 Solutions to Homework Set #2 Broadcast channel, degraded message set, Csiszar Sum Equality 1. Convexity of capacity region of broadcast channel. Let C R 2 be the capacity region of

More information

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

Source-Channel Coding Theorems for the Multiple-Access Relay Channel Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access

More information

Design of Optimal Quantizers for Distributed Source Coding

Design of Optimal Quantizers for Distributed Source Coding Design of Optimal Quantizers for Distributed Source Coding David Rebollo-Monedero, Rui Zhang and Bernd Girod Information Systems Laboratory, Electrical Eng. Dept. Stanford University, Stanford, CA 94305

More information

ECE Information theory Final

ECE Information theory Final ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the

More information

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 1. Cascade of Binary Symmetric Channels The conditional probability distribution py x for each of the BSCs may be expressed by the transition probability

More information

The Sensor Reachback Problem

The Sensor Reachback Problem Submitted to the IEEE Trans. on Information Theory, November 2003. 1 The Sensor Reachback Problem João Barros Sergio D. Servetto Abstract We consider the problem of reachback communication in sensor networks.

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels Representation of Correlated s into Graphs for Transmission over Broadcast s Suhan Choi Department of Electrical Eng. and Computer Science University of Michigan, Ann Arbor, MI 80, USA Email: suhanc@eecs.umich.edu

More information

Amobile satellite communication system, like Motorola s

Amobile satellite communication system, like Motorola s I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired

More information

arxiv: v1 [cs.it] 4 Jun 2018

arxiv: v1 [cs.it] 4 Jun 2018 State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates

More information

Capacity bounds for multiple access-cognitive interference channel

Capacity bounds for multiple access-cognitive interference channel Mirmohseni et al. EURASIP Journal on Wireless Communications and Networking, :5 http://jwcn.eurasipjournals.com/content///5 RESEARCH Open Access Capacity bounds for multiple access-cognitive interference

More information

Lecture 20: Quantization and Rate-Distortion

Lecture 20: Quantization and Rate-Distortion Lecture 20: Quantization and Rate-Distortion Quantization Introduction to rate-distortion theorem Dr. Yao Xie, ECE587, Information Theory, Duke University Approimating continuous signals... Dr. Yao Xie,

More information

On Optimum Conventional Quantization for Source Coding with Side Information at the Decoder

On Optimum Conventional Quantization for Source Coding with Side Information at the Decoder On Optimum Conventional Quantization for Source Coding with Side Information at the Decoder by Lin Zheng A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the

More information

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1159 The Duality Between Information Embedding and Source Coding With Side Information and Some Applications Richard J. Barron, Member,

More information

Capacity Bounds for Diamond Networks

Capacity Bounds for Diamond Networks Technische Universität München Capacity Bounds for Diamond Networks Gerhard Kramer (TUM) joint work with Shirin Saeedi Bidokhti (TUM & Stanford) DIMACS Workshop on Network Coding Rutgers University, NJ

More information

Paul Cuff, Han-I Su, and Abbas EI Gamal Department of Electrical Engineering Stanford University {cuff, hanisu,

Paul Cuff, Han-I Su, and Abbas EI Gamal Department of Electrical Engineering Stanford University   {cuff, hanisu, Cascade Multiterminal Source Coding Paul Cuff, Han-I Su, and Abbas EI Gamal Department of Electrical Engineering Stanford University E-mail: {cuff, hanisu, abbas}@stanford.edu Abstract-We investigate distributed

More information

Coordination Capacity

Coordination Capacity 1 Coordination Capacity Paul Cuff, Member, IEEE, Haim Permuter, Member, IEEE, and Thomas M Cover, Fellow, IEEE arxiv:09092408v2 [csit] 26 May 2010 Abstract We develop elements of a theory of cooperation

More information

The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component

The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component 1 The Capacity Region of the Cognitive Z-interference Channel with One Noiseless Component Nan Liu, Ivana Marić, Andrea J. Goldsmith, Shlomo Shamai (Shitz) arxiv:0812.0617v1 [cs.it] 2 Dec 2008 Dept. of

More information

Interactive Decoding of a Broadcast Message

Interactive Decoding of a Broadcast Message In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,

More information

Extended Gray Wyner System with Complementary Causal Side Information

Extended Gray Wyner System with Complementary Causal Side Information Extended Gray Wyner System with Complementary Causal Side Information Cheuk Ting Li and Abbas El Gamal Department of Electrical Engineering, Stanford University Email: ctli@stanford.edu, abbas@ee.stanford.edu

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

On the Capacity Region of the Gaussian Z-channel

On the Capacity Region of the Gaussian Z-channel On the Capacity Region of the Gaussian Z-channel Nan Liu Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 74 nkancy@eng.umd.edu ulukus@eng.umd.edu

More information

Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions

Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions 2009 IEEE Information Theory Workshop Linearly Representable Entropy Vectors and their Relation to Network Coding Solutions Asaf Cohen, Michelle Effros, Salman Avestimehr and Ralf Koetter Abstract In this

More information

6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011

6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011 6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011 On the Structure of Real-Time Encoding and Decoding Functions in a Multiterminal Communication System Ashutosh Nayyar, Student

More information

Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel

Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Ahmad Abu Al Haija ECE Department, McGill University, Montreal, QC, Canada Email: ahmad.abualhaija@mail.mcgill.ca

More information

Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding

Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California - Santa Barbara {kumar,eakyol,rose}@ece.ucsb.edu

More information

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel

More information

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti

On Two-user Fading Gaussian Broadcast Channels. with Perfect Channel State Information at the Receivers. Daniela Tuninetti DIMACS Workshop on Network Information Theory - March 2003 Daniela Tuninetti 1 On Two-user Fading Gaussian Broadcast Channels with Perfect Channel State Information at the Receivers Daniela Tuninetti Mobile

More information

Equivalence for Networks with Adversarial State

Equivalence for Networks with Adversarial State Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department

More information

On the Capacity of the Two-Hop Half-Duplex Relay Channel

On the Capacity of the Two-Hop Half-Duplex Relay Channel On the Capacity of the Two-Hop Half-Duplex Relay Channel Nikola Zlatanov, Vahid Jamali, and Robert Schober University of British Columbia, Vancouver, Canada, and Friedrich-Alexander-University Erlangen-Nürnberg,

More information

EE5585 Data Compression May 2, Lecture 27

EE5585 Data Compression May 2, Lecture 27 EE5585 Data Compression May 2, 2013 Lecture 27 Instructor: Arya Mazumdar Scribe: Fangying Zhang Distributed Data Compression/Source Coding In the previous class we used a H-W table as a simple example,

More information

Lecture 15: Conditional and Joint Typicaility

Lecture 15: Conditional and Joint Typicaility EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

The Information Lost in Erasures Sergio Verdú, Fellow, IEEE, and Tsachy Weissman, Senior Member, IEEE

The Information Lost in Erasures Sergio Verdú, Fellow, IEEE, and Tsachy Weissman, Senior Member, IEEE 5030 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 11, NOVEMBER 2008 The Information Lost in Erasures Sergio Verdú, Fellow, IEEE, Tsachy Weissman, Senior Member, IEEE Abstract We consider sources

More information

arxiv: v1 [cs.it] 5 Feb 2016

arxiv: v1 [cs.it] 5 Feb 2016 An Achievable Rate-Distortion Region for Multiple Descriptions Source Coding Based on Coset Codes Farhad Shirani and S. Sandeep Pradhan Dept. of Electrical Engineering and Computer Science Univ. of Michigan,

More information

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract

More information

Solutions to Homework Set #1 Sanov s Theorem, Rate distortion

Solutions to Homework Set #1 Sanov s Theorem, Rate distortion st Semester 00/ Solutions to Homework Set # Sanov s Theorem, Rate distortion. Sanov s theorem: Prove the simple version of Sanov s theorem for the binary random variables, i.e., let X,X,...,X n be a sequence

More information

Quantization for Distributed Estimation

Quantization for Distributed Estimation 0 IEEE International Conference on Internet of Things ithings 0), Green Computing and Communications GreenCom 0), and Cyber-Physical-Social Computing CPSCom 0) Quantization for Distributed Estimation uan-yu

More information

THE multiple-access relay channel (MARC) is a multiuser

THE multiple-access relay channel (MARC) is a multiuser IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 60, NO. 10, OCTOBER 2014 6231 On Joint Source-Channel Coding for Correlated Sources Over Multiple-Access Relay Channels Yonathan Murin, Ron Dabora, and Deniz

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information