Problems We Can Solve With a Helper

ITW 2009, Volos, Greece, June 10-12, 2009

Haim Permuter, Ben-Gurion University of the Negev, haimp@bgu.ac.il
Yossef Steinberg, Technion - IIT, ysteinbe@ee.technion.ac.il
Tsachy Weissman, Technion/Stanford University, tsachy@stanford.edu

(The work of Y. Steinberg was supported by THE ISRAEL SCIENCE FOUNDATION, grant No. 280/07.)

Abstract — In this work we study source coding problems in which a helper provides rate-limited side information to the involved parties. We first consider the Wyner-Ziv problem where, in addition to the memoryless side information available to the decoder, a helper sends common, rate-limited side information to the encoder and decoder. A single-letter characterization of the achievable rates is derived under certain Markov conditions on the source and side information. We then examine the problem of cascade rate distortion with a helper. Partial results are derived also for the case where the side information is not necessarily common, i.e., when the helper can send different streams of coded side information to the involved parties.

[Figure 1: the source $X^n$ enters the encoder, which sends $T$ at rate $R$ to the decoder; the helper observes $Y^n$ and sends $T'$ at rate $R'$ to both the encoder and the decoder; the decoder also observes $Z^n$ and outputs $\hat{X}^n$.]
Fig. 1. The rate distortion problem with a helper Y, and additional side information Z known only to the decoder. We assume that the side information Z and the helper Y are independent given the source X.

I. INTRODUCTION

In this work we study problems of lossy source coding with rate-limited side information. The first problem, depicted in Figure 1, is an extension of the Wyner-Ziv problem [8], as described next. The source, helper, and side information sequences $X_i$, $Y_i$, and $Z_i$ are independent and identically distributed (i.i.d.), with generic distribution $(X_i, Y_i, Z_i) \sim P_{X,Y,Z}$. We assume throughout that $P_{X,Y,Z} = P_{Y|X} P_{Z|X} P_X$; that is, the Markov relation Y − X − Z holds. The coding scheme proceeds as follows. At the first stage, the helper compresses the side information to rate $R'$ and sends the resulting stream $T'$ of coded bits to both the encoder and the decoder. At the second stage, the encoder compresses the source to rate $R$, based on the vector $X^n$ and on the stream $T'$ that arrived from the helper; denote the resulting stream by $T$. The decoder then constructs the estimate $\hat{X}^n$ from the codewords arriving from the helper and the encoder, $T'$ and $T$ respectively, and from the memoryless side information $Z^n$. We are interested in the region of all achievable rate and distortion triples $(R, R', D)$, where $D$ is the distortion between the reproduction $\hat{X}^n$ and the source $X^n$.
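As a concrete illustration of this setup, the following minimal Python sketch samples i.i.d. triples from a joint law of the form $P_{X,Y,Z} = P_X P_{Y|X} P_{Z|X}$ and empirically checks the Markov relation Y − X − Z. The binary alphabets and the particular transition matrices are illustrative assumptions, not taken from the paper.

```python
# Sample i.i.d. (X_i, Y_i, Z_i) with Y - X - Z: draw X, then draw Y and Z
# conditionally independently given X. All pmfs below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

p_x = np.array([0.5, 0.5])             # P_X over {0, 1}
p_y_x = np.array([[0.9, 0.1],          # P_{Y|X}: row x, column y
                  [0.2, 0.8]])
p_z_x = np.array([[0.8, 0.2],          # P_{Z|X}: row x, column z
                  [0.3, 0.7]])

x = rng.choice(2, size=n, p=p_x)
y = np.array([rng.choice(2, p=p_y_x[xi]) for xi in x])
z = np.array([rng.choice(2, p=p_z_x[xi]) for xi in x])

# Empirical check of the Markov structure: p(y, z | x) should factor.
for xi in range(2):
    mask = x == xi
    joint = np.histogram2d(y[mask], z[mask], bins=2)[0] / mask.sum()
    gap = np.abs(joint - np.outer(joint.sum(1), joint.sum(0))).max()
    print(f"x = {xi}: max |p(y,z|x) - p(y|x)p(z|x)| = {gap:.3f}")  # ~0
```

Any joint law that factors this way satisfies the standing assumption of the paper, so a generator of this form can serve as a test source for the coding schemes discussed below.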
Several related problems have been studied in the literature. Wyner [6] studied a problem of network source coding with compressed side information that is provided only to the decoders. A special case of his model is the system in Figure 1, without the memoryless side information Z, and where the stream $T'$ arrives only at the decoder. For the special case where the source X has to be reconstructed losslessly, a full characterization of the achievable region can be deduced from the results of [6]; this problem was solved independently by Ahlswede and Körner [1]. The extension of these results to lossy reconstruction of X remains open. Kaspi [3] and Kaspi and Berger [4] derived an achievable region for a problem that contains the helper problem with degenerate Z as a special case; however, their converse part does not match. In [5], Vasudevan and Perron describe a general rate distortion problem with encoder breakdown. For some choice of distortions, which they term Setup B, their model reduces to our model with degenerate side information Z. In particular, Theorem 3 of [5] provides a single-letter characterization of the achievable rates and distortions for their Setup B.

The second problem we study in this work is cascade source coding with rate-limited side information, as depicted in Figure 2. It is an extension of Yamamoto's cascade source coding model [9] to the case where a helper sends compressed side information to the encoder and the two decoders. The communication protocol is as described above for the Wyner-Ziv problem: first, the helper compresses the side information into a stream $T'$ of bits at rate $R'$, and $T'$ is sent to all the parties involved. The encoder then compresses the source $X^n$ and sends the codeword $T_1$, at rate $R_1$, to the first decoder, Decoder 1. Upon receiving $T_1$ and $T'$, this decoder produces an estimate $\hat{X}_1^n$ of $X^n$. In addition, it acts as an encoder for the next stage: based on $T'$ and $T_1$, Decoder 1 produces a codeword $T_2$ at rate $R_2$ and sends it to the second decoder, which in turn produces the estimate $\hat{X}_2^n$. Denote by $D_1$ (resp. $D_2$) the distortion between $\hat{X}_1^n$ (resp. $\hat{X}_2^n$) and the source $X^n$. The problem we treat in this context is the characterization of all achievable quintuples $(R', R_1, R_2, D_1, D_2)$.

[Figure 2: the source $X^n$ enters the encoder, which sends $T_1$ at rate $R_1$ to Decoder 1; Decoder 1 outputs $\hat{X}_1^n$ and sends $T_2$ at rate $R_2$ to Decoder 2, which outputs $\hat{X}_2^n$; the helper sends $T'$ at rate $R'$ to the encoder and to both decoders.]
Fig. 2. The cascade rate distortion problem with a helper Y.

This work is organized as follows. In Section II-A we define the problem of rate distortion with a helper and decoder side information, and derive the main result. The Gaussian example is given in Section II-B. Section II-C treats the case where independent rates are allowed, and gives partial results. The problem of cascade source coding with a helper is defined and treated in Section III.

II. RATE DISTORTION WITH HELPER AND SIDE INFORMATION

In this section we consider rate distortion with a helper and additional side information Z, known only to the decoder, as shown in Fig. 1. We assume that the source X, the helper Y, and the side information Z form the Markov chain Y − X − Z.

A. Definitions and main result

We are given a source X, distributed according to $P_X$ on a finite set $\mathcal{X}$. The reconstruction alphabet is denoted by $\hat{\mathcal{X}}$, and the distortion measure by $d : \mathcal{X} \times \hat{\mathcal{X}} \to \mathbb{R}^+$. The distortion between vectors is defined as usual, as the normalized sum of single-letter distortions. The formal definition of a code is the following.

Definition 1: An $(n, M, M', D)$ code for source X with helper and side information Z consists of two encoders $f'$, $f$ and a decoder $g$,
$f' : \mathcal{Y}^n \to \{1, 2, \ldots, M'\}$
$f : \mathcal{X}^n \times \{1, 2, \ldots, M'\} \to \{1, 2, \ldots, M\}$
$g : \{1, 2, \ldots, M\} \times \{1, 2, \ldots, M'\} \times \mathcal{Z}^n \to \hat{\mathcal{X}}^n$   (1)
such that
$\mathbb{E}\, d(X^n, \hat{X}^n) \le D$.   (2)

A triple $(R, R', D)$ is said to be achievable if for any $\delta > 0$, $\epsilon > 0$, and sufficiently large n, there exists an $(n, 2^{n(R+\delta)}, 2^{n(R'+\delta)}, D + \epsilon)$ code for the source X with helper and side information Z. For a given distortion level D, the operational achievable region, denoted $\mathcal{R}_O(D, Z)$, is the set of all rate pairs $(R, R')$ such that the triple $(R, R', D)$ is achievable.

Let $\mathcal{R}(D, Z)$ be the set of all rate pairs $(R, R')$ that satisfy
$R \ge I(X; U | V, Z)$,   (3)
$R' \ge I(V; Y | Z)$,   (4)
for some joint distribution of the form
$p(x, y, z, u, v) = p(x, y)\, p(z|x)\, p(v|y)\, p(u|x, v)$,   (5)
$\mathbb{E}\, d(X, \hat{X}(U, V, Z)) \le D$,   (6)
where U and V are auxiliary random variables, and the reconstruction variable $\hat{X}$ is a deterministic function of the triple $(U, V, Z)$. The next lemma states properties of $\mathcal{R}(D, Z)$.

Lemma 1: 1) The region $\mathcal{R}(D, Z)$ is convex. 2) To exhaust $\mathcal{R}(D, Z)$, it is enough to restrict the alphabets of V and U to satisfy
$|\mathcal{V}| \le |\mathcal{Y}| + 2$, $|\mathcal{U}| \le |\mathcal{X}|\,(|\mathcal{Y}| + 2) + 1$.   (7)
The proof is omitted. We now state the main result of this section.

Theorem 1: Provided the Markov chain Y − X − Z holds,
$\mathcal{R}_O(D, Z) = \mathcal{R}(D, Z)$.   (8)
The proof of Theorem 1 is omitted, due to lack of space.
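Since the region in (3)-(6) is specified by two conditional mutual informations evaluated under the factorization (5), a given choice of auxiliaries can be checked mechanically. The sketch below is an illustration only: the binary alphabets, the randomly drawn component pmfs, and the helper function cond_mi are our assumptions, not part of the paper.

```python
# Assemble p(x,y,z,u,v) = p(x,y) p(z|x) p(v|y) p(u|x,v) as in (5) and
# evaluate the rate bounds (3) and (4) for that choice of (U, V).
import numpy as np

def cond_mi(p, A, B, C):
    """I(A; B | C) in bits; p is a joint pmf ndarray, A/B/C are axis tuples."""
    def marg(keep):
        keep = tuple(sorted(keep))
        drop = tuple(ax for ax in range(p.ndim) if ax not in keep)
        return np.asarray(p.sum(axis=drop)), keep
    tabs = [marg(A + B + C), marg(A + C), marg(B + C), marg(C)]
    total = 0.0
    for idx in np.ndindex(p.shape):
        if p[idx] == 0.0:
            continue
        (pabc, k1), (pac, k2), (pbc, k3), (pc, k4) = tabs
        num = pabc[tuple(idx[a] for a in k1)] * pc[tuple(idx[a] for a in k4)]
        den = pac[tuple(idx[a] for a in k2)] * pbc[tuple(idx[a] for a in k3)]
        total += p[idx] * np.log2(num / den)
    return total

rng = np.random.default_rng(1)
p_xy = rng.dirichlet(np.ones(4)).reshape(2, 2)     # p(x, y)
p_z_x = rng.dirichlet(np.ones(2), size=2)          # p(z | x)
p_v_y = rng.dirichlet(np.ones(2), size=2)          # p(v | y)
p_u_xv = rng.dirichlet(np.ones(2), size=(2, 2))    # p(u | x, v)

p = np.zeros((2, 2, 2, 2, 2))                      # axes: x, y, z, u, v
for x, y, z, u, v in np.ndindex(p.shape):
    p[x, y, z, u, v] = (p_xy[x, y] * p_z_x[x, z] * p_v_y[y, v]
                        * p_u_xv[x, v, u])

R_src = cond_mi(p, (0,), (3,), (4, 2))     # I(X; U | V, Z), bound (3)
R_hlp = cond_mi(p, (4,), (1,), (2,))       # I(V; Y | Z),    bound (4)
print(f"R >= {R_src:.4f} bits, R' >= {R_hlp:.4f} bits")
```

Sweeping over choices of $p(v|y)$ and $p(u|x, v)$, subject to the cardinality bounds of Lemma 1, and keeping the pairs that meet the distortion constraint (6), traces out an inner approximation of $\mathcal{R}(D, Z)$ for a given source.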
Although the full proof is omitted, we describe here a side result about the region $\mathcal{R}(D, Z)$ which is essential in the proof of the converse part. Define an additional region $\mathcal{R}'(D, Z)$ as the set of all rate pairs $(R, R')$ satisfying (3), (4), and (6) for some distribution of the form
$p(x, y, z, u, v) = p(x, y)\, p(z|x)\, p(v|y)\, p(u|x, v, y)$,   (9)
where U, V are auxiliary random variables, and the reconstruction variable $\hat{X}$ is a deterministic function of the triple $(U, V, Z)$. Note that the only difference between $\mathcal{R}(D, Z)$ and $\mathcal{R}'(D, Z)$ is in the joint distributions over which the region is computed: in the composition (5) the external random variable U is independent of Y when conditioned on $(X, V)$, that is, the Markov chain U − (X, V) − Y is imposed, whereas in the definition of $\mathcal{R}'(D, Z)$ this Markov structure is not imposed. In the proof of Theorem 1 we show that $\mathcal{R}(D, Z)$ is achievable and that $\mathcal{R}'(D, Z)$ is an outer bound, and we conclude the proof by applying the following lemma, which states that the two regions are equal.

Lemma 2: $\mathcal{R}(D, Z) = \mathcal{R}'(D, Z)$.

Proof: First we notice that $\mathcal{R}(D, Z) \subseteq \mathcal{R}'(D, Z)$, since every distribution of the form (5) is also of the form (9). We now prove that $\mathcal{R}'(D, Z) \subseteq \mathcal{R}(D, Z)$. Let $(R, R') \in \mathcal{R}'(D, Z)$, and let
$p(x, y, z, u, v) = p(x, y)\, p(z|x)\, p(v|y)\, p(u|x, v, y)$   (10)
be a distribution for which (3) and (4) hold. It is shown next that there exists a distribution of the form (5) under which (3) and (4) still hold. Let
$p'(x, y, z, u, v) = p(x, y, z)\, p(v|y)\, p(u|x, v)$,   (11)
where $p(u|x, v)$ is induced by $p(x, y, z, u, v)$. We now show that the terms $I(V; Y | Z)$, $I(X; U | Z, V)$, and $\mathbb{E}\, d(X, \hat{X}(U, V, Z))$ are the same whether we evaluate them under the joint distribution $p'(x, y, z, u, v)$ of (11) or under $p(x, y, z, u, v)$; hence $(R, R') \in \mathcal{R}(D, Z)$. To show that these terms coincide, it is enough to show that the marginal distributions $p'(y, z, v)$ and $p'(x, z, u, v)$ induced by $p'(x, y, z, u, v)$ equal the marginal distributions $p(y, z, v)$ and $p(x, z, u, v)$ induced by $p(x, y, z, u, v)$. Clearly $p'(y, v, z) = p(y, v, z)$. In the rest of the proof we show that $p'(x, z, u, v) = p(x, z, u, v)$. A distribution of the form (10) implies that the Markov chain U − (X, V) − Z holds, since
$p(z | x, u, v) = \sum_y p(z | x, u, v, y)\, p(y | x, u, v) = \sum_y p(z | x, v)\, p(y | x, u, v) = p(z | x, v)$,   (12)
where the middle equality holds because, under (10), Z depends on the remaining variables only through X. Therefore $p(u | x, v, z) = p(u | x, v)$. Now consider $p(x, z, u, v) = p(x, z, v)\, p(u | x, v, z)$; since $p'(x, z, v) = p(x, z, v)$ and $p'(u | x, v) = p(u | x, v)$, we conclude that $p'(x, z, u, v) = p(x, z, u, v)$.
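The marginal-preservation step at the heart of this proof is easy to confirm numerically. In the sketch below (binary alphabets and randomly drawn pmfs, chosen for illustration only), p is built with the extra dependence of U on Y allowed by (9)-(10), p′ is built from the induced $p(u|x, v)$ as in (11), and the two marginals that determine (3), (4), and (6) coincide.

```python
# Build p of the form (10) (u may depend on y), derive p' of the form (11),
# and check that p(y,z,v) and p(x,z,u,v) are unchanged.
import numpy as np

rng = np.random.default_rng(2)
p_xy = rng.dirichlet(np.ones(4)).reshape(2, 2)       # p(x, y)
p_z_x = rng.dirichlet(np.ones(2), size=2)            # p(z | x)
p_v_y = rng.dirichlet(np.ones(2), size=2)            # p(v | y)
p_u_xvy = rng.dirichlet(np.ones(2), size=(2, 2, 2))  # p(u | x, v, y)

p = np.zeros((2, 2, 2, 2, 2))                        # axes: x, y, z, u, v
for x, y, z, u, v in np.ndindex(p.shape):
    p[x, y, z, u, v] = (p_xy[x, y] * p_z_x[x, z] * p_v_y[y, v]
                        * p_u_xvy[x, v, y, u])

p_xuv = p.sum(axis=(1, 2))                           # axes: x, u, v
p_u_xv = p_xuv / p_xuv.sum(axis=1, keepdims=True)    # induced p(u | x, v)

pp = np.zeros_like(p)                                # p' of (11)
for x, y, z, u, v in np.ndindex(p.shape):
    pp[x, y, z, u, v] = (p_xy[x, y] * p_z_x[x, z] * p_v_y[y, v]
                         * p_u_xv[x, u, v])

assert np.allclose(p.sum(axis=(0, 3)), pp.sum(axis=(0, 3)))  # p(y, z, v)
assert np.allclose(p.sum(axis=1), pp.sum(axis=1))            # p(x, z, u, v)
print("marginals p(y,z,v) and p(x,z,u,v) preserved")
```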

B. The Gaussian case

In this subsection we consider the Gaussian instance of Theorem 1. Since (X, Y, Z) form the Markov chain Y − X − Z, we may assume, without loss of generality, that $X = Z + A$ and $Y = Z + A + B$, where the random variables A, B, Z are zero-mean Gaussian and independent of each other, with $E[A^2] = \sigma_A^2$, $E[B^2] = \sigma_B^2$, and $E[Z^2] = \sigma_Z^2$. The Gaussian example with a helper and without side information Z was solved in [5]; their result will be used in the sequel to establish our converse for the Gaussian model. The following theorem establishes the rate region of the Gaussian case.

Theorem 2: The achievable rate region for the Gaussian problem is the set of all $(R, R', D)$ satisfying
$R \ge \frac{1}{2} \log \left[ \frac{\sigma_A^2}{D} \left( 1 - \frac{\sigma_A^2 \left( 1 - 2^{-2R'} \right)}{\sigma_A^2 + \sigma_B^2} \right) \right] = \frac{1}{2} \log \left[ \frac{\sigma_A^2}{D} \cdot \frac{\sigma_B^2 + \sigma_A^2 2^{-2R'}}{\sigma_A^2 + \sigma_B^2} \right]$.   (13)

It is interesting to note that the rate region does not depend on $\sigma_Z^2$. Furthermore, we show in the proof that for the Gaussian case the rate region is the same as when Z is known also at the source encoder and at the helper.
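The next sketch (an illustration with arbitrary parameter values, not the authors' code) evaluates the right-hand side of (13) and confirms two limiting cases that can be read off the formula: at $R' = 0$ it reduces to the plain rate-distortion function $\frac{1}{2}\log(\sigma_A^2/D)$, and as $R' \to \infty$ it tends to the rate achievable when Y is available losslessly. Consistent with the remark above, $\sigma_Z^2$ never enters.

```python
# Minimal source rate R as a function of the helper rate R', from (13).
import numpy as np

def R_min(D, Rp, var_a, var_b):
    """Right-hand side of (13), in bits; independent of sigma_Z^2."""
    shrink = 1.0 - var_a * (1.0 - 2.0 ** (-2.0 * Rp)) / (var_a + var_b)
    return max(0.0, 0.5 * np.log2(var_a * shrink / D))

var_a, var_b, D = 1.0, 0.5, 0.1

# R' = 0: no helper, plain Gaussian rate-distortion function.
assert np.isclose(R_min(D, 0.0, var_a, var_b), 0.5 * np.log2(var_a / D))

# R' -> infinity: V -> Y, and the bound tends to
# (1/2) log2( var_a * var_b / ((var_a + var_b) * D) ).
limit = 0.5 * np.log2(var_a * var_b / ((var_a + var_b) * D))
assert np.isclose(R_min(D, 30.0, var_a, var_b), limit, atol=1e-6)

for Rp in (0.0, 0.5, 1.0, 2.0):
    print(f"R' = {Rp:3.1f}: R >= {R_min(D, Rp, var_a, var_b):.4f} bits")
```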
Proof of Theorem 2:

Converse: Assume that both encoders also observe $Z^n$. Without loss of generality, the encoders can subtract Z from X and Y; hence the problem is equivalent to a new rate distortion problem with a helper, where the source is A and the helper observes A + B. Using the result for the Gaussian case from [5], adapted to our notation, we obtain
$R \ge \frac{1}{2} \log \left[ \frac{\sigma_A^2}{D} \left( 1 - \frac{\sigma_A^2 \left( 1 - 2^{-2R'} \right)}{\sigma_A^2 + \sigma_B^2} \right) \right]$.   (14)

Achievability: Before proving the direct part of Theorem 2, we establish the following lemma, which is proved in the Appendix.

Lemma 3 (Gaussian Wyner-Ziv rate distortion with additional side information known to the encoder and decoder): Let (X, W, Z) be jointly Gaussian. Consider the Wyner-Ziv rate distortion problem where the source X is to be compressed under a quadratic distortion measure, W is available at the encoder and decoder, and Z is available only at the decoder. The rate-distortion function for this problem is given by
$R(D) = \frac{1}{2} \log \frac{\sigma_{X|W,Z}^2}{D}$,   (15)
where $\sigma_{X|W,Z}^2 = E\left[ (X - E[X | W, Z])^2 \right]$, i.e., the minimum mean square error in estimating X from (W, Z).

Let $V = A + B + Z + \Delta$, where $\Delta \sim N(0, \sigma_\Delta^2)$ is independent of (A, B, Z). Clearly, we have V − Y − X − Z. Now, let us generate V at the source encoder and at the decoder using the achievability scheme of Wyner [7]. Since $I(V; Z) \le I(V; X)$, a rate $R' = I(V; Y) - I(V; Z)$ suffices, and it may be expressed as follows:
$R' = I(V; Y | Z) = h(V | Z) - h(V | Y) = \frac{1}{2} \log \frac{\sigma_A^2 + \sigma_B^2 + \sigma_\Delta^2}{\sigma_\Delta^2}$,   (16)
which implies that
$\sigma_\Delta^2 = \frac{\sigma_A^2 + \sigma_B^2}{2^{2R'} - 1}$.   (17)
Now we invoke Lemma 3, with V playing the role of the side information known to both the encoder and the decoder; hence any rate satisfying the following inequality achieves distortion D:
$R \ge \frac{1}{2} \log \frac{\sigma_{X|V,Z}^2}{D} = \frac{1}{2} \log \left[ \frac{\sigma_A^2}{D} \left( 1 - \frac{\sigma_A^2}{\sigma_A^2 + \sigma_B^2 + \sigma_\Delta^2} \right) \right]$.   (18)
Finally, substituting the identity (17) for $\sigma_\Delta^2$ in (18), we obtain (14).
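The final substitution is mechanical, and a quick numeric check (illustrative parameter draws, not from the paper) confirms that plugging the helper-noise variance (17) into the Wyner-Ziv bound (18) reproduces the region formula (13)/(14):

```python
import numpy as np

rng = np.random.default_rng(3)
for _ in range(5):
    var_a, var_b, D = rng.uniform(0.2, 2.0, size=3)
    Rp = rng.uniform(0.1, 3.0)                               # helper rate R'
    var_delta = (var_a + var_b) / (2.0 ** (2.0 * Rp) - 1.0)  # eq. (17)
    bound_18 = 0.5 * np.log2(
        var_a / D * (1.0 - var_a / (var_a + var_b + var_delta)))
    bound_13 = 0.5 * np.log2(
        var_a / D * (1.0 - var_a * (1.0 - 2.0 ** (-2.0 * Rp))
                     / (var_a + var_b)))
    assert np.isclose(bound_18, bound_13)
print("(17) substituted into (18) reproduces (13)/(14)")
```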

C. The case of independent rates

In this subsection we treat the rate distortion scenario where the side information from the helper is encoded into two different messages, possibly at different rates: one to the encoder and one to the decoder, as shown in Fig. 3. The complete characterization of the achievable rates for this scenario is still an open problem. However, the solution given in the previous sections, where there is a single message known to both the encoder and decoder, provides insight that allows us to solve several cases of the problem. We start with the definition of the general case; it follows Definition 1 quite closely, except that three rates are involved.

[Figure 3: the source $X^n$ enters the encoder, which sends $T$ at rate $R$ to the decoder; the helper observes $Y^n$ and sends $T_e$ at rate $R_e$ to the encoder and $T_d$ at rate $R_d$ to the decoder; the decoder also observes $Z^n$.]
Fig. 3. The rate distortion problem with decoder side information and independent helper rates. We assume the Markov relation Y − X − Z.

Definition 2: An $(n, M, M_e, M_d, D)$ code for source X with side information Y and different helper messages to the encoder and decoder consists of three encoders and a decoder,
$f_e : \mathcal{Y}^n \to \{1, 2, \ldots, M_e\}$
$f_d : \mathcal{Y}^n \to \{1, 2, \ldots, M_d\}$
$f : \mathcal{X}^n \times \{1, 2, \ldots, M_e\} \to \{1, 2, \ldots, M\}$   (19)
$g : \{1, 2, \ldots, M\} \times \{1, 2, \ldots, M_d\} \times \mathcal{Z}^n \to \hat{\mathcal{X}}^n$   (20)
such that
$\mathbb{E}\, d(X^n, \hat{X}^n) \le D$.   (21)
To avoid cumbersome statements, we will not repeat in the sequel the words "with different helper messages to the encoder and decoder," as this is the topic of this section and should be clear from the context. The rate triple $(R, R_e, R_d)$ of the $(n, M, M_e, M_d, D)$ code is given by
$R = \frac{1}{n} \log M$, $R_e = \frac{1}{n} \log M_e$, $R_d = \frac{1}{n} \log M_d$.   (22)

Definition 3: Given a distortion D, a rate triple $(R, R_e, R_d)$ is said to be achievable if for any $\delta > 0$ and sufficiently large n, there exists an $(n, 2^{n(R+\delta)}, 2^{n(R_e+\delta)}, 2^{n(R_d+\delta)}, D + \delta)$ code for the source X with side information Y.

Definition 4: The operational achievable region $\mathcal{R}_O^g(D, Z)$ of rate distortion with independent helper messages to the encoder and decoder is the closure of the set of all achievable rate triples at distortion D. Denote by $\mathcal{R}_O^g(R_e, R_d, D, Z)$ the section of $\mathcal{R}_O^g(D, Z)$ at helper rates $(R_e, R_d)$, that is,
$\mathcal{R}_O^g(R_e, R_d, D, Z) = \{R : (R, R_e, R_d) \text{ is achievable with distortion } D\}$,   (23)
and similarly denote by $\mathcal{R}(R', D, Z)$ the section at helper rate $R'$ of the region $\mathcal{R}(D, Z)$ defined by (3), (4), (5), and (6). Recall that, according to Theorem 1, $\mathcal{R}(R', D, Z)$ consists of all achievable source coding rates when the helper sends a common message to the source encoder and the destination at rate $R'$. The main result of this section is the following.

Theorem 3: For any $R_e \ge R_d$,
$\mathcal{R}_O^g(R_e, R_d, D, Z) = \mathcal{R}(R_d, D, Z)$.   (24)

Theorem 3 has interesting implications for the coding strategy of the helper. It says that no gain in performance can be achieved if the source encoder gets more help than the decoder at the destination, i.e., if $R_e > R_d$; thus we may restrict $R_e$ to be no higher than $R_d$. Moreover, in the case $R_e = R_d$, optimal performance is achieved when the helper sends exactly the same message to the encoder and the decoder. The proof of this statement uses operational arguments and is omitted here.

The statement of Theorem 3 can be extended to rates $R_e$ slightly lower than $R_d$. This extension is based on the simple observation that the source encoder knows X, which can serve as side information in decoding the message sent by the helper. Therefore, any message sent to the source encoder can undergo a stage of binning with respect to X. As an extreme example, consider the case where $R_e \ge H(Y|X)$. The source encoder can then fully recover Y, hence there is no advantage in transmitting to the encoder at rates higher than $H(Y|X)$; the decoder, on the other hand, can still benefit from rates in the interval $H(Y|X) < R_d < H(Y|Z)$. This rate interval is not empty, due to the Markov chain Y − X − Z. These observations are summarized in Theorem 4, stated after the following illustration.
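The rate interval $H(Y|X) < R_d < H(Y|Z)$ invoked above is nonempty because, under Y − X − Z, the encoder's observation X is at least as informative about Y as the decoder's observation Z. The toy computation below (binary distribution chosen for illustration only) makes the gap visible:

```python
# Compute H(Y|X) and H(Y|Z) for a binary chain Y - X - Z.
import numpy as np

def cond_entropy(p_ab):
    """H(A | B) in bits for a joint pmf with axes (a, b)."""
    p_b = p_ab.sum(axis=0)
    h = 0.0
    for a, b in np.ndindex(p_ab.shape):
        if p_ab[a, b] > 0:
            h -= p_ab[a, b] * np.log2(p_ab[a, b] / p_b[b])
    return h

p_x = np.array([0.5, 0.5])
p_y_x = np.array([[0.9, 0.1], [0.2, 0.8]])
p_z_x = np.array([[0.8, 0.2], [0.3, 0.7]])
p_xyz = np.einsum('x,xy,xz->xyz', p_x, p_y_x, p_z_x)   # p(x) p(y|x) p(z|x)

h_y_x = cond_entropy(p_xyz.sum(axis=2).T)   # joint of (y, x) -> H(Y|X)
h_y_z = cond_entropy(p_xyz.sum(axis=0))     # joint of (y, z) -> H(Y|Z)
print(f"H(Y|X) = {h_y_x:.4f} <= H(Y|Z) = {h_y_z:.4f}")
```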
Theorem 4:
1) Let (U, V) achieve a point $(R, R')$ in $\mathcal{R}(D, Z)$, i.e.,
$R = I(X; U | V, Z)$,
$R' = I(Y; V | Z) = I(V; Y) - I(V; Z)$,   (25)
$\mathbb{E}\, d(X, \hat{X}(U, V, Z)) \le D$,   (26)
V − Y − X − Z.   (27)
Then $(R, R_e, R') \in \mathcal{R}_O^g(D, Z)$ for every $R_e$ satisfying
$R_e \ge I(V; Y | Z) - I(V; X | Z) = I(V; Y) - I(V; X)$.   (28)
2) Let $(R, R')$ be an outer point of $\mathcal{R}(D, Z)$, that is,
$(R, R') \notin \mathcal{R}(D, Z)$.   (29)
Then $(R, R_e, R')$ is an outer point of $\mathcal{R}_O^g(D, Z)$ for any $R_e$, i.e.,
$(R, R_e, R') \notin \mathcal{R}_O^g(D, Z)$ for every $R_e$.   (30)

The proof of Part 1 is based on binning, as described above. In particular, observe that the rate $R_e$ given in (28) is lower than $R'$ of (25), due to the Markov chain V − Y − X − Z. Part 2 is a partial converse, and is a direct consequence of Theorem 3. The details, being straightforward, are omitted.

III. CASCADE SOURCE CODING WITH A HELPER

In this section we study the problem of cascade source coding with a helper, where the helper sends a common message to the encoder and to the two decoders, as depicted in Figure 2. In general, we have two reconstruction alphabets $\hat{\mathcal{X}}_i$ and two distortion measures $d_i : \mathcal{X} \times \hat{\mathcal{X}}_i \to \mathbb{R}^+$, where reconstruction alphabet i and distortion measure i are used at Decoder i, i = 1, 2.

Definition 5: An $(n, M_1, M_2, M', D_1, D_2)$ cascade code for source X consists of three encoders $f_1$, $f_2$, and $f'$ and two decoders $g_1$, $g_2$,
$f' : \mathcal{Y}^n \to \{1, 2, \ldots, M'\}$
$f_1 : \mathcal{X}^n \times \{1, 2, \ldots, M'\} \to \{1, 2, \ldots, M_1\}$
$f_2 : \{1, 2, \ldots, M_1\} \times \{1, 2, \ldots, M'\} \to \{1, 2, \ldots, M_2\}$
$g_1 : \{1, 2, \ldots, M_1\} \times \{1, 2, \ldots, M'\} \to \hat{\mathcal{X}}_1^n$
$g_2 : \{1, 2, \ldots, M_2\} \times \{1, 2, \ldots, M'\} \to \hat{\mathcal{X}}_2^n$   (31)
such that
$\mathbb{E}\, d_i(X^n, \hat{X}_i^n) \le D_i$, i = 1, 2.   (32)
The rates of the code are $R' = \frac{1}{n} \log M'$ and $R_i = \frac{1}{n} \log M_i$, i = 1, 2. The achievable rates and the operational rate region $\mathcal{R}_O^c(D_1, D_2)$ at distortions $(D_1, D_2)$ are defined as usual; the label c stands for cascade coding.

Define the region $\mathcal{R}_c(D_1, D_2)$ as the collection of all rate triples $(R_1, R_2, R')$ such that
$R' \ge I(Y; V)$,   (33)
$R_1 \ge I(X; \hat{X}_1, \hat{X}_2 | V)$,   (34)
$R_2 \ge I(X; \hat{X}_2 | V)$,   (35)
for some joint distribution $p(x, y, v, \hat{x}_1, \hat{x}_2)$ satisfying
$p(x, y, v, \hat{x}_1, \hat{x}_2) = p(x, y)\, p(v|y)\, p(\hat{x}_1, \hat{x}_2 | v, x)$,   (37)
$D_i \ge \mathbb{E}\, d_i(X, \hat{X}_i)$, i = 1, 2.   (38)

As with the region $\mathcal{R}(D, Z)$, it can be shown here too that $\mathcal{R}_c(D_1, D_2)$ is convex, and that the alphabet of V can be restricted to size at most $|\mathcal{Y}| + 4$, where we use $|\mathcal{Y}| - 1$ constraints to preserve the distribution of Y, plus 5 to preserve the mutual information functions in the definition of $\mathcal{R}_c(D_1, D_2)$ and the two distortions. In addition, the region is invariant to whether we use a distribution of the form (37) or allow a dependence on Y in the last term, i.e.,
$p(x, y, v, \hat{x}_1, \hat{x}_2) = p(x, y)\, p(v|y)\, p(\hat{x}_1, \hat{x}_2 | v, x, y)$.   (39)
The proof follows the proofs of the parallel statements in Section II-A and is therefore omitted. The main result of this section is stated next.

Theorem 5: $\mathcal{R}_O^c(D_1, D_2) = \mathcal{R}_c(D_1, D_2)$.
Due to lack of space, the proof is omitted.

APPENDIX

Proof of Lemma 3: Since (W, X, Z) are jointly Gaussian, we have $E[X | W, Z] = \alpha W + \beta Z$ for some scalars $\alpha$, $\beta$. Furthermore,
$X = \alpha W + \beta Z + N$,   (40)
where N is a zero-mean Gaussian random variable with variance $\sigma_{X|W,Z}^2$, independent of (W, Z). Since W is known to both the encoder and the decoder, we can subtract $\alpha W$ from X, and then, using Wyner-Ziv coding for the Gaussian case [7], we obtain
$R(D) = \frac{1}{2} \log \frac{\sigma_{X|W,Z}^2}{D}$.   (41)
Obviously, one cannot achieve a smaller rate even if Z is known to both the encoder and the decoder; therefore (41) is the rate-distortion function.
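As a numeric illustration of the quantity $\sigma_{X|W,Z}^2$ in (40)-(41) (a Monte-Carlo sketch under assumed parameter values, not the authors' code), the coefficients $\alpha$, $\beta$ can be estimated from sample covariances and the residual variance compared with the closed-form MMSE:

```python
import numpy as np

rng = np.random.default_rng(4)
var_a, var_b, var_z, var_d = 1.0, 0.5, 2.0, 0.3
n = 200_000

a = rng.normal(0.0, np.sqrt(var_a), n)
b = rng.normal(0.0, np.sqrt(var_b), n)
z = rng.normal(0.0, np.sqrt(var_z), n)
delta = rng.normal(0.0, np.sqrt(var_d), n)

x = z + a                  # source, as in Section II-B
w = z + a + b + delta      # plays the role of V = Y + Delta

# Linear MMSE estimate E[X | W, Z] = alpha*W + beta*Z, as in (40).
C = np.cov(np.vstack([w, z]))                          # covariance of (W, Z)
c = np.array([np.cov(x, w)[0, 1], np.cov(x, z)[0, 1]])
coeff = np.linalg.solve(C, c)                          # (alpha, beta)
mmse = x.var() - c @ coeff                             # sigma^2_{X|W,Z}

resid = x - np.vstack([w, z]).T @ coeff
print(mmse, resid.var())   # the two empirical estimates agree
print(var_a * (1.0 - var_a / (var_a + var_b + var_d)))   # closed form

D = 0.1
print(f"R(D={D}) = {0.5 * np.log2(mmse / D):.4f} bits")   # from (15)/(41)
```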
REFERENCES

[1] R. F. Ahlswede and J. Körner, "Source coding with side information and a converse for the degraded broadcast channel," IEEE Trans. Inf. Theory, vol. IT-21, no. 6, pp. 629-637, November 1975.
[2] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic Press, 1981.
[3] A. Kaspi, "Rate-distortion for correlated sources with partially separated encoders," Ph.D. dissertation, School of Electrical Engineering, Cornell University, Ithaca, NY, January 1979.
[4] A. Kaspi and T. Berger, "Rate-distortion for correlated sources with partially separated encoders," IEEE Trans. Inf. Theory, vol. IT-28, no. 6, pp. 828-840, November 1982.
[5] D. Vasudevan and E. Perron, "Cooperative source coding with encoder breakdown," in Proc. IEEE Int. Symp. Information Theory, Nice, France, June 2007, pp. 1766-1770.
[6] A. D. Wyner, "On source coding with side information at the decoder," IEEE Trans. Inf. Theory, vol. IT-21, no. 3, pp. 294-300, May 1975.
[7] A. D. Wyner, "The rate-distortion function for source coding with side information at the decoder-II: General sources," Information and Control, vol. 38, pp. 60-80, 1978.
[8] A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, vol. IT-22, no. 1, pp. 1-10, January 1976.
[9] H. Yamamoto, "Source coding theory for cascade and branching communication systems," IEEE Trans. Inf. Theory, vol. IT-27, no. 3, pp. 299-308, May 1981.