A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality


2012 IEEE Information Theory Workshop

Kumar Viswanatha, Emrah Akyol and Kenneth Rose
ECE Department, University of California - Santa Barbara
{kumar,eakyol,rose}@ece.ucsb.edu

Abstract: This paper addresses the L-channel multiple descriptions problem for a Gaussian source under the mean squared error (MSE) distortion metric. We focus on particular cross-sections of the general rate-distortion region where a subset of the 2^L - 1 distortion constraints is not imposed. Specifically, we assume that certain descriptions are never received simultaneously at the decoder, and thereby the transmitted codewords require joint typicality only within prescribed subsets. We derive a new encoding scheme and an associated rate-distortion region wherein joint typicality of codewords is maintained only within the prescribed subsets. We show that enforcing joint typicality of all the codewords entails strict suboptimality in the achievable rate-distortion region. Specifically, we consider a 3-descriptions scenario wherein descriptions 2 and 3 are never received simultaneously at the decoder and show that a strictly larger achievable region is obtained when the encoder maintains joint typicality of codewords only within the required subsets. To prove these results, we derive a lemma called the 'subset typicality lemma', which plays a critical role in establishing the new achievable region.

Index Terms: Subset typicality lemma, Gaussian multiple descriptions under MSE, L-channel multiple descriptions

I. INTRODUCTION

The multiple descriptions (MD) problem has been studied extensively since the late seventies; see, e.g., [1]-[11]. It was originally designed as a source-channel coding strategy to cope with channel failures, where multiple source descriptions are generated and sent over different paths.
In the L-descriptions setting, the encoder generates L packets which are transmitted over L available channels. It is assumed that the decoder receives a subset of the descriptions perfectly and the remaining ones are lost. The objective is to characterize the set of achievable rate-distortion tuples for the L description rates and the 2^L - 1 distortions, one for each possible combination of received descriptions. One of the first achievable regions for the 2-channel MD problem was derived by El Gamal and Cover in 1982 [1]. It is given by the convex closure over all tuples (R_1, R_2, D_1, D_2, D_12) for which there exist auxiliary random variables X̂_1, X̂_2, jointly distributed with X and taking values over arbitrary alphabets, and functions ψ_K, K ⊆ {1, 2}, such that:

R_i ≥ I(X; X̂_i), i ∈ {1, 2}
R_1 + R_2 ≥ I(X; X̂_1, X̂_2) + I(X̂_1; X̂_2)
D_K ≥ E[d_K(X, ψ_K({X̂}_K))], ∀K ⊆ {1, 2}   (1)

where {X̂}_K denotes the set {X̂_i : i ∈ K}. Ozarow [4] showed that for the 2-descriptions setting, the El Gamal-Cover region is complete for a Gaussian source under the mean squared error (MSE) distortion metric. Specifically, he showed that the complete region can be achieved using a correlated quantization scheme which imposes the following joint distribution for (X̂_1, X̂_2):

X̂_1 = X + W_1,  X̂_2 = X + W_2   (2)

where W_1 and W_2 are zero mean Gaussian random variables, independent of X, with covariance matrix K_{W_1 W_2}, and the functions ψ_K are the respective minimum mean squared error (MMSE) optimal estimators, e.g., ψ_12(X̂_1, X̂_2) = E[X | X̂_1, X̂_2]. The covariance matrix K_{W_1 W_2} is set to satisfy all the distortion constraints. We denote the complete Gaussian-MSE 2-descriptions region by RD^2. It has been shown in the literature that the El Gamal-Cover region is complete for general sources and distortion measures in the no-excess-rate regime [3]. However, Zhang and Berger showed in [2] that this region is not complete in general.

(This work was supported by the NSF under grants CCF-1016861 and CCF-1118075.)
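Ozarow's correlated quantization scheme lends itself to a quick numerical check: given a noise covariance K_{W_1 W_2}, the achieved distortions are Gaussian conditional variances, and the El Gamal-Cover rate bounds follow from the corresponding mutual informations. A minimal sketch (the parameter values are illustrative, not from the paper; `cond_var` is our helper, not an established API):

```python
import numpy as np

def cond_var(Sigma, idx):
    """Var(X | {Xhat_i : i in idx}) for unit-variance X with Cov(X, Xhat_i) = 1."""
    Syy = Sigma[np.ix_(idx, idx)]
    c = np.ones(len(idx))
    return 1.0 - c @ np.linalg.solve(Syy, c)

# Illustrative noise covariance K_W (negatively correlated W_1, W_2):
s1, s2, rho = 0.1, 0.1, -0.5
K_W = np.array([[s1, rho * np.sqrt(s1 * s2)],
                [rho * np.sqrt(s1 * s2), s2]])
Sigma = K_W + 1.0                 # Cov(Xhat_1, Xhat_2); each entry gains Var(X) = 1

# Distortions achieved by the MMSE estimators psi_K = E[X | {Xhat}_K]:
D1 = cond_var(Sigma, [0])         # equals s1 / (1 + s1)
D2 = cond_var(Sigma, [1])
D12 = cond_var(Sigma, [0, 1])

# El Gamal-Cover rate bounds (bits) for this choice of auxiliaries:
R1_min = 0.5 * np.log2(1.0 / D1)  # I(X; Xhat_1) for jointly Gaussian (X, Xhat_1)
R2_min = 0.5 * np.log2(1.0 / D2)
I_X1_X2 = 0.5 * np.log2(Sigma[0, 0] * Sigma[1, 1] / np.linalg.det(Sigma))
Rsum_min = 0.5 * np.log2(1.0 / D12) + I_X1_X2  # I(X; Xhat_1, Xhat_2) + I(Xhat_1; Xhat_2)
```

Making the noises negatively correlated drives the joint distortion D_12 below the value attainable with independent noises, at the price of the extra I(X̂_1; X̂_2) term in the sum rate.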
Specifically, for a binary source and Hamming distortion measure, they showed that sending a common codeword in the two descriptions can lead to a strictly larger achievable region. There have been several publications on extending the Gaussian 2-channel results to the L-channel framework [5], [10], [11], [12]. A natural extension of the correlated quantization coding scheme to the L-channel setting (see, for example, [5]) is described as follows. Let W_1, W_2, ..., W_L be zero mean Gaussian random variables, independent of X, with covariance matrix K_W, and let X̂_i be such that:

X̂_i = X + W_i,  1 ≤ i ≤ L   (3)

(Note that the original characterization of El Gamal and Cover also includes a refinement random variable X̂_0. However, it was shown recently in [7] that X̂_0 can be set to a constant without loss of optimality.)

where L = {1, 2, ..., L}. Let ψ_K({X̂}_K) = E[X | {X̂}_K] for all K ⊆ L, and let the covariance matrix K_W be such that:

Var(X | {X̂}_K) ≤ D_K,  ∀K ⊆ L   (4)

i.e., all the 2^L - 1 distortion constraints are satisfied. Then any rate tuple (R_1, R_2, ..., R_L) satisfying the following conditions is achievable:

∑_{i∈K} R_i ≥ ∑_{i∈K} h(X̂_i) - h({X̂}_K | X),  ∀K ⊆ L   (5)

The convex closure of such achievable rate tuples over all K_W satisfying the distortion constraints is an achievable region for the L-channel Gaussian MD problem and is denoted hereafter by RD^L. The coding scheme involves generating L independent codebooks, one for each X̂_i, 1 ≤ i ≤ L, with 2^{nR_i} codewords each. Upon observing a typical sequence X^n, the encoder looks for one codeword from each codebook such that they are all jointly typical with the observed sequence X^n. If the encoder fails to find such a codeword tuple, it picks a tuple at random. The index of the codeword from codebook i is sent in description i, 1 ≤ i ≤ L. The decoders estimate the source using MMSE estimators based on the subset of the codewords they receive. If the encoding is error-free, all the prescribed distortion constraints are met, as the covariance matrix K_W satisfies (4). It can be shown using standard typicality arguments (see, for example, the corresponding achievability proof in [5]) that the probability of error at the encoder asymptotically approaches zero if the rates satisfy (5). It was shown recently that RD^L is complete when one imposes only individual and central distortion constraints [10], i.e., when only D_1, D_2, ..., D_L and D_{12...L} are imposed. It was further shown in [12] that the above strategy, in conjunction with a random binning mechanism used to further exploit the correlations among the descriptions, is sum-rate optimal under symmetric distortion constraints when distortions at only two levels of receivers are imposed, i.e., when D_K : |K| = k and D_{12...L} are imposed, where |K| denotes the cardinality of the set K.
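For a concrete K_W, all 2^L - 1 sum-rate constraints in (5) can be evaluated in closed form, since h({X̂}_K | X) = h({W}_K) is a Gaussian entropy. A short sketch (illustrative values; `rate_bounds` is a hypothetical helper, not from the paper):

```python
import numpy as np
from itertools import combinations

def rate_bounds(K_W):
    """Evaluate (5): sum_{i in K} R_i >= sum_{i in K} h(Xhat_i) - h({Xhat}_K | X),
    in bits, for Xhat_i = X + W_i with W ~ N(0, K_W) and unit-variance X.
    Since h({Xhat}_K | X) = h({W}_K), each bound reduces to
    (1/2) log2( prod_{i in K} (1 + K_W[i,i]) / det(K_W[K, K]) )."""
    L = K_W.shape[0]
    bounds = {}
    for k in range(1, L + 1):
        for K in combinations(range(L), k):
            idx = list(K)
            num = np.prod(1.0 + np.diag(K_W)[idx])
            den = np.linalg.det(K_W[np.ix_(idx, idx)])
            bounds[K] = 0.5 * np.log2(num / den)
    return bounds

# Illustrative 3-description noise covariance (positive definite):
K_W = np.array([[0.20, -0.05, -0.05],
                [-0.05, 0.20, 0.02],
                [-0.05, 0.02, 0.20]])
for K, b in rate_bounds(K_W).items():
    print(K, round(b, 3))
```

With a diagonal K_W the bound for each subset K collapses to the sum of the individual I(X; X̂_i) terms, which is a convenient sanity check.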
Our objective in this paper is to derive a new coding scheme, and an associated achievable region, for the L-channel Gaussian MD problem that strictly improves upon RD^L for particular cross-sections of the rate-distortion region, where only a particular subset of the 2^L - 1 distortion constraints is imposed. Specifically, let K_1, K_2, ..., K_M be M proper (strict) subsets of L. We focus on cross-sections wherein only the distortions {D}_{2^{K_1}}, {D}_{2^{K_2}}, ..., {D}_{2^{K_M}} are imposed, where 2^K denotes the set of all subsets (power set) of K and {D}_{2^{K_i}} = {D_S : S ⊆ K_i}. It is critical to observe that, under these distortion constraints, joint typicality of codewords within the subsets {X̂}_{K_1}, {X̂}_{K_2}, ..., {X̂}_{K_M} is in fact sufficient to achieve the prescribed distortions, and imposing joint typicality of all the transmitted codewords is an unnecessary restriction. In this paper, we derive a new coding scheme which maintains joint typicality only within the required subsets and show that enforcing joint typicality of all the codewords entails strict suboptimality in the achievable rate-distortion region. In preparation, we derive a critical lemma, called the 'subset typicality lemma', which plays a pivotal role in achieving the new region. We note that these results have potentially broader implications for related problems in multi-terminal information theory, which are beyond the scope of this paper. We also note that the principles underlying the proposed scheme can easily be extended to incorporate refinement-layer random variables, similar to [5], in conjunction with binning modules as indicated in [6], [12], and to include common-layer random variables as shown in [2], [8], [9]. However, in general, the optimal joint distributions of the auxiliary random variables are hard to establish for these schemes, and explicit characterizations of the achievable regions are not easily tractable. The rest of the paper is organized as follows.
In Section II, we state the subset typicality lemma, which plays an important role in the proofs that follow. In Section III, we formally state the problem and derive a new achievable rate-distortion region. In Section IV, we consider a particular cross-section of the 3-descriptions MD setup and show that the new rate-distortion region strictly improves upon RD^3.

II. SUBSET TYPICALITY LEMMA

We introduce the notation before establishing the lemma. n independent and identically distributed (iid) random variables and their realizations are denoted by X^n and x^n, respectively. We denote by T_ε^n(P) the length-n, ε-typical set corresponding to a random variable distributed according to P. Throughout the paper, for any set S, we use the shorthand {U}_S to denote the set {U_i : i ∈ S}. Also, in the following lemma, we use the notation P(A) ≐ 2^{-nR} to denote 2^{-n(R+δ(ε))} ≤ P(A) ≤ 2^{-n(R-δ(ε))} for some δ(ε) → 0 as ε → 0.

Lemma 1 (Subset Typicality Lemma): Let (X_1, X_2, ..., X_N) be N random variables taking values in arbitrary finite alphabets with marginal distributions P_1(X_1), P_2(X_2), ..., P_N(X_N), respectively. Let S_1, S_2, ..., S_M be M subsets of {1, 2, ..., N} and, for all j ∈ {1, 2, ..., M}, let P_{S_j}({X}_{S_j}) be any given joint distribution for {X}_{S_j}, consistent with each other and with the given marginal distributions. Generate sequences x_1^n, x_2^n, ..., x_N^n, each independent of the others, where x_i^n is drawn iid according to the marginal distribution P_i(X_i), i.e., x_i^n ~ ∏_{l=1}^n P_i(x_il). Then,

P( {x}^n_{S_j} ∈ T_ε^n(P_{S_j}), ∀j ∈ {1, ..., M} ) ≐ 2^{-n( ∑_{i=1}^N H(X_i) - H(P*) )}   (6)

where P* is a distribution over (X_1, X_2, ..., X_N) which satisfies:

P* = arg max_{P̃} H(P̃)   (7)

subject to P̃({X}_{S_j}) = P_{S_j}({X}_{S_j}), ∀j ∈ {1, ..., M}. To avoid resolvable but unnecessary complications, we assume that there exists at least one joint distribution consistent with the prescribed per-subset distributions for the subsets S_1, S_2, ..., S_M.
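The maximizing distribution P* in (7) can be computed numerically. For prescribed, consistent marginals on overlapping subsets, iterative proportional fitting started from the uniform distribution converges to the maximum-entropy joint. The sketch below (our illustration, not from the paper) takes N = 3 with S_1 = {1, 2} and S_2 = {2, 3}; here P* is the Markov joint P_12(x_1, x_2) P_23(x_3 | x_2), so H(P*) = H(P_12) + H(P_23) - H(P_2):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a distribution given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def max_entropy_joint(shape, marginals, sweeps=100):
    """Iterative proportional fitting from the uniform distribution; converges
    to the max-entropy joint consistent with the prescribed subset marginals.
    `marginals` is a list of (axes, table) pairs with sorted axes tuples."""
    P = np.full(shape, 1.0 / np.prod(shape))
    for _ in range(sweeps):
        for axes, target in marginals:
            other = tuple(i for i in range(len(shape)) if i not in axes)
            cur = P.sum(axis=other)
            ratio = np.divide(target, cur, out=np.zeros_like(target), where=cur > 0)
            shape_r = [shape[i] if i in axes else 1 for i in range(len(shape))]
            P = P * ratio.reshape(shape_r)
    return P

# Consistent pairwise marginals sharing variable x_2:
P12 = np.array([[0.3, 0.2], [0.1, 0.4]])   # P(x1, x2); x2-marginal (0.4, 0.6)
P23 = np.array([[0.3, 0.1], [0.2, 0.4]])   # P(x2, x3); rows sum to (0.4, 0.6)
P_star = max_entropy_joint((2, 2, 2), [((0, 1), P12), ((1, 2), P23)])
P2 = P12.sum(axis=0)
# Entropy identity for this decomposable case: H(P*) = H(P12) + H(P23) - H(P2)
```

Because the two subsets overlap only in x_2 (a decomposable pattern), the fitting converges after a single sweep; with more general overlapping subsets it converges in the limit.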

Proof: The proof follows directly from Sanov's theorem. We omit the details here and refer to [13] for a detailed analysis.

In [13] we also establish a conditional version of Lemma 1. This conditional version plays an important role in several related network information-theoretic setups; however, in this paper the above lemma is sufficient to establish the new region. We also note that a particular instance of Lemma 1 was derived in [14]. However, as it turns out, for the setup considered there, this lemma does not help in deriving an improved achievable region.

III. PROBLEM SETUP AND A NEW ACHIEVABLE REGION

In this section, we first formally state the problem addressed in this paper and then derive a new achievable rate-distortion region for the L-channel MD setup. We assume the source to be a zero mean, unit variance Gaussian random variable, i.e., X ~ N(0, 1). In the L-channel MD setup, there are L encoding functions, f_l(·), 1 ≤ l ≤ L, which map X^n to the descriptions J_l = f_l(X^n), where J_l takes values in the set {1, ..., B_l}. The rate of description l is defined as R_l = (1/n) log(B_l). Description l is sent over channel l and is either received at the decoder error-free or is completely lost. Let K_1, K_2, ..., K_M be M proper (strict) subsets of L and let S denote the union, over all i ∈ {1, ..., M}, of the set of all subsets of K_i, i.e., S = {K : K ⊆ K_i for some i ∈ {1, 2, ..., M}, K ≠ φ}, where φ denotes the empty set. In this paper, we assume that the decoder always receives a subset K of the transmitted descriptions such that K ∈ S. There is a decoding function for each possible received combination of the descriptions, X̂_K^n = (X̂_K(1), X̂_K(2), ..., X̂_K(n)) = g_K(J_l : l ∈ K), ∀K ∈ S, where X̂_K takes on real values.
The distortion (MSE) at the decoder when a subset K of the descriptions is received is measured as:

D_K = E[ (1/n) ∑_{t=1}^n (X(t) - X̂_K(t))² ]   (8)

We say that a rate-distortion tuple (R_i, D_K : 1 ≤ i ≤ L, K ∈ S) is achievable if there exist L encoding functions with rates (R_1, ..., R_L) and |S| decoding functions yielding the respective distortions D_K. The closure of the set of all achievable rate-distortion tuples is defined as the rate-distortion region for the set S. Note that this region has L + |S| dimensions and is a cross-section of the general rate-distortion region for the L-channel Gaussian MD problem. In the following theorem, we give an achievable region for any given subsets K_1, K_2, ..., K_M.

Theorem 1: Let W_1, W_2, ..., W_L be zero mean Gaussian random variables, independent of X, with covariance matrix K_W, and let X̂_i be defined as:

X̂_i = X + W_i,  ∀i ∈ {1, 2, ..., L}   (9)

Let ψ_K({X̂}_K) = E[X | {X̂}_K] and let the covariance matrix K_W be such that:

Var(X | {X̂}_K) ≤ D_K,  ∀K ∈ S   (10)

Then any rate tuple (R_1, R_2, ..., R_L) satisfying the following conditions is achievable:

∑_{i∈K} R_i ≥ ∑_{i∈K} h(X̂_i) - h*({X̃}_K | X),  ∀K ⊆ L   (11)

where,

h*({X̃}_K | X) = max_{P̃({X̃}_K | X)} h({X̃}_K | X)   (12)

and P̃({X̃}_K | X) is subject to:

P̃({X̃}_{K∩K_j} | X) = P({X̂}_{K∩K_j} | X),  ∀j ∈ {1, ..., M}   (13)

The convex closure over all such K_W satisfying (10) is an achievable rate-distortion region and is denoted by R̃D^L.

Proof: We again omit the details of the proof here, as it follows almost directly from the corresponding theorem in [13].

Remark 1: Observe that for any K ∈ S, the conditions in (11) are the same as the conditions in (5). However, when K ∉ S, the above theorem asserts that all the constraints in (5) can be set to their respective optima simultaneously. This is precisely what provides the strict improvement over RD^L: each of the constraints is typically optimized by a different K_W, and hence it is impossible to achieve simultaneous optimality of all the constraints in RD^L.

Remark 2:
We note that the rate-distortion region can, in general, be enlarged by taking the convex closure of the achievable rate-distortion tuples over all joint densities P({X̂}_L | X) and functions ψ_K that satisfy the distortion constraints. However, it can be shown, using the contra-polymatroidal nature of the achievable region, that for the Gaussian source under the MSE distortion metric, the forms considered in the theorem for P({X̂}_L | X) and ψ_K are sufficient to achieve all the corner points.

IV. STRICT IMPROVEMENT

In the following theorem, we show that R̃D^L is strictly larger than RD^L for any L > 2.

Theorem 2:
(i) RD^L ⊆ R̃D^L, i.e., in general, R̃D^L subsumes RD^L.
(ii) ∀L > 2, there exist scenarios for which

RD^L ⊊ R̃D^L   (14)

i.e., R̃D^L contains points that are strictly outside RD^L, ∀L > 2.

Proof: Part (i) of the theorem follows directly, as the RHS in (11) is smaller than or equal to the RHS in (5) for each hyperplane. The challenging task is to prove part (ii). Also note that, to prove (ii), it is sufficient to show the result for L = 3. We consider the following cross-section of the 3-descriptions rate-distortion

region wherein we impose the distortions D_1, D_2, D_3, D_12 and D_13. Observe that for this cross-section we have K_1 = {1, 2} and K_2 = {1, 3}, and hence it is sufficient to maintain joint typicality of the sequences (x̂_1^n, x̂_2^n, x^n) and (x̂_1^n, x̂_3^n, x^n), respectively. We will show that imposing joint typicality of all the sequences leads to strict suboptimality. We first specialize and restate the regions RD^3 and R̃D^3 for K_1 = {1, 2}, K_2 = {1, 3}. Let W_1, W_2, W_3 be zero mean Gaussian random variables, independent of X, with covariance matrix K_W, and let X̂_i = X + W_i. Let K_W be such that all the prescribed distortion constraints are satisfied. Then all rate tuples satisfying the following conditions are part of RD^3:

R_i ≥ I(X; X̂_i), i ∈ {1, 2, 3}
R_1 + R_2 ≥ I(X; X̂_1, X̂_2) + I(X̂_1; X̂_2)
R_1 + R_3 ≥ I(X; X̂_1, X̂_3) + I(X̂_1; X̂_3)
R_2 + R_3 ≥ h(X̂_2) + h(X̂_3) - h(X̂_2, X̂_3 | X)
R_1 + R_2 + R_3 ≥ h(X̂_1) + h(X̂_2) + h(X̂_3) - h(X̂_1, X̂_2, X̂_3 | X)   (15)

The corresponding constraints for R̃D^3 are given by:

R_i ≥ I(X; X̂_i), i ∈ {1, 2, 3}
R_1 + R_2 ≥ I(X; X̂_1, X̂_2) + I(X̂_1; X̂_2)
R_1 + R_3 ≥ I(X; X̂_1, X̂_3) + I(X̂_1; X̂_3)
R_1 + R_2 + R_3 ≥ h(X̂_1) + h(X̂_2) + h(X̂_3) - h*(X̂_1, X̂_2, X̂_3 | X)
  =(a) I(X̂_1; X) + I(X̂_2; X, X̂_1) + I(X̂_3; X, X̂_1)   (16)

The convex closure of such rates over all K_W satisfying the distortion constraints defines the respective regions RD^3 and R̃D^3. Observe that the condition on R_2 + R_3 is not imposed in (16). This is because the term h(X̂_2, X̂_3 | X) in (15) is replaced by its maximum-entropy counterpart subject to the pairwise distributions of (X, X̂_2) and (X, X̂_3) (refer to (13) in Theorem 1), i.e., h*(X̂_2, X̂_3 | X) = h(X̂_2 | X) + h(X̂_3 | X), and hence the constraint on R_2 + R_3 is redundant. Equality (a) follows from the fact that the maximum-entropy term h*(X̂_1, X̂_2, X̂_3 | X), subject to the given distributions within the subsets (X, X̂_1, X̂_2) and (X, X̂_1, X̂_3), is achieved by a joint density satisfying the Markov chain X̂_2 - (X, X̂_1) - X̂_3, and hence h*(X̂_1, X̂_2, X̂_3 | X) can be written as h(X̂_1 | X) + h(X̂_2 | X, X̂_1) + h(X̂_3 | X, X̂_1).
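The max-entropy step in (a) can be sanity-checked numerically for the Gaussian case: among all noise covariances agreeing with the prescribed (X, X̂_1, X̂_2) and (X, X̂_1, X̂_3) statistics, h(X̂_1, X̂_2, X̂_3 | X) = h(W_1, W_2, W_3) is maximized by the Markov choice ρ_23 = ρ_12 ρ_13, for which the correlation determinant factors as (1 - ρ²_12)(1 - ρ²_13). A small check with illustrative correlation values (unit noise variances assumed for simplicity):

```python
import numpy as np

def h_gauss(K):
    """Differential entropy (bits) of an N(0, K) vector."""
    K = np.atleast_2d(K)
    d = K.shape[0]
    return 0.5 * np.log2((2 * np.pi * np.e) ** d * np.linalg.det(K))

r12, r13 = -0.3, -0.4            # fixed by the (X, Xhat_1, Xhat_j) statistics

def K_of(r23):
    """Noise correlation matrix with the free parameter rho_23."""
    return np.array([[1.0, r12, r13],
                     [r12, 1.0, r23],
                     [r13, r23, 1.0]])

# Sweep the free parameter rho_23 and locate the entropy maximizer:
grid = np.linspace(-0.5, 0.5, 10001)
h_vals = [h_gauss(K_of(r)) for r in grid]
r23_star = grid[int(np.argmax(h_vals))]   # sits at rho_12 * rho_13 = 0.12

# Markov form Xhat_2 - (X, Xhat_1) - Xhat_3: det factors, and
# h* = h(W_1) + h(W_2 | W_1) + h(W_3 | W_1).
K_star = K_of(r12 * r13)
```

The determinant 1 - ρ²_12 - ρ²_13 - ρ²_23 + 2 ρ_12 ρ_13 ρ_23 is concave in ρ_23, so the stationary point ρ_23 = ρ_12 ρ_13 is indeed the maximizer.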
We will show that, in (16), the K_W which achieves the optimum sum rate (and which, of course, satisfies the Markov chain X̂_2 - (X, X̂_1) - X̂_3) does not satisfy the condition X̂_2 - X - X̂_3, and hence leads to a suboptimal bound on R_2 + R_3 in (15). This argument is in fact sufficient to show that R̃D^3 strictly subsumes RD^3. More precisely, we consider one point on the boundary of R̃D^3 and prove that this point does not lie in RD^3. Define P_min = ( (1/2) log(1/D_2), R_min, (1/2) log(1/D_3) ), where R_min is given by:

R_min = inf{ R_1 : R_2 = (1/2) log(1/D_2), R_3 = (1/2) log(1/D_3), (R_1, R_2, R_3) ∈ RD^3(D_1, D_2, D_3, D_12, D_13) }   (17)

Similarly, we define P̃_min = ( (1/2) log(1/D_2), R̃_min, (1/2) log(1/D_3) ), where R̃_min is obtained by replacing RD^3 with R̃D^3 in (17). Observe that strict improvement is proved if we show that R̃_min < R_min. Recall that (1/2) log(1/D) is the rate-distortion function at distortion D for a unit-variance Gaussian source under the MSE distortion measure, and that the unique rate-distortion-optimal reconstruction random variable is jointly Gaussian with the source. Hence, imposing R_2 = (1/2) log(1/D_2) and R_3 = (1/2) log(1/D_3) in (15) enforces the random variables X̂_2 and X̂_3 to be independent given X, with P(X̂_2 | X) and P(X̂_3 | X) the respective RD-optimal channels. This implies that W_2 and W_3 are independent zero mean Gaussian random variables with respective variances D_2/(1 - D_2) and D_3/(1 - D_3). Hence, the minimization for R_min in (17) is over all positive definite K_W of the form:

K_W = [ σ²_W1            ρ_12 σ_W1 σ_W2    ρ_13 σ_W1 σ_W3
        ρ_12 σ_W1 σ_W2   σ²_W2             0
        ρ_13 σ_W1 σ_W3   0                 σ²_W3 ]   (18)

where σ²_W2 = D_2/(1 - D_2) and σ²_W3 = D_3/(1 - D_3), and where ρ_12, ρ_13 are chosen to achieve the other three distortion constraints. However, observe that R̃_min is obtained by optimizing over all positive definite K_W of the form:

K_W = [ σ²_W1            ρ_12 σ_W1 σ_W2    ρ_13 σ_W1 σ_W3
        ρ_12 σ_W1 σ_W2   σ²_W2             ρ_23 σ_W2 σ_W3
        ρ_13 σ_W1 σ_W3   ρ_23 σ_W2 σ_W3    σ²_W3 ]   (19)

where σ²_W2 = D_2/(1 - D_2) and σ²_W3 = D_3/(1 - D_3), and where ρ_12, ρ_13, ρ_23 achieve the remaining distortion constraints. It is this extra degree of freedom that provides the leeway to achieve a strictly lower R̃_min. Hence, all that remains is to show that R̃_min is not achieved by a K_W which has ρ_23 = 0.
Towards this end, we first find R_1 from (16) for a fixed K_W of the form (19):

R_1 ≥ (1/2) log( (1 + σ²_W1) / ( σ²_W1 (1 - ρ²_12 - ρ²_13 - ρ²_23 + 2 ρ_12 ρ_13 ρ_23) ) )   (20)

To find R̃_min, we need to minimize the RHS of (20) over all σ²_W1, ρ_12, ρ_13, ρ_23 subject to the distortion constraints D_1, D_12 and D_13. Observe that ρ_23 does not contribute to any of the distortion constraints, and hence can be optimized by setting the derivative of the RHS in (20) with respect to ρ_23 to 0 for fixed σ²_W1, ρ_12 and ρ_13. We obtain the optimum ρ_23 as:

ρ_23 = ρ_12 ρ_13   (21)

Note that if ρ_23 = ρ_12 ρ_13, then K_W is always positive definite for any ρ_12, ρ_13 with |ρ_12| < 1, |ρ_13| < 1. Also observe that ρ_23 = ρ_12 ρ_13 induces the Markov condition X̂_2 - (X, X̂_1) - X̂_3,

which is necessary for sum-rate optimality in (16). Now, the RHS of (20) can be rewritten as:

R_1 ≥ (1/2) log( (1 + σ²_W1) / ( σ²_W1 (1 - ρ²_12)(1 - ρ²_13) ) )   (22)

and R̃_min is obtained by minimizing (22) over all σ²_W1, ρ_12, ρ_13 subject to the distortion constraints. We first denote σ²_W1/(1 + σ²_W1) by D̃_1. Clearly, we require D̃_1 ≤ D_1 to satisfy the individual distortion constraint for X̂_1. Next, observe that for a fixed D̃_1, R_1 decreases as |ρ_12| and |ρ_13| decrease. Hence, using arguments similar to [4], [14], it follows that ρ_12 and ρ_13 should be set to the respective largest values in (-1, 0] such that the distortions D_12 and D_13 are satisfied with equality. It can easily be shown (see [15]) that, for j ∈ {2, 3}, the required ρ_1j is the root in (-1, 0) of the quadratic equation Var(X | X̂_1, X̂_j) = D_1j whenever D_1j < D̃_1j, and ρ_1j = 0 otherwise, where

D̃_1j = D̃_1 D_j / (D̃_1 + D_j - D̃_1 D_j)   (23)

is the joint distortion achieved with uncorrelated W_1 and W_j. We choose D_1 such that 1/D_1 + 1/D_2 < 1 + 1/D_12 and 1/D_1 + 1/D_3 < 1 + 1/D_13 hold. Then it is sufficient for us to show that the optimum D̃_1 satisfies D̃_1j > D_1j for both j = 2, 3: this makes ρ_12 and ρ_13 strictly less than 0, implying ρ_23 ≠ 0. On substituting the optimum values of ρ_1j (assuming D̃_1j > D_1j holds for both j), we obtain R_1 as an explicit function of D̃_1 alone:

R_1 = (1/2) log( 1 / ( D̃_1 (1 - ρ²_12(D̃_1)) (1 - ρ²_13(D̃_1)) ) )   (24)

where ρ_1j(D̃_1) denotes the root described above. R̃_min is then obtained by minimizing the RHS of (24) with respect to D̃_1, subject to the constraint D̃_1 ≤ D_1. In general, it is hard to derive a closed-form solution for the minimizing D̃_1. Instead, we plot the variation of the RHS of (24) as a function of D̃_1 for a particular choice of (D_1, D_2, D_3, D_12, D_13) satisfying the above conditions; Fig. 1 shows the resulting curve over the admissible range of D̃_1. It can be verified that the minimizing D̃_1 satisfies D̃_1j > D_1j for both j = 2, 3. Therefore, ρ_12 and ρ_13 are strictly less than 0, and consequently ρ_23 ≠ 0, proving the theorem.
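The one-dimensional search over D̃_1 behind Fig. 1 is easy to reproduce. The sketch below uses assumed constraint values (D_1, D_2, D_3, D_12, D_13) = (0.3, 0.05, 0.05, 0.04, 0.04), which satisfy the feasibility conditions of the proof but are not the values of the paper's example; for each candidate D̃_1 it finds, by bisection, the largest ρ_1j ≤ 0 meeting the joint constraint, evaluates (22), and checks that the minimizer forces ρ_12, ρ_13 < 0, hence ρ_23 = ρ_12 ρ_13 ≠ 0:

```python
import numpy as np

def joint_dist(Dt1, Dj, rho):
    """Var(X | Xhat_1, Xhat_j) for unit-variance X, Xhat = X + W."""
    s1, sj = Dt1 / (1 - Dt1), Dj / (1 - Dj)
    c12 = 1 + rho * np.sqrt(s1 * sj)
    Sigma = np.array([[1 + s1, c12], [c12, 1 + sj]])
    c = np.ones(2)
    return 1.0 - c @ np.linalg.solve(Sigma, c)

def rho_needed(Dt1, Dj, D1j, iters=80):
    """Largest rho in (-1, 0] with Var(X | Xhat_1, Xhat_j) <= D1j (bisection)."""
    if joint_dist(Dt1, Dj, 0.0) <= D1j:
        return 0.0
    lo, hi = -0.999, 0.0            # distortion decreases as rho decreases
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if joint_dist(Dt1, Dj, mid) > D1j:
            hi = mid
        else:
            lo = mid
    return lo

# Assumed (illustrative) distortion constraints:
D1, D2, D3, D12, D13 = 0.3, 0.05, 0.05, 0.04, 0.04

def R1_of(Dt1):
    """RHS of (22) with rho_12, rho_13 set per (23)'s equality rule."""
    r12 = rho_needed(Dt1, D2, D12)
    r13 = rho_needed(Dt1, D3, D13)
    return 0.5 * np.log2(1.0 / (Dt1 * (1 - r12**2) * (1 - r13**2)))

grid = np.linspace(0.05, D1, 251)
R1_opt, Dt_opt = min((R1_of(d), d) for d in grid)
r12_opt = rho_needed(Dt_opt, D2, D12)               # strictly negative here
rho23_opt = r12_opt * rho_needed(Dt_opt, D3, D13)   # nonzero, as claimed
```

For these assumed values the minimizer lies at the boundary D̃_1 = D_1, where the required correlations are strictly negative, which already exhibits the key phenomenon: the optimizing K_W has ρ_23 ≠ 0.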
Fig. 1. Variation of R_1 as a function of D̃_1.

V. CONCLUSION

In this paper, we focused on the L-channel Gaussian MD problem where it is known a priori that certain descriptions are never received simultaneously at the decoder. This assumption allows us to design encoding schemes wherein joint typicality of codewords is required only within prescribed subsets. We derived a new achievable rate-distortion region for this setup by establishing a lemma called the 'subset typicality lemma'. We showed, using a 3-descriptions setting, that a strictly larger rate-distortion region is obtained by maintaining joint typicality only within the required subsets, as opposed to a scheme which maintains joint typicality of all the transmitted codewords.

REFERENCES

[1] A. El Gamal and T. M. Cover, "Achievable rates for multiple descriptions," IEEE Trans. Inf. Theory, vol. IT-28, pp. 851-857, Nov. 1982.
[2] Z. Zhang and T. Berger, "New results in binary multiple descriptions," IEEE Trans. Inf. Theory, vol. IT-33, pp. 502-521, July 1987.
[3] R. Ahlswede, "The rate-distortion region for multiple descriptions without excess rate," IEEE Trans. Inf. Theory, vol. 31, pp. 721-726, Nov. 1985.
[4] L. Ozarow, "On a source-coding problem with two channels and three receivers," Bell Syst. Tech. J., vol. 59, no. 10, pp. 1909-1921, Dec. 1980.
[5] R. Venkataramani, G. Kramer, and V. K. Goyal, "Multiple description coding with many channels," IEEE Trans. Inf. Theory, vol. 49, no. 9, pp. 2106-2114, Sept. 2003.
[6] R. Puri, S. S. Pradhan, and K. Ramchandran, "n-channel symmetric multiple descriptions - part II: an achievable rate-distortion region," IEEE Trans. Inf. Theory, vol. 51, pp. 1377-1392, Apr. 2005.
[7] J. Wang, J. Chen, L. Zhao, P. Cuff, and H. Permuter, "On the role of the refinement layer in multiple description coding and scalable coding," IEEE Trans. Inf. Theory, vol. 57, no. 3, pp. 1443-1456, Mar. 2011.
[8] K. Viswanatha, E. Akyol, and K.
Rose, "A strictly larger achievable region for multiple descriptions using combinatorial message sharing," in Proc. IEEE Information Theory Workshop (ITW), Oct. 2011.
[9] E. Akyol, K. Viswanatha, and K. Rose, "On random binning versus conditional codebook methods in multiple descriptions coding," in Proc. IEEE Information Theory Workshop (ITW), Sep. 2012.
[10] J. Chen, "Rate region of Gaussian multiple description coding with individual and central distortion constraints," IEEE Trans. Inf. Theory, vol. 55, no. 9, pp. 3991-4005, Sep. 2009.
[11] S. Mohajer, C. Tian, and S. Diggavi, "Asymmetric multilevel diversity coding and asymmetric Gaussian multiple descriptions," IEEE Trans. Inf. Theory, vol. 56, no. 9, pp. 4367-4387, Sep. 2010.
[12] H. Wang and P. Viswanath, "Vector Gaussian multiple description with two levels of receivers," IEEE Trans. Inf. Theory, vol. 55, no. 1, pp. 401-410, Jan. 2009.
[13] K. Viswanatha, E. Akyol, and K. Rose, "Subset typicality lemmas and improved achievable regions in multiterminal source coding," technical report, available for download at: http://www.scl.ece.ucsb.edu/subset_typicality.pdf
[14] E. Perron, S. Diggavi, and E. Telatar, "On the role of encoder side-information in source coding for multiple decoders," in Proc. IEEE International Symposium on Information Theory (ISIT), pp. 331-335, Jul. 2006.
[15] R. Zamir, "Gaussian codes and Shannon bounds for multiple descriptions," IEEE Trans. Inf. Theory, vol. 45, no. 7, pp. 2629-2636, Nov. 1999.