Multiterminal Source Coding with an Entropy-Based Distortion Measure
Multiterminal Source Coding with an Entropy-Based Distortion Measure
Thomas Courtade and Rick Wesel
Department of Electrical Engineering, University of California, Los Angeles
4 August 2011, IEEE International Symposium on Information Theory, Saint Petersburg, Russia
Courtade and Wesel (UCLA), MTSC: Entropy-Based Distortion, ISIT 2011
Motivation: the lossless one-helper problem

(Diagram: source X and helper Y are encoded separately; a decoder outputs the reconstruction X̂.)

Question: What is the achievable rate region for a lossless one-helper network with a single source? The answer to this question appears to be out of reach for now.
Motivation: the lossless one-helper problem

(Diagram: X is encoded at rate R_x and Y at rate R_y; the decoder outputs X̂.)

Theorem (Ahlswede-Körner-Wyner 1975)
R = { (R_x, R_y) : R_x >= H(X|U), R_y >= I(Y; U) } for some distribution p(x, y, u) = p(x, y) p(u|y), where |U| <= |Y| + 2.
Motivation: the lossless one-helper problem

(Diagram: X at rate R_x and Y at rate R_y, with an additional link of rate R_0; the decoder outputs X̂.)

Theorem (Kaspi-Berger 1982)
R = { (R_0, R_x, R_y) : there exists p(x, y, v, u) = p(x, y) p(u|x) p(v|y, u) such that R_0 >= I(X; U|Y), R_x >= H(X|V, U), R_x + R_y >= H(X) + I(Y; V|U, X) }.
Motivation: the lossless one-helper problem

(Diagram: X at rate R_x, Y at rate R_y, and an additional encoder with links of rates R_0 and R_1; the decoder outputs X̂.)

The achievable rate region appears to be unknown. The encoder without a source is problematic; intuitively, it should send some lossy estimate of X.
Multiterminal source coding with an entropy-based distortion measure

(Diagram: X at rate R_x and Y at rate R_y; the decoder outputs Ẑ.)

We study two cases of the MTSC problem:
1. Joint distortion constraint: E[d(X, Y, Ẑ)] <= D.
2. Distortion constraint only on X: E[d(X, Ẑ)] <= D.

We consider a particular choice of Ẑ and d(·) (Case 1):
Ẑ = R_+^{X x Y} (i.e., the set of functions from X x Y to R_+), and d(x, y, ẑ) = log(1 / ẑ(x, y)).
For Case 2, the corresponding choice is:
Ẑ = R_+^{X} (i.e., the set of functions from X to R_+), and d(x, ẑ) = log(1 / ẑ(x)).
Examples: an entropy-based distortion measure

To make things interesting, we assume that the sum of ẑ(x, y) over (x, y) in X x Y is at most M.

Without loss of generality, take the sum of ẑ(x, y) over X x Y equal to 1, so that d(x, y, ẑ) = D( 1_{(x', y') = (x, y)} || ẑ(x', y') ), the relative entropy between the point mass at (x, y) and ẑ.

We obtain the erasure distortion measure if we restrict ẑ to functions of the form ẑ(x', y') = 1 if (x', y') = (x, y) and 0 otherwise, together with the constant function ẑ(x, y) = 1/|X x Y| for all x, y.

If (R_x, R_y) = (0, 0), setting ẑ(x, y) = p(x, y) results in distortion H(X, Y).
If (R_x, R_y) = (0, H(Y)), setting ẑ(x, y) = p(x|y) results in distortion H(X|Y).
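The two zero-rate examples above can be checked numerically. The sketch below is illustrative only: it assumes a hypothetical 2x2 joint pmf and verifies that guessing ẑ = p(x, y) incurs expected log-loss H(X, Y), while ẑ = p(x|y) (a decoder that knows Y) incurs H(X|Y).

```python
import math

# Hypothetical 2x2 joint pmf p(x, y); purely illustrative.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def expected_distortion(zhat):
    """E[d(X, Y, Zhat)] with d(x, y, zhat) = log2(1 / zhat(x, y))."""
    return sum(pxy * math.log2(1.0 / zhat[xy]) for xy, pxy in p.items())

# Joint entropy H(X, Y) and conditional entropy H(X|Y).
H_XY = sum(-pxy * math.log2(pxy) for pxy in p.values())
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}
H_X_given_Y = sum(pxy * math.log2(py[xy[1]] / pxy) for xy, pxy in p.items())

# (R_x, R_y) = (0, 0): zhat = p gives distortion exactly H(X, Y), since
# E[log 1/zhat] = H(X, Y) + D(p || zhat) is minimized at zhat = p.
assert abs(expected_distortion(p) - H_XY) < 1e-12

# (R_x, R_y) = (0, H(Y)): the decoder knows Y, and zhat = p(x|y) gives H(X|Y).
zcond = {xy: p[xy] / py[xy[1]] for xy in p}
assert abs(expected_distortion(zcond) - H_X_given_Y) < 1e-12
```

The cross-entropy identity in the first comment is the same decomposition used on the slide: the gap between the incurred distortion and the entropy is exactly a relative entropy term.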
Examples: rate distortion with side information

Theorem (Wyner-Ziv 1976)
Let (X, Y) be drawn i.i.d. and let d(x, ẑ) be given. The rate distortion function with side information is
R_Y(D) = min over p(w|x) and f of I(X; W|Y),
where the minimization is over all functions f: Y x W -> Ẑ and conditional distributions p(w|x) such that E[d(X, f(Y, W))] <= D.

Corollary
For our choice of Ẑ and d(·), the rate distortion function with side information is R_Y(D) = H(X|Y) - D.
Examples: rate distortion with side information

Proof of Corollary: For any admissible pair (p(w|x), f),
D >= E[d(X, f(Y, W))] = E[log(1 / f(Y, W)[X])] = E[D( p(x|Y, W) || f(Y, W)[x] )] + H(X|Y, W) >= H(X|Y, W).
Therefore
R_Y(D) = min over p(w|x) and f of I(X; W|Y) = min { H(X|Y) - H(X|Y, W) } >= H(X|Y) - D.
Taking f(y, w)[x] = p(x|y, w) and
W = X with probability 1 - D/H(X|Y), and W = an erasure symbol with probability D/H(X|Y),
achieves equality throughout, so R_Y(D) = H(X|Y) - D.
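As a sanity check on the corollary, the sketch below evaluates the erasure auxiliary W from the proof. It assumes, purely for illustration, a doubly symmetric binary pair (X uniform, Y = X with crossover probability q) and confirms that erasing with probability D/H(X|Y) yields rate H(X|Y) - D at distortion exactly D.

```python
import math

# Illustrative source: X ~ Bern(1/2), Y = X flipped with probability q.
q = 0.2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

H_X_given_Y = h2(q)                 # H(X|Y) for this source
D = 0.3                             # target distortion, D <= H(X|Y)

# The proof's auxiliary: W = X w.p. 1 - alpha, W = erasure w.p. alpha.
alpha = D / H_X_given_Y
rate = (1 - alpha) * H_X_given_Y    # I(X; W|Y) = (1 - alpha) H(X|Y)
distortion = alpha * H_X_given_Y    # log-loss is incurred only on erasures

assert abs(rate - (H_X_given_Y - D)) < 1e-12
assert abs(distortion - D) < 1e-12
```

The arithmetic here mirrors the "one bit of distortion saves one bit of rate" intuition on the next slide.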
Examples: rate distortion with side information

(Diagram: X is encoded at rate H(X|Y) - D; the decoder, with side information Y, outputs Ẑ with E[d(X, Ẑ)] <= D.)

Intuition: every bit of distortion we tolerate saves one bit of rate. What if Y is rate limited? Does a similar theme prevail?
d-lossy Coding of Correlated Sources: joint distortion criterion

(Diagram: X at rate R_x and Y at rate R_y; the decoder outputs Ẑ with E[d(X, Y, Ẑ)] <= D.)

Theorem
R = { (R_x, R_y, D) : R_x >= H(X|Y) - D, R_y >= H(Y|X) - D, R_x + R_y >= H(X, Y) - D }.
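The theorem says the region is the Slepian-Wolf region pushed out by D bits in every direction. A minimal membership test, under an assumed 2x2 joint pmf (all numbers illustrative), makes the shift visible:

```python
import math

# Assumed joint pmf (illustration only).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}

def H(d):
    """Entropy in bits of a pmf given as a dict."""
    return sum(-v * math.log2(v) for v in d.values() if v > 0)

H_XY = H(p)
H_X_given_Y = H_XY - H(py)
H_Y_given_X = H_XY - H(px)

def in_region(Rx, Ry, D, tol=1e-9):
    """Membership in the theorem's rate-distortion region."""
    return (Rx >= H_X_given_Y - D - tol and
            Ry >= H_Y_given_X - D - tol and
            Rx + Ry >= H_XY - D - tol)

# The Slepian-Wolf corner (H(X|Y), H(Y)) sits on the D = 0 frontier;
# tolerating D bits of distortion shifts that frontier inward by D.
assert in_region(H_X_given_Y, H(py), D=0.0)
assert not in_region(H_X_given_Y - 0.1, H(py), D=0.0)
assert in_region(H_X_given_Y - 0.1, H(py), D=0.1)
```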
d-lossy Coding of Correlated Sources: joint distortion criterion

We obtain a D-bit enlargement of the achievable rate region in all directions. The proof relies on the WZ corollary and lemmas of the form:

Lemma (Distortion Preimage)
Define A(ẑ^n) = { (x^n, y^n) : d(x^n, y^n, ẑ^n) <= D + ε }. Then |A(ẑ^n)| <= 2^{n(D + 2ε)}.

Intuitively, if we have a reconstruction ẑ^n, then nD additional bits of information about (x^n, y^n) are required to determine (x^n, y^n) completely. This hints at a relationship with the Slepian-Wolf region.
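The counting idea behind the lemma is visible already at blocklength n = 1: if ẑ sums to at most one, every outcome with log-loss at most D carries mass at least 2^{-D}, so there can be at most 2^D of them. A small sketch with an illustrative ẑ:

```python
import math

def preimage_size(zhat, D):
    """Number of outcomes whose log-loss log2(1/zhat) is at most D."""
    return sum(1 for v in zhat.values() if v > 0 and math.log2(1.0 / v) <= D)

# Illustrative reconstruction zhat, a sub-probability assignment (sums to 1).
zhat = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# Each outcome counted at level D has mass >= 2**(-D), so the count is <= 2**D.
for D in (0.0, 1.0, 2.0, 3.0):
    assert preimage_size(zhat, D) <= 2 ** D
```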
d-lossy Coding of Correlated Sources: proof sketch for the joint distortion criterion

The basic idea is to show a correspondence with the Slepian-Wolf region, as follows.

Claim 1: If (R_x, R_y, D) is an achievable RD point, then (R_x + θD, R_y + (1-θ)D) is an achievable Slepian-Wolf rate pair for some θ in [0, 1]. Proved using the distortion preimage lemmas combined with a decomposition of the distortion measure and a random binning argument.

Claim 2: If (R_x + θD, R_y + (1-θ)D) is an achievable Slepian-Wolf rate pair for some θ in [0, 1], then (R_x, R_y, D) is an achievable RD point. Proved via the WZ corollary and timesharing.

Combining the two claims with the known form of the Slepian-Wolf region gives the expression for the achievable rate distortion region.
Decomposing the Entropy Measure d(·): proof sketch for the joint distortion criterion

Given a rate distortion code, ẑ can be thought of as a probability mass function on X x Y, so we can decompose it uniquely as ẑ(x, y) = ẑ(x) ẑ(y|x).

This allows the definition of the marginal and conditional distortions D_x and D_{y|x}:
D >= E[d(X^n, Y^n, Ẑ^n)] = E[d_x(X^n, Ẑ^n)] + E[d_{y|x}(X^n, Y^n, Ẑ^n)],
where the first term is D_x and the second is D_{y|x}.

This prompts the following lemma:

Lemma (Marginal and Conditional Distortion Preimages)
Define A_x(ẑ^n) = { x^n : d_x(x^n, ẑ^n) <= D_x + ε } and A_{y|x}(x^n, ẑ^n) = { y^n : d_{y|x}(x^n, y^n, ẑ^n) <= D_{y|x} + ε }. Then |A_x(ẑ^n)| <= 2^{n(D_x + 2ε)} and |A_{y|x}(x^n, ẑ^n)| <= 2^{n(D_{y|x} + 2ε)}.
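The decomposition is simply additivity of log-loss: log(1/ẑ(x, y)) = log(1/ẑ(x)) + log(1/ẑ(y|x)), so the expected distortion splits exactly as D = D_x + D_{y|x}. A quick numeric check, with a hypothetical true pmf p and a hypothetical reconstruction ẑ (both illustrative):

```python
import math

# Hypothetical true joint pmf and reconstruction pmf on X x Y.
p    = {(0, 0): 0.4,  (0, 1): 0.1,  (1, 0): 0.1, (1, 1): 0.4}
zhat = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.3, (1, 1): 0.2}

# Unique decomposition zhat(x, y) = zhat(x) * zhat(y|x).
zx  = {x: zhat[(x, 0)] + zhat[(x, 1)] for x in (0, 1)}
zyx = {xy: zhat[xy] / zx[xy[0]] for xy in zhat}

D    = sum(v * math.log2(1.0 / zhat[xy])   for xy, v in p.items())  # joint
D_x  = sum(v * math.log2(1.0 / zx[xy[0]])  for xy, v in p.items())  # marginal
D_yx = sum(v * math.log2(1.0 / zyx[xy])    for xy, v in p.items())  # conditional

# log(1/zhat(x,y)) = log(1/zhat(x)) + log(1/zhat(y|x)) termwise, so D = D_x + D_{y|x}.
assert abs(D - (D_x + D_yx)) < 1e-9
```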
Proving Claim 1: joint distortion criterion

Claim 1: If (R_x, R_y, D) is an achievable RD point, then (R_x + θD, R_y + (1-θ)D) is an achievable Slepian-Wolf rate pair for some θ in [0, 1].

Suppose we have a sequence of (2^{nR_x}, 2^{nR_y}, n) rate-distortion codes achieving average distortion D. With high probability, E[d_x(X^n, Ẑ^n)] <= D_x + ε and E[d_{y|x}(X^n, Y^n, Ẑ^n)] <= D_{y|x} + ε. Also, D_x + D_{y|x} <= D + ε.

Bin the x^n's into 2^{n(D_x + 3ε)} bins and send the bin index along with the rate-distortion codeword. This requires rate R_x + D_x + 3ε and allows the decoder to recover X^n w.h.p. by the marginal distortion preimage lemma.

Bin the y^n's into 2^{n(D_{y|x} + 3ε)} bins and send the bin index along with the rate-distortion codeword. This requires rate R_y + D_{y|x} + 3ε and allows the decoder to recover Y^n w.h.p. by the conditional distortion preimage lemma.
Proving Claim 2: joint distortion criterion

Claim 2: If (R_x + θD, R_y + (1-θ)D) is an achievable Slepian-Wolf rate pair for some θ in [0, 1], then (R_x, R_y, D) is an achievable RD point.

Suppose (R̃_x, R̃_y) is an achievable Slepian-Wolf rate pair. We can assume (R̃_x, R̃_y) = (1-θ)(H(X), H(Y|X)) + θ(H(X|Y), H(Y)). By the WZ corollary, (R_x, R_y, D) = (R̃_x - θD, R̃_y - (1-θ)D, D) is an achievable RD point. Making the substitution (R̃_x, R̃_y) = (R_x + θD, R_y + (1-θ)D) completes the proof.
d-lossy Coding with Coded Side Information: distortion constraint on X only

(Diagram: X at rate R_x and Y at rate R_y; the decoder outputs Ẑ with E[d(X, Ẑ)] <= D.)

In this case, the achievable rate region is enlarged by D bits in the R_x-direction only. The idea is that Y is already at D_max, so any increase in D doesn't necessarily help decrease R_y.

The proof is similar in spirit to the previous case. In particular, we show a correspondence with the Ahlswede-Körner-Wyner region.

Theorem
R = { (R_x, R_y, D) : R_x >= H(X|U) - D, R_y >= I(Y; U) } for some distribution p(x, y, u) = p(x, y) p(u|y), where |U| <= |Y| + 2.
An Application: gambling with rate-limited side information

Suppose we bet on n i.i.d. horse races with outcomes X^n and correlated side information Y^n. Our insider friend is only able to provide rate-limited side information f(Y^n) at rate R_y. Starting with 1 ruble, in the i-th race we bet a fraction ẑ_i(x) of our wealth on horse x.

The expected doubling rate is
E[(1/n) Σ_{i=1}^{n} log(ẑ_i(X_i) o(X_i))] = E[log o(X)] - (1/n) E[d(X^n, Ẑ^n)].

Theorem (Erkip 1996)
The optimum doubling rate is W* = E[log o(X)] - D*, where
D* = inf over p(x, y, u) = p(u|y) p(x, y) of { D : H(X|U) <= D, I(Y; U) <= R_y }.
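In the no-side-information case (R_y = 0), the constraint I(Y; U) <= 0 forces U independent of Y, so D* = H(X) and proportional betting ẑ(x) = p(x) attains W = E[log o(X)] - H(X). A Monte Carlo sketch with hypothetical win probabilities and odds (all numbers illustrative):

```python
import math
import random

random.seed(0)

# Hypothetical 3-horse race: win probabilities p and odds o (for-one payoffs).
p = [0.5, 0.3, 0.2]
o = [2.0, 4.0, 4.0]

H_X = sum(-pi * math.log2(pi) for pi in p)
E_log_o = sum(pi * math.log2(oi) for pi, oi in zip(p, o))

# Bet a fraction p[x] of wealth on horse x each race (proportional betting).
n = 200_000
wealth_log = 0.0
for _ in range(n):
    x = random.choices(range(3), weights=p)[0]
    wealth_log += math.log2(p[x] * o[x])   # wealth multiplies by p[x] * o[x]
W_empirical = wealth_log / n

# Empirical doubling rate concentrates around E[log o(X)] - H(X).
assert abs(W_empirical - (E_log_o - H_X)) < 0.02
```

The same simulation with a rate-R_y message about Y^n would raise the doubling rate by at most the distortion reduction H(X) - D* promised by the theorem.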
An Application: gambling with rate-limited side information and an oracle

Now suppose, in addition, an oracle provides side information g(X^n) at rate R_x; the expected doubling rate is as before.

Theorem (one oracle bit adds more than one bit to the doubling rate)
The optimum doubling rate is W* = E[log o(X)] - D*, where
D* = inf over p(x, y, u) = p(u|y) p(x, y) of { D : H(X|U) - R_x <= D, I(Y; U) <= R_y }.
Concluding Remarks

Two lossy source coding problems which are open in general can be solved for our choice of d(·):
1. MTSC with a joint distortion constraint.
2. MTSC with a single distortion constraint: D_x = D, D_y = D_max.

It is possible that more cases can be solved.

The entropy-based distortion measure d(·) has shown up in the literature before (cf. image compression and the information bottleneck problem), which may point to some potential applications of the results presented today.

Due to the relationship between d(·) and relative entropy, there may be additional applications in estimation, portfolio, and gambling theory.

The results are also algorithmically satisfying from an engineering perspective.
Thank You!
More informationBinary Rate Distortion With Side Information: The Asymmetric Correlation Channel Case
Binary Rate Distortion With Side Information: The Asymmetric Correlation Channel Case Andrei Sechelea*, Samuel Cheng, Adrian Munteanu*, and Nikos Deligiannis* *Department of Electronics and Informatics,
More informationUniversal Incremental Slepian-Wolf Coding
Proceedings of the 43rd annual Allerton Conference, Monticello, IL, September 2004 Universal Incremental Slepian-Wolf Coding Stark C. Draper University of California, Berkeley Berkeley, CA, 94720 USA sdraper@eecs.berkeley.edu
More informationReliable Computation over Multiple-Access Channels
Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,
More informationApproximately achieving the feedback interference channel capacity with point-to-point codes
Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used
More informationSecret Key and Private Key Constructions for Simple Multiterminal Source Models
Secret Key and Private Key Constructions for Simple Multiterminal Source Models arxiv:cs/05050v [csit] 3 Nov 005 Chunxuan Ye Department of Electrical and Computer Engineering and Institute for Systems
More informationarxiv: v1 [cs.it] 19 Aug 2008
Distributed Source Coding using Abelian Group Codes arxiv:0808.2659v1 [cs.it] 19 Aug 2008 Dinesh Krithivasan and S. Sandeep Pradhan, Department of Electrical Engineering and Computer Science, University
More informationThe Distributed Karhunen-Loève Transform
1 The Distributed Karhunen-Loève Transform Michael Gastpar, Member, IEEE, Pier Luigi Dragotti, Member, IEEE, and Martin Vetterli, Fellow, IEEE Manuscript received November 2004; revised March 5, 2006;
More informationPerformance of Polar Codes for Channel and Source Coding
Performance of Polar Codes for Channel and Source Coding Nadine Hussami AUB, Lebanon, Email: njh03@aub.edu.lb Satish Babu Korada and üdiger Urbanke EPFL, Switzerland, Email: {satish.korada,ruediger.urbanke}@epfl.ch
More informationUniversality of Logarithmic Loss in Lossy Compression
Universality of Logarithmic Loss in Lossy Compression Albert No, Member, IEEE, and Tsachy Weissman, Fellow, IEEE arxiv:709.0034v [cs.it] Sep 207 Abstract We establish two strong senses of universality
More informationCoding for Computing. ASPITRG, Drexel University. Jie Ren 2012/11/14
Coding for Computing ASPITRG, Drexel University Jie Ren 2012/11/14 Outline Background and definitions Main Results Examples and Analysis Proofs Background and definitions Problem Setup Graph Entropy Conditional
More informationSolutions to Homework Set #1 Sanov s Theorem, Rate distortion
st Semester 00/ Solutions to Homework Set # Sanov s Theorem, Rate distortion. Sanov s theorem: Prove the simple version of Sanov s theorem for the binary random variables, i.e., let X,X,...,X n be a sequence
More informationECE Information theory Final
ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the
More informationInformation Theory. Coding and Information Theory. Information Theory Textbooks. Entropy
Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is
More informationHypothesis Testing with Communication Constraints
Hypothesis Testing with Communication Constraints Dinesh Krithivasan EECS 750 April 17, 2006 Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 1 / 21 Presentation Outline
More informationDistributed Lossy Interactive Function Computation
Distributed Lossy Interactive Function Computation Solmaz Torabi & John MacLaren Walsh Dept. of Electrical and Computer Engineering Drexel University Philadelphia, PA 19104 Email: solmaz.t@drexel.edu &
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 5177 The Distributed Karhunen Loève Transform Michael Gastpar, Member, IEEE, Pier Luigi Dragotti, Member, IEEE, and Martin Vetterli,
More informationInformation Theory and Networks
Information Theory and Networks Lecture 17: Gambling with Side Information Paul Tune http://www.maths.adelaide.edu.au/matthew.roughan/ Lecture_notes/InformationTheory/ School
More informationCommon Randomness Principles of Secrecy
Common Randomness Principles of Secrecy Himanshu Tyagi Department of Electrical and Computer Engineering and Institute of Systems Research 1 Correlated Data, Distributed in Space and Time Sensor Networks
More informationDistributed Lossless Compression. Distributed lossless compression system
Lecture #3 Distributed Lossless Compression (Reading: NIT 10.1 10.5, 4.4) Distributed lossless source coding Lossless source coding via random binning Time Sharing Achievability proof of the Slepian Wolf
More informationEECS 750. Hypothesis Testing with Communication Constraints
EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.
More informationOn the Duality between Multiple-Access Codes and Computation Codes
On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch
More informationPaul Cuff, Han-I Su, and Abbas EI Gamal Department of Electrical Engineering Stanford University {cuff, hanisu,
Cascade Multiterminal Source Coding Paul Cuff, Han-I Su, and Abbas EI Gamal Department of Electrical Engineering Stanford University E-mail: {cuff, hanisu, abbas}@stanford.edu Abstract-We investigate distributed
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationInterpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing
Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing arxiv:092.4872v [cs.it] 24 Dec 2009 Haim H. Permuter, Young-Han Kim, and Tsachy Weissman Abstract We
More informationAdvanced Topics in Information Theory
Advanced Topics in Information Theory Lecture Notes Stefan M. Moser c Copyright Stefan M. Moser Signal and Information Processing Lab ETH Zürich Zurich, Switzerland Institute of Communications Engineering
More informationHomework Set #2 Data Compression, Huffman code and AEP
Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code
More informationSecret Key Generation and Secure Computing
Secret Key Generation and Secure Computing Himanshu Tyagi Department of Electrical and Computer Engineering and Institute of System Research University of Maryland, College Park, USA. Joint work with Prakash
More informationInteractive Decoding of a Broadcast Message
In Proc. Allerton Conf. Commun., Contr., Computing, (Illinois), Oct. 2003 Interactive Decoding of a Broadcast Message Stark C. Draper Brendan J. Frey Frank R. Kschischang University of Toronto Toronto,
More informationDistributed Source Coding Using LDPC Codes
Distributed Source Coding Using LDPC Codes Telecommunications Laboratory Alex Balatsoukas-Stimming Technical University of Crete May 29, 2010 Telecommunications Laboratory (TUC) Distributed Source Coding
More information4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information
4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk
More informationInformation Leakage of Correlated Source Coded Sequences over a Channel with an Eavesdropper
Information Leakage of Correlated Source Coded Sequences over a Channel with an Eavesdropper Reevana Balmahoon and Ling Cheng School of Electrical and Information Engineering University of the Witwatersrand
More informationExtended Subspace Error Localization for Rate-Adaptive Distributed Source Coding
Introduction Error Correction Extended Subspace Simulation Extended Subspace Error Localization for Rate-Adaptive Distributed Source Coding Mojtaba Vaezi and Fabrice Labeau McGill University International
More informationPerformance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)
Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song
More informationOn Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE
3284 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 7, JULY 2009 On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, Michelle Effros, Fellow, IEEE Abstract This paper considers
More informationHorse Racing Redux. Background
and Networks Lecture 17: Gambling with Paul Tune http://www.maths.adelaide.edu.au/matthew.roughan/ Lecture_notes/InformationTheory/ Part I Gambling with School of Mathematical
More informationLinear Coding Schemes for the Distributed Computation of Subspaces
Linear Coding Schemes for the Distributed Computation of Subspaces V. Lalitha, N. Prakash, K. Vinodh, P. Vijay Kumar and S. Sandeep Pradhan Abstract arxiv:1302.5021v1 [cs.it] 20 Feb 2013 Let X 1,...,X
More informationThe Shannon s basic inequalities refer to the following fundamental properties of entropy function:
COMMUNICATIONS IN INFORMATION AND SYSTEMS c 2003 International Press Vol. 3, No. 1, pp. 47-60, June 2003 004 ON A NEW NON-SHANNON TYPE INFORMATION INEQUALITY ZHEN ZHANG Abstract. Recently, K. Makarychev,
More information5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006
5218 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 12, DECEMBER 2006 Source Coding With Limited-Look-Ahead Side Information at the Decoder Tsachy Weissman, Member, IEEE, Abbas El Gamal, Fellow,
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 62, NO. 5, MAY
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 6, NO. 5, MAY 06 85 Duality of a Source Coding Problem and the Semi-Deterministic Broadcast Channel With Rate-Limited Cooperation Ziv Goldfeld, Student Member,
More informationLecture 20: Quantization and Rate-Distortion
Lecture 20: Quantization and Rate-Distortion Quantization Introduction to rate-distortion theorem Dr. Yao Xie, ECE587, Information Theory, Duke University Approimating continuous signals... Dr. Yao Xie,
More informationA New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality
0 IEEE Information Theory Workshop A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California
More informationAmobile satellite communication system, like Motorola s
I TRANSACTIONS ON INFORMATION THORY, VOL. 45, NO. 4, MAY 1999 1111 Distributed Source Coding for Satellite Communications Raymond W. Yeung, Senior Member, I, Zhen Zhang, Senior Member, I Abstract Inspired
More informationLecture 11: Continuous-valued signals and differential entropy
Lecture 11: Continuous-valued signals and differential entropy Biology 429 Carl Bergstrom September 20, 2008 Sources: Parts of today s lecture follow Chapter 8 from Cover and Thomas (2007). Some components
More informationSecret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi Himanshu Tyagi Shun Watanabe
Secret Key Agreement: General Capacity and Second-Order Asymptotics Masahito Hayashi Himanshu Tyagi Shun Watanabe Two party secret key agreement Maurer 93, Ahlswede-Csiszár 93 X F Y K x K y ArandomvariableK
More informationOn Network Functional Compression
On Network Functional Compression Soheil Feizi, Student Member, IEEE, Muriel Médard, Fellow, IEEE arxiv:0.5496v2 [cs.it] 30 Nov 200 Abstract In this paper, we consider different aspects of the problem
More informationSHARED INFORMATION. Prakash Narayan with. Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe
SHARED INFORMATION Prakash Narayan with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe 2/40 Acknowledgement Praneeth Boda Himanshu Tyagi Shun Watanabe 3/40 Outline Two-terminal model: Mutual
More informationCOMM901 Source Coding and Compression. Quiz 1
German University in Cairo - GUC Faculty of Information Engineering & Technology - IET Department of Communication Engineering Winter Semester 2013/2014 Students Name: Students ID: COMM901 Source Coding
More informationLecture 3. Mathematical methods in communication I. REMINDER. A. Convex Set. A set R is a convex set iff, x 1,x 2 R, θ, 0 θ 1, θx 1 + θx 2 R, (1)
3- Mathematical methods in communication Lecture 3 Lecturer: Haim Permuter Scribe: Yuval Carmel, Dima Khaykin, Ziv Goldfeld I. REMINDER A. Convex Set A set R is a convex set iff, x,x 2 R, θ, θ, θx + θx
More information2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media,
2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising
More informationSource Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime
Source Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime Solmaz Torabi Dept. of Electrical and Computer Engineering Drexel University st669@drexel.edu Advisor:
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationAN INTRODUCTION TO SECRECY CAPACITY. 1. Overview
AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits
More informationOn Computing a Function of Correlated Sources
1 On Computing a Function of Correlated Sources Milad Sefidgaran Aslan Tchamkerten arxiv:1107.5806v3 [cs.it] 11 Oct 2012 Abstract A receiver wants to compute a function f of two correlated sources X Y
More informationCh. 8 Math Preliminaries for Lossy Coding. 8.5 Rate-Distortion Theory
Ch. 8 Math Preliminaries for Lossy Coding 8.5 Rate-Distortion Theory 1 Introduction Theory provide insight into the trade between Rate & Distortion This theory is needed to answer: What do typical R-D
More informationThe Communication Complexity of Correlation. Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan
The Communication Complexity of Correlation Prahladh Harsha Rahul Jain David McAllester Jaikumar Radhakrishnan Transmitting Correlated Variables (X, Y) pair of correlated random variables Transmitting
More informationNon-binary Distributed Arithmetic Coding
Non-binary Distributed Arithmetic Coding by Ziyang Wang Thesis submitted to the Faculty of Graduate and Postdoctoral Studies In partial fulfillment of the requirements For the Masc degree in Electrical
More informationMultimedia Communications. Mathematical Preliminaries for Lossless Compression
Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when
More informationCOMPLEMENTARY graph entropy was introduced
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 6, JUNE 2009 2537 On Complementary Graph Entropy Ertem Tuncel, Member, IEEE, Jayanth Nayak, Prashant Koulgi, Student Member, IEEE, Kenneth Rose, Fellow,
More informationNetwork Distributed Quantization
Network Distributed uantization David Rebollo-Monedero and Bernd Girod Information Systems Laboratory, Department of Electrical Engineering Stanford University, Stanford, CA 94305 {drebollo,bgirod}@stanford.edu
More informationSparse Regression Codes for Multi-terminal Source and Channel Coding
Sparse Regression Codes for Multi-terminal Source and Channel Coding Ramji Venkataramanan Yale University Sekhar Tatikonda Allerton 2012 1 / 20 Compression with Side-Information X Encoder Rate R Decoder
More informationSHARED INFORMATION. Prakash Narayan with. Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe
SHARED INFORMATION Prakash Narayan with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe 2/41 Outline Two-terminal model: Mutual information Operational meaning in: Channel coding: channel
More informationLECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem
LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser
More informationarxiv: v1 [cs.it] 5 Feb 2016
An Achievable Rate-Distortion Region for Multiple Descriptions Source Coding Based on Coset Codes Farhad Shirani and S. Sandeep Pradhan Dept. of Electrical Engineering and Computer Science Univ. of Michigan,
More informationLecture 2: August 31
0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy
More informationthe Information Bottleneck
the Information Bottleneck Daniel Moyer December 10, 2017 Imaging Genetics Center/Information Science Institute University of Southern California Sorry, no Neuroimaging! (at least not presented) 0 Instead,
More informationInformation Theoretic Limits of Randomness Generation
Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication
More information