Coding into a source: an inverse rate-distortion theorem
Coding into a source: an inverse rate-distortion theorem
Anant Sahai, joint work with Mukul Agarwal and Sanjoy K. Mitter
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, University of California at Berkeley
LIDS, Department of Electrical Engineering and Computer Sciences, Massachusetts Institute of Technology
2006 Allerton Conference on Communication, Control, and Computing
Anant Sahai (UC Berkeley) Inverse Rate Distortion Sep 27, / 27
Suppose the aliens landed... Your mission: reverse-engineer their communications technology. What assumptions should you make? They are already here!
Does information theory determine architecture? Channel coding is an optimal interface. Is it the unique interface?
Outline
1. Motivation and introduction
2. The equivalence perspective: communication problems; separation as equivalence
3. Main result: a direct converse for rate-distortion (basic: finite-alphabet sources; extension: real alphabet with difference distortion; conditional rate-distortion: steganography with a public cover-story)
4. Application: understanding information within unstable processes
5. Conclusions and open problems
Abstract model of communication problems
[Block diagram: a specified source $S_t$ emits $X_t$ into a designed encoder $E_t$; encoder and decoder share common randomness $R$, with feedback and memory as permitted by information pattern $I$; the encoder drives a noisy channel $f_c$, whose output $Y_t$ reaches the designed decoder $D_t$ after a one-step delay; the decoder emits $Z_t$.]
- Problem: source $S$, information pattern $I$, and objective $V$.
- Constrained resource: noisy channel $f_c$.
- Designed solution: encoder $E$, decoder $D$.
Focus: what channels are good enough
- A channel $f_c$ solves the problem if there exist $E, D$ such that the system satisfies $V$.
- Problem $A$ is harder than problem $B$ if any $f_c$ that solves $A$ also solves $B$.
Information theory is an asymptotic theory:
- Pick a family $V$ with an appropriate slack parameter.
- Consider the set of channels that solve the problem.
- Take the union over slack-parameter choices.
Shannon bit-transfer problems $A_{R,\epsilon,d}$
- Source: noninteractive $X_i$ ($R$ bits each): fair coin tosses.
- Information pattern: decoder $D_i$ has access to $Z_1^i$; encoder $E_i$ gets access to $X_1^i$.
- Performance objective: $V(\epsilon, d)$ is satisfied if $P(X_i \neq U_{i+d}) \leq \epsilon$ for every $i \geq 0$.
- Slack parameter: the permitted delay $d$.
- Natural orderings: larger $\epsilon, d$ make the problem easier, but larger $R$ makes it harder.
Classical capacity:
$\mathcal{C}_R = \bigcap_{\epsilon > 0} \bigcap_{R' < R} \bigcup_{d > 0} \{f_c : f_c \text{ solves } A_{R',\epsilon,d}\}$
$C_{\text{Shannon}}(f_c) = \sup\{R > 0 : f_c \in \mathcal{C}_R\}$
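The quantity $C_{\text{Shannon}}(f_c)$ has a familiar closed form for standard memoryless channels. As a hedged numeric illustration (not from the slides; the function names are mine), the binary symmetric channel with crossover probability $\epsilon$ has $C = 1 - h_2(\epsilon)$:

```python
import math

def h2(p):
    """Binary entropy in bits, with h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Shannon capacity of a binary symmetric channel: the sup over
    rates R for which the bit-transfer problems are solvable is
    C = 1 - h2(eps)."""
    return 1.0 - h2(eps)

print(bsc_capacity(0.11))  # just above 0.5 bits per channel use
```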
Estimation to a distortion constraint: $A_{(F_X,\rho,D,\epsilon,d)}$
- Source: noninteractive $X_i$ drawn iid from $F_X$.
- Same information pattern.
- Performance objective: $V(\rho, D, \epsilon, d)$ is satisfied if, for all $\tau$, $\lim_{n \to \infty} P\big(\frac{1}{n}\sum_{i=\tau}^{\tau+n-1} \rho(X_i, U_{i+d}) > D\big) \leq \epsilon$.
- Slack parameter: the permitted delay $d$.
- Natural orderings: larger $D, d, \epsilon$ make the problem easier.
Channels that are good enough:
$\mathcal{C}_{e,(F_X,\rho,D)} = \bigcap_{\epsilon > 0} \bigcap_{D' > D} \bigcup_{d > 0} \{f_c : f_c \text{ solves } A_{(F_X,\rho,D',\epsilon,d)}\}$
Separation theorem, when $\rho$ is finite:
$\mathcal{C}_{R(D)} \cap \mathcal{C}_m = \mathcal{C}_{e,(F_X,\rho,D)} \cap \mathcal{C}_m$
Classical separation revisited
Source codes are equivalent to Shannon bit transfer:
- Equivalence proved using mutual-information characterizations of $R(D)$ and $C_{\text{Shannon}}$.
- Bidirectional reductions at the problem level.
Main result
Suppose we have a family of black-box systems, indexed by $\epsilon$, that can communicate streams from input alphabet $\mathcal{X}$ satisfying, for all $\tau$:
$\lim_{n \to \infty} P\big(\frac{1}{n}\sum_{i=\tau}^{\tau+n-1} \rho(X_i, \hat{X}_i) > D\big) \leq \epsilon$
- The distortion measure $\rho$ is non-negative and additive.
- The probability measure $P$ is for $\{X_i\}$ iid according to $P(x)$.
Then, assuming access to common randomness at both the encoder and the decoder, it is possible to reliably communicate bits at all rates $R < R(D)$ bits per source symbol over the black-box system, with probability of bit error as small as desired.
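The threshold $R(D)$ in the result is the ordinary rate-distortion function of the source. As a hedged sketch of a concrete value (my example, not from the talk), a Bernoulli($p$) source under Hamming distortion has $R(D) = h_2(p) - h_2(D)$ for $D < \min(p, 1-p)$:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rd_bernoulli(p, D):
    """R(D) for a Bernoulli(p) source under Hamming distortion:
    h2(p) - h2(D) for D < min(p, 1-p), and 0 beyond that."""
    if D >= min(p, 1.0 - p):
        return 0.0
    return h2(p) - h2(D)

# A black box guaranteeing Hamming distortion 0.11 on iid fair-coin
# inputs can, by the theorem, carry any bit rate below this:
print(rd_bernoulli(0.5, 0.11))  # just above 0.5 bits per source symbol
```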
Relationships to existing perspectives
Coding theory:
- Faith in distance rather than a probabilistic model for the channel.
- Generalizes Hamming distance to arbitrary additive distortion measures.
Arbitrarily varying channels:
- The attacker is not limited to a set of attack channels.
- The attacker is not forced to be causal.
- The attacker is constrained only on long blocks, not individual letters.
- The attacker only promises low distortion for inputs that look iid $P(x)$.
Steganography/watermarking:
- No cover-text; in the conditional case, a cover-story instead.
- Otherwise, Merhav and Somekh-Baruch is closest to this work.
Finite alphabet case
Random encoder:
- Randomly draw $2^{nR}$ length-$n$ codewords iid according to $P_X$, using common randomness.
- Transmit the one corresponding to the message.
Nearest-typical-neighbor decoder:
- Define the typical codeword set $\mathcal{C}_R$ as the codewords with type $P_X \pm \epsilon$ for small $\epsilon$.
- Let $y_1^n$ be the received string.
- Decode to the $\hat{X}_1^n \in \mathcal{C}_R$ closest to $y_1^n$ in a $\rho$-distortion sense.
Error events:
- Atypical codeword $X_1^n$: probability tends to 0.
- Excess average distortion ($> D$) on the true codeword: probability tends to 0 by assumption.
- There exists a false codeword with average distortion $< D$.
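The construction can be simulated end to end for a toy binary instance. Everything numeric here is an assumption of mine (block length, rate, and a "black box" that simply flips a $D$-fraction of positions); since the source is uniform Bernoulli(1/2), essentially every codeword is typical, so plain minimum-distortion decoding stands in for the nearest-typical-neighbor rule:

```python
import random

random.seed(0)
n, M, D = 32, 256, 0.11   # block length, 2^{nR} messages (R = 0.25), distortion level

# Random encoder: draw M length-n codewords iid Bernoulli(1/2)
# using shared common randomness.
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(M)]

def black_box(x):
    """One adversary meeting the distortion guarantee: flip a
    fraction D of the positions (any such attacker is allowed)."""
    y = x[:]
    for i in random.sample(range(n), int(D * n)):
        y[i] ^= 1
    return y

def decode(y):
    """Minimum-distortion decoding: return the index of the codeword
    closest to y under Hamming (rho) distortion."""
    return min(range(M), key=lambda m: sum(a != b for a, b in zip(codebook[m], y)))

msg = 5
print(decode(black_box(codebook[msg])) == msg)
```

With $R < R(D)$, the false-codeword collision event is rare, so the decoder recovers the message with high probability.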
Probability of a false codeword being close
Suppose $y_1^n$ has type $q_Y$. Consider a false codeword $z_1^n \in \mathcal{C}_R$, and let $q_{XY}$ be the resulting joint type. Then:
- $q_X \in p_X \pm \epsilon$
- $E_{q_{XY}}[\rho(X, Y)] \leq D$ if there is an error.
Sort $y_1^n$ and correspondingly shuffle the codeword $z_1^n$. [Diagram: $y_1^n$ splits into blocks of sizes $n q_Y(1), \ldots, n q_Y(|\mathcal{Y}|)$; blowing up the block for letter $j$, the codeword letters within it split into sub-blocks of sizes $n q_Y(j) q_{X|Y}(i|j)$ for $i = 1, \ldots, |\mathcal{X}|$.]
$P(Z_1^n \text{ collides}) \doteq \prod_j 2^{-n q_Y(j) D(q_{X|Y=j} \| p_X)} = 2^{-n D(q_{XY} \| p_X q_Y)}$
Bounding the probability of collision
Union bound over $2^{nR}$ codewords and at most $(n+1)^{|\mathcal{X}||\mathcal{Y}|}$ joint types $q_{XY}$.
Minimize the divergence $D(q_{XY} \| p_X q_Y)$ over the set of joint types $q_{XY}$ satisfying:
- $E_{q_{XY}}[\rho(X, Y)] \leq D$
- $q_X \in p_X \pm \epsilon$
Key step:
$D(q_{XY} \| p_X q_Y) = D(q_X \| p_X) + D(q_{XY} \| q_X q_Y) \geq D(q_{XY} \| q_X q_Y) = I_q(X; Y)$
Minimizing $I_q(X; Y)$ gives $R(D)$ as $\epsilon \to 0$.
If $R < R(D)$, the collision probability tends to 0 as $n \to \infty$.
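The inner optimization (minimize $I_q(X;Y)$ subject to a distortion constraint) is exactly the rate-distortion problem, and it can be evaluated numerically. A hedged sketch using the Blahut-Arimoto iteration (my implementation and parameter choices, not the authors'); for the symmetric binary case it should land on $R(D) = 1 - h_2(D)$:

```python
import math

def blahut_arimoto(p, rho, beta, iters=200):
    """One point (D, R) on the rate-distortion curve of a discrete
    source p under distortion matrix rho, at Lagrange slope beta."""
    ny = len(rho[0])
    q = [1.0 / ny] * ny                                  # output marginal q(y)
    for _ in range(iters):
        cond = []                                        # q(y|x) ~ q(y) exp(-beta rho(x,y))
        for x in range(len(p)):
            w = [q[y] * math.exp(-beta * rho[x][y]) for y in range(ny)]
            z = sum(w)
            cond.append([wi / z for wi in w])
        q = [sum(p[x] * cond[x][y] for x in range(len(p))) for y in range(ny)]
    D = sum(p[x] * cond[x][y] * rho[x][y]
            for x in range(len(p)) for y in range(ny))
    R = sum(p[x] * cond[x][y] * math.log2(cond[x][y] / q[y])
            for x in range(len(p)) for y in range(ny) if cond[x][y] > 0)
    return D, R

# Symmetric binary source, Hamming distortion, slope chosen to hit D = 0.11.
p = [0.5, 0.5]
rho = [[0.0, 1.0], [1.0, 0.0]]
D, R = blahut_arimoto(p, rho, beta=math.log(0.89 / 0.11))
print(round(D, 3), round(R, 3))  # → 0.11 0.5
```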
Extending to real vector alphabets
Compact support and difference distortion:
- Same codebook construction.
- Pick a fine enough quantization.
- Reduces to the finite-alphabet case at the decoder, with a small factor increase in distortion.
Unbounded support: new codebook construction.
- View $F_X(x) = (1 - \delta) F_{X'}(x) + \delta F_{X''}(x)$, where $X'$ has compact support.
- First flip $n$ iid unfair coins with $P(\text{head}) = \delta$; mark positions as dirty or clean.
- Draw codewords from $X'$ in clean positions and from $X''$ in dirty ones.
Modified decoding rule:
- Declare an error if there are fewer than $(1 - 2\delta)n$ clean positions.
- Restrict the codebook to clean positions only for decoding purposes.
- This increases the distortion by a factor of $\frac{1 + 2\delta}{1 - 2\delta}$.
$\lim_{\Delta \to 0} \lim_{\delta \to 0} R_{\Delta,\delta}\big(D \cdot \tfrac{1 + 2\delta}{1 - 2\delta}\big) = R(D)$
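The clean/dirty draw can be made concrete. A hedged sketch under assumptions of mine: the source is a standard Gaussian, and the truncation point $T = 1.96$ is chosen so that the tail mass is $\delta \approx 0.05$, making $F_X = (1-\delta)F_{X'} + \delta F_{X''}$ an (approximately) exact split into a compact-support part $X'$ and a tail part $X''$:

```python
import random

random.seed(1)
n, delta, T = 1000, 0.05, 1.96   # block length, tail weight, truncation point

def sample_truncated():          # draw from X': a Gaussian restricted to |x| <= T
    while True:
        x = random.gauss(0.0, 1.0)
        if abs(x) <= T:
            return x

def sample_tail():               # draw from X'': a Gaussian restricted to |x| > T
    while True:
        x = random.gauss(0.0, 1.0)
        if abs(x) > T:
            return x

def draw_codeword():
    """One codeword: each position is independently 'dirty' with
    probability delta (drawn from X'') or 'clean' (drawn from X')."""
    word, clean = [], []
    for _ in range(n):
        dirty = random.random() < delta
        clean.append(not dirty)
        word.append(sample_tail() if dirty else sample_truncated())
    return word, clean

word, clean = draw_codeword()
# Modified decoding rule from the slides: declare an error if fewer
# than (1 - 2*delta)*n positions are clean; otherwise decode using
# the clean positions only.
print(sum(clean) >= (1 - 2 * delta) * n)
```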
Conditional rate-distortion
Assume a cover-story $V_1^n$, drawn according to $P(V)$, is known to all parties: encoder, decoder, and attacker. All $R < R_{X|V}(D)$ are achievable:
- The encoder draws codewords conditionally using $P(X|V)$.
- Codeword typicality is defined similarly (including $V_1^n$).
- Nearest-conditionally-typical codeword decoding.
- The proof parallels the unconditional case. [Diagram: sort first by cover-story letter into blocks of sizes $n q_V(s)$, then within each block by channel output into sub-blocks of sizes $n q_V(s) q_{Y|V}(j|s)$, and finally by codeword letter into sub-sub-blocks of sizes $n q_V(s) q_{Y|V}(j|s) q_{X|VY}(i|s,j)$.]
Unstable Markov processes: R(D)
$X_{t+1} = A X_t + W_t$ where $A > 1$; squared-error distortion.
[Plot: rate (in bits) versus distortion, comparing the "sequential" rate-distortion curve (which obeys causality), the rate-distortion curve (non-causal), and the stable counterpart (non-causal).]
Unstable Markov processes: two kinds of information
Accumulation: look at the subsampled process $\{X_{kn}\}$.
- Can embed $R_1 < n \log_2 A$ bits per sample, i.e., $\log_2 A$ bits per source symbol.
- These bits are recovered with anytime reliability if the black box has finite error moments.
Dissipation: look at the within-block behavior $\{X_{k(n-1)+1}^{kn-1} \mid X_{kn}\}$.
- Can be transformed to look iid, and so falls under our results.
- Can embed $R_2 < R(D) - \log_2 A$ bits per symbol.
The two-tiered nature of the information flow is proved by direct reduction.
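The accumulation tier can be illustrated in a noiseless toy (my construction, not from the talk): with $A = 2$, one bit per step is carried by the sign of the disturbance $W_t$, and since the leading term of $X_n = \sum_i A^{n-1-i} W_i$ strictly dominates the remaining tail, the bits decode most-significant-first, realizing $\log_2 A = 1$ bit per symbol:

```python
A = 2.0   # unstable pole; all arithmetic below is exact in binary floating point

def run_system(bits):
    """X_{t+1} = A*X_t + W_t with X_0 = 0 and W_t = +/-0.25 encoding each bit."""
    x = 0.0
    for b in bits:
        x = A * x + (0.25 if b else -0.25)
    return x

def decode_bits(x, n):
    """Peel bits most-significant-first: the term 0.25*A^(n-1-i)
    strictly dominates the remaining tail, so sign(x) reveals bit i."""
    bits = []
    for i in range(n):
        lead = 0.25 * A ** (n - 1 - i)
        b = 1 if x >= 0 else 0
        bits.append(b)
        x -= lead if b else -lead
    return bits

msg = [1, 0, 1, 1, 0, 0, 1, 0]
print(decode_bits(run_system(msg), len(msg)) == msg)  # → True
```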
Conclusions and open problems
Embedding codes, source codes, and Shannon bit transfer are equivalent: traditional point-to-point source-channel separation is a consequence of a problem-level equivalence that can be proved using direct reductions in both directions.
- The results already extend to stationary-ergodic processes that mix.
- Can the gap between $R_{\text{seq}}(D)$ and $R(D)$ be used to carry information?
- Apply the equivalence perspective to better understand multiterminal problems.
- Is there another reason to suspect that channel coding is fundamental to communication?
Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic
More information(Classical) Information Theory III: Noisy channel coding
(Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way
More informationLecture 7 September 24
EECS 11: Coding for Digital Communication and Beyond Fall 013 Lecture 7 September 4 Lecturer: Anant Sahai Scribe: Ankush Gupta 7.1 Overview This lecture introduces affine and linear codes. Orthogonal signalling
More informationAN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN
AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN A Thesis Presented to The Academic Faculty by Bryan Larish In Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy
More informationStabilization over discrete memoryless and wideband channels using nearly memoryless observations
Stabilization over discrete memoryless and wideband channels using nearly memoryless observations Anant Sahai Abstract We study stabilization of a discrete-time scalar unstable plant over a noisy communication
More informationHomework Set #2 Data Compression, Huffman code and AEP
Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code
More informationDigital Communications III (ECE 154C) Introduction to Coding and Information Theory
Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I
More informationELEC546 Review of Information Theory
ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random
More informationThe Duality Between Information Embedding and Source Coding With Side Information and Some Applications
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1159 The Duality Between Information Embedding and Source Coding With Side Information and Some Applications Richard J. Barron, Member,
More informationCS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability
More informationOn Source-Channel Communication in Networks
On Source-Channel Communication in Networks Michael Gastpar Department of EECS University of California, Berkeley gastpar@eecs.berkeley.edu DIMACS: March 17, 2003. Outline 1. Source-Channel Communication
More informationOn Multiple User Channels with State Information at the Transmitters
On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu
More informationEE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.
EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on
More informationA Simple Encoding And Decoding Strategy For Stabilization Over Discrete Memoryless Channels
A Simple Encoding And Decoding Strategy For Stabilization Over Discrete Memoryless Channels Anant Sahai and Hari Palaiyanur Dept. of Electrical Engineering and Computer Sciences University of California,
More informationLecture 18: Shanon s Channel Coding Theorem. Lecture 18: Shanon s Channel Coding Theorem
Channel Definition (Channel) A channel is defined by Λ = (X, Y, Π), where X is the set of input alphabets, Y is the set of output alphabets and Π is the transition probability of obtaining a symbol y Y
More informationComputing and Communications 2. Information Theory -Entropy
1896 1920 1987 2006 Computing and Communications 2. Information Theory -Entropy Ying Cui Department of Electronic Engineering Shanghai Jiao Tong University, China 2017, Autumn 1 Outline Entropy Joint entropy
More informationShannon s A Mathematical Theory of Communication
Shannon s A Mathematical Theory of Communication Emre Telatar EPFL Kanpur October 19, 2016 First published in two parts in the July and October 1948 issues of BSTJ. First published in two parts in the
More informationClassical Information Theory Notes from the lectures by prof Suhov Trieste - june 2006
Classical Information Theory Notes from the lectures by prof Suhov Trieste - june 2006 Fabio Grazioso... July 3, 2006 1 2 Contents 1 Lecture 1, Entropy 4 1.1 Random variable...............................
More informationAn instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1
Kraft s inequality An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if N 2 l i 1 Proof: Suppose that we have a tree code. Let l max = max{l 1,...,
More informationCapacity of a channel Shannon s second theorem. Information Theory 1/33
Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,
More informationEquivalence for Networks with Adversarial State
Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department
More informationSolutions to Homework Set #1 Sanov s Theorem, Rate distortion
st Semester 00/ Solutions to Homework Set # Sanov s Theorem, Rate distortion. Sanov s theorem: Prove the simple version of Sanov s theorem for the binary random variables, i.e., let X,X,...,X n be a sequence
More informationEE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions
EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where
More informationMultiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets
Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,
More information18.2 Continuous Alphabet (discrete-time, memoryless) Channel
0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not
More informationAppendix B Information theory from first principles
Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes
More informationShannon s Noisy-Channel Coding Theorem
Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 2015 Abstract In information theory, Shannon s Noisy-Channel Coding Theorem states that it is possible to communicate over a noisy
More information4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information
4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk
More informationLossy Distributed Source Coding
Lossy Distributed Source Coding John MacLaren Walsh, Ph.D. Multiterminal Information Theory, Spring Quarter, 202 Lossy Distributed Source Coding Problem X X 2 S {,...,2 R } S 2 {,...,2 R2 } Ẑ Ẑ 2 E d(z,n,
More informationStochastic Stability and Ergodicity of Linear Gaussian Systems Controlled over Discrete Channels
Stochastic Stability and Ergodicity of Linear Gaussian Systems Controlled over Discrete Channels Serdar Yüksel Abstract We present necessary and sufficient conditions for stochastic stabilizability of
More informationCOMM901 Source Coding and Compression. Quiz 1
German University in Cairo - GUC Faculty of Information Engineering & Technology - IET Department of Communication Engineering Winter Semester 2013/2014 Students Name: Students ID: COMM901 Source Coding
More informationNon-binary Distributed Arithmetic Coding
Non-binary Distributed Arithmetic Coding by Ziyang Wang Thesis submitted to the Faculty of Graduate and Postdoctoral Studies In partial fulfillment of the requirements For the Masc degree in Electrical
More informationLecture 8: Shannon s Noise Models
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationSolutions to Set #2 Data Compression, Huffman code and AEP
Solutions to Set #2 Data Compression, Huffman code and AEP. Huffman coding. Consider the random variable ( ) x x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0. 0.04 0.04 0.03 0.02 (a) Find a binary Huffman code
More informationAnytime Capacity of the AWGN+Erasure Channel with Feedback. Qing Xu. B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000
Anytime Capacity of the AWGN+Erasure Channel with Feedback by Qing Xu B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000 A dissertation submitted in partial satisfaction of
More information3F1 Information Theory, Lecture 3
3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2011, 28 November 2011 Memoryless Sources Arithmetic Coding Sources with Memory 2 / 19 Summary of last lecture Prefix-free
More informationEE5585 Data Compression May 2, Lecture 27
EE5585 Data Compression May 2, 2013 Lecture 27 Instructor: Arya Mazumdar Scribe: Fangying Zhang Distributed Data Compression/Source Coding In the previous class we used a H-W table as a simple example,
More information1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H.
Problem sheet Ex. Verify that the function H(p,..., p n ) = k p k log p k satisfies all 8 axioms on H. Ex. (Not to be handed in). looking at the notes). List as many of the 8 axioms as you can, (without
More informationNetwork coding for multicast relation to compression and generalization of Slepian-Wolf
Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues
More informationCSCI 2570 Introduction to Nanocomputing
CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication
More informationTowards a Theory of Information Flow in the Finitary Process Soup
Towards a Theory of in the Finitary Process Department of Computer Science and Complexity Sciences Center University of California at Davis June 1, 2010 Goals Analyze model of evolutionary self-organization
More informationTowards control over fading channels
Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti
More informationMultimedia Communications. Mathematical Preliminaries for Lossless Compression
Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when
More informationSOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003
SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003 SLEPIAN-WOLF RESULT { X i} RATE R x ENCODER 1 DECODER X i V i {, } { V i} ENCODER 0 RATE R v Problem: Determine R, the
More informationAn Achievable Error Exponent for the Mismatched Multiple-Access Channel
An Achievable Error Exponent for the Mismatched Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@camacuk Albert Guillén i Fàbregas ICREA & Universitat Pompeu Fabra University of
More informationThe Poisson Channel with Side Information
The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH
More informationThe necessity and sufficiency of anytime capacity for control over a noisy communication link: Parts I and II
The necessity and sufficiency of anytime capacity for control over a noisy communication link: Parts I and II Anant Sahai, Sanjoy Mitter sahai@eecs.berkeley.edu, mitter@mit.edu Abstract We review how Shannon
More information10-704: Information Processing and Learning Fall Lecture 9: Sept 28
10-704: Information Processing and Learning Fall 2016 Lecturer: Siheng Chen Lecture 9: Sept 28 Note: These notes are based on scribed notes from Spring15 offering of this course. LaTeX template courtesy
More informationLECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem
LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser
More informationEE 4TM4: Digital Communications II. Channel Capacity
EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.
More informationClassification & Information Theory Lecture #8
Classification & Information Theory Lecture #8 Introduction to Natural Language Processing CMPSCI 585, Fall 2007 University of Massachusetts Amherst Andrew McCallum Today s Main Points Automatically categorizing
More informationLecture 11: Quantum Information III - Source Coding
CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that
More informationThe source coding game with a cheating switcher
The source coding game with a cheating switcher Hari Palaiyanur, Student Member, Cheng Chang, Anant Sahai, Member Abstract Motivated by the lossy compression of an active-vision video stream, we consider
More information3F1 Information Theory, Lecture 3
3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2013, 29 November 2013 Memoryless Sources Arithmetic Coding Sources with Memory Markov Example 2 / 21 Encoding the output
More informationExercises with solutions (Set B)
Exercises with solutions (Set B) 3. A fair coin is tossed an infinite number of times. Let Y n be a random variable, with n Z, that describes the outcome of the n-th coin toss. If the outcome of the n-th
More informationCoupling AMS Short Course
Coupling AMS Short Course January 2010 Distance If µ and ν are two probability distributions on a set Ω, then the total variation distance between µ and ν is Example. Let Ω = {0, 1}, and set Then d TV
More informationCharacterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Nonstationary/Unstable Linear Systems
6332 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 58, NO 10, OCTOBER 2012 Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Nonstationary/Unstable Linear
More informationJoint Write-Once-Memory and Error-Control Codes
1 Joint Write-Once-Memory and Error-Control Codes Xudong Ma Pattern Technology Lab LLC, U.S.A. Email: xma@ieee.org arxiv:1411.4617v1 [cs.it] 17 ov 2014 Abstract Write-Once-Memory (WOM) is a model for many
More informationEE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018
Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code
More informationSource Coding with Lists and Rényi Entropy or The Honey-Do Problem
Source Coding with Lists and Rényi Entropy or The Honey-Do Problem Amos Lapidoth ETH Zurich October 8, 2013 Joint work with Christoph Bunte. A Task from your Spouse Using a fixed number of bits, your spouse
More informationSuperposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels
Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:
More informationNational University of Singapore Department of Electrical & Computer Engineering. Examination for
National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:
More informationInformation Theory. Week 4 Compressing streams. Iain Murray,
Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 4 Compressing streams Iain Murray, 2014 School of Informatics, University of Edinburgh Jensen s inequality For convex functions: E[f(x)]
More informationNoisy channel communication
Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University
More informationITCT Lecture IV.3: Markov Processes and Sources with Memory
ITCT Lecture IV.3: Markov Processes and Sources with Memory 4. Markov Processes Thus far, we have been occupied with memoryless sources and channels. We must now turn our attention to sources with memory.
More informationRECENT advances in technology have led to increased activity
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 49, NO 9, SEPTEMBER 2004 1549 Stochastic Linear Control Over a Communication Channel Sekhar Tatikonda, Member, IEEE, Anant Sahai, Member, IEEE, and Sanjoy Mitter,
More informationA Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels
A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,
More informationArimoto Channel Coding Converse and Rényi Divergence
Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code
More informationExercise 1. = P(y a 1)P(a 1 )
Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a
More informationOn Capacity Under Received-Signal Constraints
On Capacity Under Received-Signal Constraints Michael Gastpar Dept. of EECS, University of California, Berkeley, CA 9470-770 gastpar@berkeley.edu Abstract In a world where different systems have to share
More informationFeedback Capacity of a Class of Symmetric Finite-State Markov Channels
Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada
More informationCoding of memoryless sources 1/35
Coding of memoryless sources 1/35 Outline 1. Morse coding ; 2. Definitions : encoding, encoding efficiency ; 3. fixed length codes, encoding integers ; 4. prefix condition ; 5. Kraft and Mac Millan theorems
More informationAn introduction to basic information theory. Hampus Wessman
An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on
More informationPerformance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)
Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song
More informationLecture 14 February 28
EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables
More informationIntroduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information.
L65 Dept. of Linguistics, Indiana University Fall 205 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission rate
More informationRepresentation of Correlated Sources into Graphs for Transmission over Broadcast Channels
Representation of Correlated s into Graphs for Transmission over Broadcast s Suhan Choi Department of Electrical Eng. and Computer Science University of Michigan, Ann Arbor, MI 80, USA Email: suhanc@eecs.umich.edu
More informationDept. of Linguistics, Indiana University Fall 2015
L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 28 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission
More informationMismatched Multi-letter Successive Decoding for the Multiple-Access Channel
Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org
More informationApplications of on-line prediction. in telecommunication problems
Applications of on-line prediction in telecommunication problems Gábor Lugosi Pompeu Fabra University, Barcelona based on joint work with András György and Tamás Linder 1 Outline On-line prediction; Some
More information4 An Introduction to Channel Coding and Decoding over BSC
4 An Introduction to Channel Coding and Decoding over BSC 4.1. Recall that channel coding introduces, in a controlled manner, some redundancy in the (binary information sequence that can be used at the
More information