Delay, feedback, and the price of ignorance


Anant Sahai
based in part on joint work with students: Tunc Simsek and Cheng Chang
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, University of California at Berkeley
Major support from NSF ITR
EPFL Summer Research Institute: July 8th, 2006

Shannon tells us

Architectural implication: separate source and channel coding.
Delay is the most basic price of reliability.

"[The duality between source and channel coding] can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past and cannot control it; we may control the future but have no knowledge of it." (Claude Shannon, 1959)

What did he mean?

Review of block coding

Long block codes are the traditional information-theory approach.
Source coding: X_1^n -> B_1^{Rn} -> X̂_1^n
Channel coding: B_1^{Rn} -> X_1^n -> Y_1^n -> B̂_1^{Rn}

No real sense of time, except the trivial interpretation:
source coding: randomness is before encoding;
channel coding: randomness is after encoding.

Block error exponents: P_e ≈ exp(-n E(R)).

Source coding:
E_b(R) = min_{Q: H(Q) ≥ R} D(Q ‖ P) = sup_{ρ ≥ 0} [ρR - E_0(ρ)], where E_0(ρ) = (1+ρ) ln[Σ_x P(x)^{1/(1+ρ)}].

Channel-coding sphere-packing bound:
E_sp(R) = max_q min_{G: I(q,G) ≤ R} D(G ‖ P | q) = sup_{ρ ≥ 0} [E_0(ρ) - ρR], where E_0(ρ) = max_q -ln Σ_z [Σ_y q_y p(z|y)^{1/(1+ρ)}]^{1+ρ}.
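
As a numerical companion (not part of the slides), the following Python sketch evaluates these block exponents for a generic DMC; the BSC(0.1) at rate 0.2 nats per use is an illustrative assumption.

```python
import numpy as np

def gallager_E0(rho, q, P):
    # Gallager's E_0(rho) in nats for input distribution q and
    # channel matrix P, where P[y, z] = p(z | y).
    inner = (q[:, None] * P ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log((inner ** (1.0 + rho)).sum())

def sphere_packing(R, q, P, rhos=np.linspace(1e-3, 50, 5000)):
    # E_sp(R) = sup_{rho >= 0} [E_0(rho) - rho*R] for a fixed input q.
    return max(gallager_E0(r, q, P) - r * R for r in rhos)

def random_coding(R, q, P, rhos=np.linspace(1e-3, 1.0, 1000)):
    # E_r(R) = max_{rho in [0,1]} [E_0(rho) - rho*R].
    return max(gallager_E0(r, q, P) - r * R for r in rhos)

# Illustrative example: BSC(0.1), uniform input, R = 0.2 nats/use.
P = np.array([[0.9, 0.1], [0.1, 0.9]])
q = np.array([0.5, 0.5])
print(sphere_packing(0.2, q, P), random_coding(0.2, q, P))
```

For symmetric channels the uniform input already maximizes both expressions, which is why q is held fixed here.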

Outline

1. Motivation and introduction
2. Fixed-delay channel coding: without feedback; the BEC example; the focusing bound; approaching the focusing bound with feedback
3. The source-coding analog: without side-information; with side-information
4. Conclusions

What about fixed delay?

[Figure: a timeline of message bits B_1, B_2, ... arriving steadily at the encoder, channel inputs/outputs Y_1, Z_1, Y_2, Z_2, ..., and estimates B̂_1, B̂_2, ..., each due a fixed delay d = 7 after its bit arrives.]

Consider hard deadlines today. (Soft deadlines allow erasures.)
Can achieve E_r(R) with delay using convolutional codes.
Pinsker (1967, PPI) claimed that the block exponents continue to govern the non-block case, with and without feedback.

Nonblock codes without feedback

Infinite binary tree, with iid random labels:
choose a path through the tree based on the data bits;
transmit the path labels through the channel.

ML decoding: disjoint paths are pairwise independent of the true path, so the E_r(R) analysis applies: future events dominate.
Can implement with a time-varying random convolutional code.
Achieves P_e(d) ≤ K exp(-E_r(R) d) for every d, for all R < C.
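
The time-varying random convolutional code can be sketched in a few lines of Python (an illustration, not the talk's exact construction): each time step draws fresh random GF(2) parities of the entire past, i.e., an infinite-constraint-length, time-varying convolutional encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def tree_encode(bits, n_out=2):
    # At time t, emit n_out fresh random GF(2) parities of bits b_1..b_t.
    # Fresh labels at each step mimic the iid-labeled infinite binary tree.
    bits = np.asarray(bits)
    chunks = []
    for t in range(1, len(bits) + 1):
        G_t = rng.integers(0, 2, size=(n_out, t))  # random labels at time t
        chunks.append(G_t @ bits[:t] % 2)
    return np.concatenate(chunks)

print(tree_encode([1, 0, 1, 1]))  # two channel inputs per data bit: R = 1/2
```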

Pinsker's bounding construction explained

Without feedback, E_sp(R) continues to be a bound:
consider a code with target delay d;
use it to construct a block code with blocksize n >> d;
give the decoder a genie that supplies the truth of all bits before bit i.
Error events for the genie-aided system then depend only on the last d channel uses.
Apply a change-of-measure argument.

[Figure: message bits B_i drive a causal encoder and the DMC; a bank of fixed-delay decoders, each aided by feedforward (XOR) of the earlier bits, recovers B_{(t-d)R}, ... from the channel outputs.]

My favorite example: the BEC

[Figure: binary erasure channel: each input arrives intact with probability 1-δ and is erased (output e) with probability δ.]

Simple capacity: 1-δ bits per channel use.
With perfect feedback, simple to achieve: retransmit until it gets through.

Classical block-coding bounds (error exponent, base 2, versus rate in bits):
sphere-packing bound D(1-R ‖ δ);
random-coding bound max_{ρ ∈ [0,1]} [E_0(ρ) - ρR].

What happens with feedback?

BEC with feedback and fixed blocks

At rate R < 1, we have Rn bits to transmit in n channel uses.
Typically (1-δ)n code bits will be received.
Block errors are caused by atypical channel behavior: we are doomed if fewer than Rn bits arrive intact.
Feedback cannot save us; the exponent is still D(1-R ‖ δ).
Dobrushin showed that this type of behavior is common.
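
A quick sanity check of the "doomed if fewer than Rn bits arrive intact" claim (my addition, not the slides'): the binomial tail probability below decays at essentially D(1-R ‖ δ) bits per channel use.

```python
from math import lgamma, log, exp
import numpy as np

delta, R = 0.4, 0.5

def log_pmf(n, j):  # log P(exactly j of n symbols arrive unerased)
    return (lgamma(n + 1) - lgamma(j + 1) - lgamma(n - j + 1)
            + j * log(1 - delta) + (n - j) * log(delta))

def log2_tail(n):   # log2 P(fewer than R*n symbols arrive unerased)
    terms = [log_pmf(n, j) for j in range(int(np.ceil(R * n)))]
    m = max(terms)
    return (m + log(sum(exp(t - m) for t in terms))) / log(2)

# The per-channel-use exponent approaches D(1-R || delta) ~ 0.0294 bits:
print((log2_tail(2000) - log2_tail(4000)) / 2000)
```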

BEC with feedback and fixed delay

R = 1/2 example: track the backlog L of bits still awaiting successful transmission.

[Figure: birth-death chain on L; over each pair of channel uses the queue goes up with probability δ^2 and down with probability (1-δ)^2.]

The chain is positive recurrent if δ < 1/2.
The delay exponent is easy to see: P(D > d) = P(L > d/2) ≈ K (δ/(1-δ))^d.
That is an exponent of 0.584, versus 0.0294 for block coding with δ = 0.4.
Pinsker was wrong!
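
Plugging in the slide's numbers (a check I added):

```python
import numpy as np

delta, R = 0.4, 0.5

def D(p, q):  # binary KL divergence, in bits
    return p * np.log2(p / q) + (1 - p) * np.log2((1 - p) / (1 - q))

block_exponent = D(1 - R, delta)               # sphere-packing: ~0.0294
delay_exponent = np.log2((1 - delta) / delta)  # tail of (delta/(1-delta))^d: ~0.585
print(block_exponent, delay_exponent)
```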

Where is this boost coming from?

[Figure: decoder progress (in bits) versus time (in BEC uses).]

Next: the focusing bound.

Using E_sp to bound α in general

[Figure: a block of n channel uses split into past behavior of length λn and a future of length (1-λ)n = d; the λRn bits due within the deadline must be protected, while the remaining bits can be ignored.]

The block error probability is like e^{-α(1-λ)n}, which cannot exceed the sphere-packing bound e^{-E_sp(λR)n}.
Hence α(R) ≤ E_sp(λR)/(1-λ) for every λ.
The error events involve both the past and the future.

Uncertainty-focusing bound for symmetric DMCs

Minimize over λ for symmetric DMCs to sweep out the frontier by varying ρ > 0:
R(ρ) = E_0(ρ)/ρ,  E_a^+(ρ) = E_0(ρ).

Same form as Viterbi's convolutional-coding bound for constraint lengths, but a lot more fundamental!
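
A hedged numerical sketch (not from the talk) for the BEC: it traces the parametric frontier and cross-checks it against the min-over-λ form of the bound; the two should coincide up to grid resolution.

```python
import numpy as np

delta = 0.4
rho_grid = np.linspace(1e-4, 60, 20000)

def E0(rho):  # Gallager E_0 for the BEC, in bits
    return -np.log2((1 - delta) * 2.0 ** (-rho) + delta)

def Esp(R):   # sphere-packing exponent for the BEC
    return np.max(E0(rho_grid) - rho_grid * R)

for rho in (0.5, 1.17, 2.0):
    R = E0(rho) / rho                    # parametric rate on the frontier
    lams = np.linspace(0.01, 0.99, 99)
    alpha = min(Esp(l * R) / (1 - l) for l in lams)  # lambda form of the bound
    print(f"rho={rho}: R={R:.3f}, E0={E0(rho):.3f}, lambda-bound={alpha:.3f}")
```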

Upper bound tight for the BEC with feedback

[Figure: error exponent (base 2) versus rate (in bits); the fixed-delay exponent achieved for the BEC with feedback meets the focusing bound.]

Next: approaching the focusing bound.

A spoonful of sugar helps the bits get across.

[Figure: the original forward DMC channel uses are fortified with occasional noiseless forward side-channel uses S_1, S_2, S_3, ...; the plot shows error exponent (base e) versus rate (in nats).]

Harnessing the power of flow control

[Figure: forward DMC channel uses carry the data chunks; the noiseless forward side channel carries deny/confirm signals plus disambiguation of the previous block.]

1. Group bits into miniblocks of size nR (n << d).
2. Transmit each using an ∞-length random codebook.
3. Use the sugar to tell the decoder when it is done.

No decoding errors, just queuing plus transmission delays.
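
Under an assumed service model (constant plus geometric transmission time per miniblock, consistent with the operational interpretation of E_0(ρ) discussed shortly), a tiny FIFO simulation shows the delay tail that queuing produces; n, t0, and q are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10          # channel uses between miniblock arrivals (assumed)
t0, q = 6, 0.3  # service time = t0 + Geometric(1-q) uses (assumed model)

finish, delays = 0.0, []
for k in range(100_000):
    arrival = k * n
    service = t0 + rng.geometric(1 - q)      # constant plus geometric tail
    finish = max(finish, arrival) + service  # FIFO queue
    delays.append(finish - arrival)

d = np.array(delays)
t1, t2 = np.quantile(d, [0.99, 0.999])
# Empirical delay-tail exponent (base 2), dominated by queuing:
print(np.log2(np.mean(d > t1) / np.mean(d > t2)) / (t2 - t1))
```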

Approaching the focusing bound

[Figure: error exponent (base e) versus rate (in nats) for several parameter triples, approaching the focusing bound.]

The dominant error events: past vs. future

[Figure: the ratio (in dB) of future to past contributions in the dominant error event, versus rate (in nats); the dominant behavior shifts between future and past as the rate varies.]

Why this works: operational interpretation of E_0(ρ)

Variable block transmission time T can be bounded by a constant plus a geometric random variable.

[Figure: error exponent (base e) versus rate (in nats).]

Need to do list-decoding at low rates.

Reduces to the low-rate erasure case

Pick R < R' < C and aim for the E_a^+(R') = E_0(ρ') exponent.

[Figure: error exponent (base e) versus rate (in nats).]

If n is large, the effective point-message rate, roughly 1/(n(1/R - 1/R')), is small.

But the low-rate erasure exponent is -log δ

[Figure: error exponent (base 2) versus rate (in bits).]

Here -log δ = E_0(ρ') in our context.

Channels with positive zero-error capacity

[Figure: chunks time-share between data at rate (1-θ)C and flow control at rate θC: a random increment then a deny for each chunk, with confirm + disambiguation carried by (l+1)-bit zero-error feedback block codes.]

Throw away a fraction θ of the channel uses for flow-control overhead.
Asymptotically achieves the focusing bound as θ -> 0.

Can do well even without sugar

Time-share between flow control and data, and optimize the fraction θ devoted to flow control.

Channel coding: final comments

Computation per channel use does not depend on the probability of error => infinite computational exponent.
The code is anytime in that it is delay-universal: the application can pick whatever latency is desired.
Queuing delay dominates at all rates; transmission-delay exponents are bounded away from zero at all rates up to capacity. (This partially explains Horstein's weird positive error exponents at capacity.)

Next: the source-coding analog.

The source coding problem

[Figure: source {X_t} -> source encoder -> encoded bitstream at fixed rate R -> source decoder -> {X̂_t}.]

Assume {X_t} iid.
Application-level interface: symbol error probability P_e = P(X_t ≠ X̂_t); end-to-end latency d (measured on the source timescale).
Channel-code interface: fixed rate R (assumed noiseless).

What are the fundamental tradeoffs?

Using E_b to bound E_s in general

[Figure: a window of n source symbols that must be decoded within the deadline, followed by d trailing symbols that can be ignored; by the deadline the decoder has (n+d)R bits, an effective rate of βR with β = (n+d)/n.]

The error probability is bounded by K exp(-d E_s(R)), which cannot exceed the block-coding bound exp(-n E_b(βR)).
Hence E_s(R) ≤ E_b(βR)/(β-1) for every β > 1.
Only the past matters!
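
For a concrete feel (my example, not the slides'): an unfair coin with P(H) = 0.1 at rate 0.6 bits per symbol, comparing the β-converse above with the parametric point on the frontier of the next slide.

```python
import numpy as np

p, R = 0.1, 0.6   # unfair-coin source (assumed); rate in bits/symbol, H(p) ~ 0.469
rhos = np.linspace(1e-3, 50, 20000)

def E0(rho):      # source-coding E_0(rho), in bits
    return (1 + rho) * np.log2(p ** (1 / (1 + rho)) + (1 - p) ** (1 / (1 + rho)))

E0v = E0(rhos)

def Eb(r):        # block exponent: sup_rho [rho*r - E0(rho)]
    return np.max(rhos * r - E0v)

betas = np.linspace(1.02, 0.999 / R, 400)   # keep beta*R below log2|X| = 1
converse = min(Eb(b * R) / (b - 1) for b in betas)

i = np.argmin(np.abs(E0v / rhos - R))       # rho* with E0(rho*)/rho* = R
print(converse, E0v[i])                     # the two should (nearly) agree
```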

Achieving the focusing bound

R(ρ) = E_0(ρ)/ρ,  E_s(ρ) = E_0(ρ).

[Figure: fixed-rate source -> variable-length code -> FIFO queue -> fixed-rate bitstream.]

Pick the miniblock size n large enough, but small relative to d.
Variable-length codes turn into variable delay at the receiver.
Queuing delay dominates.
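
A minimal simulation sketch of this pipeline, under the assumption of an idealized type-based variable-length code (block cost ≈ n·H(empirical type) plus a log2(n+1) header); the parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
p, R, n = 0.1, 0.6, 100   # source bias, rate (bits/symbol), miniblock size (assumed)

def h2(q):                # binary entropy in bits
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

def codelen(k):           # idealized type-based variable-length code for n symbols
    q = min(max(k / n, 1e-9), 1 - 1e-9)
    return n * h2(q) + np.log2(n + 1)

backlog, big = 0.0, 0
for k in rng.binomial(n, p, size=200_000):            # heads count per miniblock
    backlog = max(backlog + codelen(k) - n * R, 0.0)  # queue drained at nR bits
    big += backlog > 5 * n * R                        # large-backlog excursions
print(big / 200_000)
```

Atypical bursts of heads pile up in the FIFO, so the delay tail comes from the queue, matching "queuing delay dominates".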

A simple example

Unfair coin tosses.

[Figure: reliability (delay exponent) versus rate for the unfair-coin source.]

Side-information at the decoder

[Figure: X_1, X_2, ... enter the encoder E; the decoder D also sees side-information Y_1, Y_2, ..., with (X_i, Y_i) ~ p_XY iid; a rate-R bitstream connects E to D, which outputs X̂_1, X̂_2, ....]

The encoder may or may not be ignorant of the side-information.
If p_XY is symmetric with uniform marginals, one can do no better than E_b(R) with delay.

[Figures: reliability versus rate, with a companion plot of the ratio between the exponents.]

Long vs. large deviations

[Figure: error-event deviation height versus duration, with the typical slope and the entropy slope marked for past and future.]

Shorter deviation periods must be larger; smaller deviations must last over longer periods.

Conclusions

"[The duality between source and channel coding] can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past and cannot control it; we may control the future but have no knowledge of it." (Claude Shannon, 1959)

Error events are dominated by the:
Future: channel coding without feedback [Pinsker].
Past: point-to-point lossless source coding [ITW06].
Future: symmetric source coding with receiver side-information [ISIT06].
Combination: channel coding with feedback.
Combination: non-symmetric source coding with receiver side-information.

Architectural guidance:
Make your messages as short as possible while avoiding integer effects.
Use variable-length coding where you can.
Use feedback for hybrid ARQ, not retransmissions.
Use queues to smooth out your data rates.

[Backup figure: a block of n total chunks; the true minimum number of chunks nR/C is set by Shannon capacity, the minimum t(ρ) ≈ nR/C(ρ) by the target reliability E_0(ρ), leaving about n(1 - ρR/E_0(ρ)) potential slack chunks, each worth at least E_0(ρ).]

[Backup figure: timeline of nR-bit message-block arrival times ..., i-2, i-1, i, ...; the target delay of d chunks after arrival splits into an essential delay t(ρ) plus an extra delay d - t(ρ); possible decoding times T_{i-2}, T_{i-1}, T_i can lead to an error, with a renewal assumed whenever a new block enters an empty queue. A companion timeline shows the point-messages-versus-slack view.]

Optimize the fraction θ for flow control

[Figure: chunks time-share between data (random increments) and flow control (deny, then confirm + disambiguation).]

Flow control is encoded with an ∞-length convolutional code: a low-rate feedback anytime code.
Flow-control exponent: θE_0(1). Data exponent: (1-θ)E_0(ρ). Data rate: (1-θ)E_0(ρ)/ρ.
Balancing the exponents gives θ = E_0(ρ)/(E_0(1) + E_0(ρ)), so
Ẽ_0(ρ) = E_0(ρ)E_0(1)/(E_0(ρ) + E_0(1)),  with frontier R(ρ) = Ẽ_0(ρ)/ρ and exponent Ẽ_a(ρ) = Ẽ_0(ρ).
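
Evaluating this balance for the BEC with δ = 0.4 (my illustration):

```python
import numpy as np

delta = 0.4

def E0(rho):  # Gallager E_0 for the BEC, in bits
    return -np.log2((1 - delta) * 2.0 ** (-rho) + delta)

for rho in (0.5, 1.0, 2.0):
    theta = E0(rho) / (E0(1.0) + E0(rho))            # flow-control fraction
    Etil = E0(rho) * E0(1.0) / (E0(rho) + E0(1.0))   # balanced exponent
    print(f"rho={rho}: theta={theta:.3f}, rate={Etil / rho:.3f}, exp={Etil:.3f}")
```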

Low-rate feedback convolutional codes

At R < E_0(1), sequential decoding expands only a finite number of nodes on average, but each expansion costs the (growing) constraint length.
Idea: run a copy of the decoder at the encoder.
Convolve against: (B_1 + B̂_1(n)), (B_2 + B̂_2(n)), ..., (B_{n-1} + B̂_{n-1}(n)), B_n.
Identical distance properties, but only a finite expected number of nonzero terms: infinite-constraint-length performance at a finite price!
Achieves the E_r(R) exponent with delay.
Another trick due to Pinsker can extend the computational advantage to higher rates, at the cost of lower delay exponents.
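
A toy illustration of why the encoder's work stays bounded (this is a stand-in, not the actual sequential-decoding construction: the fed-back estimates are simply assumed correct). Because B̂_i(n) usually equals B_i, almost every term of the convolution vanishes.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 12
bits = rng.integers(0, 2, N)

# Toy stand-in for feedback: assume the decoder's estimate of bit i at
# time t is already correct for all i < t (sequential decoding is
# "usually right" at rates below E_0(1)).
est = [[int(bits[i]) for i in range(t)] for t in range(N)]

out, work = [], 0
for t in range(N):
    diff = [bits[i] ^ est[t][i] for i in range(t)] + [int(bits[t])]  # mostly zeros
    g = rng.integers(0, 2, t + 1)        # fresh random generator column
    nz = [i for i, v in enumerate(diff) if v]
    work += len(nz)                      # encoder work ~ number of nonzero terms
    out.append(int(sum(int(g[i]) for i in nz) % 2))
print(out, work)                         # work grows like N, not like N^2
```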

Feedback and Side-Information in Information Theory

Feedback and Side-Information in Information Theory Feedback and Side-Information in Information Theory Anant Sahai and Sekhar Tatikonda UC Berkeley and Yale sahai@eecs.berkeley.edu and sekhar.tatikonda@yale.edu ISIT 27 Tutorial T2 Nice, France June 24,

More information

The connection between information theory and networked control

The connection between information theory and networked control The connection between information theory and networked control Anant Sahai based in part on joint work with students: Tunc Simsek, Hari Palaiyanur, and Pulkit Grover Wireless Foundations Department of

More information

Coding into a source: an inverse rate-distortion theorem

Coding into a source: an inverse rate-distortion theorem Coding into a source: an inverse rate-distortion theorem Anant Sahai joint work with: Mukul Agarwal Sanjoy K. Mitter Wireless Foundations Department of Electrical Engineering and Computer Sciences University

More information

Universal Anytime Codes: An approach to uncertain channels in control

Universal Anytime Codes: An approach to uncertain channels in control Universal Anytime Codes: An approach to uncertain channels in control paper by Stark Draper and Anant Sahai presented by Sekhar Tatikonda Wireless Foundations Department of Electrical Engineering and Computer

More information

The error exponent with delay for lossless source coding

The error exponent with delay for lossless source coding The error eponent with delay for lossless source coding Cheng Chang and Anant Sahai Wireless Foundations, University of California at Berkeley cchang@eecs.berkeley.edu, sahai@eecs.berkeley.edu Abstract

More information

On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback

On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback ITW2004, San Antonio, Texas, October 24 29, 2004 On the variable-delay reliability function of discrete memoryless channels with access to noisy feedback Anant Sahai and Tunç Şimşek Electrical Engineering

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

Relaying Information Streams

Relaying Information Streams Relaying Information Streams Anant Sahai UC Berkeley EECS sahai@eecs.berkeley.edu Originally given: Oct 2, 2002 This was a talk only. I was never able to rigorously formalize the back-of-the-envelope reasoning

More information

The Hallucination Bound for the BSC

The Hallucination Bound for the BSC The Hallucination Bound for the BSC Anant Sahai and Stark Draper Wireless Foundations Department of Electrical Engineering and Computer Sciences University of California at Berkeley ECE Department University

More information

Lecture 4 Noisy Channel Coding

Lecture 4 Noisy Channel Coding Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem

More information

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design Attaining maimal reliability with minimal feedback via joint channel-code and hash-function design Stark C. Draper, Kannan Ramchandran, Biio Rimoldi, Anant Sahai, and David N. C. Tse Department of EECS,

More information

Noisy channel communication

Noisy channel communication Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University

More information

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

Anytime Capacity of the AWGN+Erasure Channel with Feedback. Qing Xu. B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000

Anytime Capacity of the AWGN+Erasure Channel with Feedback. Qing Xu. B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000 Anytime Capacity of the AWGN+Erasure Channel with Feedback by Qing Xu B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000 A dissertation submitted in partial satisfaction of

More information

NAME... Soc. Sec. #... Remote Location... (if on campus write campus) FINAL EXAM EE568 KUMAR. Sp ' 00

NAME... Soc. Sec. #... Remote Location... (if on campus write campus) FINAL EXAM EE568 KUMAR. Sp ' 00 NAME... Soc. Sec. #... Remote Location... (if on campus write campus) FINAL EXAM EE568 KUMAR Sp ' 00 May 3 OPEN BOOK exam (students are permitted to bring in textbooks, handwritten notes, lecture notes

More information

Lecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity

Lecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity 5-859: Information Theory and Applications in TCS CMU: Spring 23 Lecture 8: Channel and source-channel coding theorems; BEC & linear codes February 7, 23 Lecturer: Venkatesan Guruswami Scribe: Dan Stahlke

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

UNIT I INFORMATION THEORY. I k log 2

UNIT I INFORMATION THEORY. I k log 2 UNIT I INFORMATION THEORY Claude Shannon 1916-2001 Creator of Information Theory, lays the foundation for implementing logic in digital circuits as part of his Masters Thesis! (1939) and published a paper

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Chapter 9 Fundamental Limits in Information Theory

Chapter 9 Fundamental Limits in Information Theory Chapter 9 Fundamental Limits in Information Theory Information Theory is the fundamental theory behind information manipulation, including data compression and data transmission. 9.1 Introduction o For

More information

LECTURE 10. Last time: Lecture outline

LECTURE 10. Last time: Lecture outline LECTURE 10 Joint AEP Coding Theorem Last time: Error Exponents Lecture outline Strong Coding Theorem Reading: Gallager, Chapter 5. Review Joint AEP A ( ɛ n) (X) A ( ɛ n) (Y ) vs. A ( ɛ n) (X, Y ) 2 nh(x)

More information

Stabilization over discrete memoryless and wideband channels using nearly memoryless observations

Stabilization over discrete memoryless and wideband channels using nearly memoryless observations Stabilization over discrete memoryless and wideband channels using nearly memoryless observations Anant Sahai Abstract We study stabilization of a discrete-time scalar unstable plant over a noisy communication

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic

More information

Lecture 2: August 31

Lecture 2: August 31 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy

More information

Lecture 11: Polar codes construction

Lecture 11: Polar codes construction 15-859: Information Theory and Applications in TCS CMU: Spring 2013 Lecturer: Venkatesan Guruswami Lecture 11: Polar codes construction February 26, 2013 Scribe: Dan Stahlke 1 Polar codes: recap of last

More information

The necessity and sufficiency of anytime capacity for control over a noisy communication link: Parts I and II

The necessity and sufficiency of anytime capacity for control over a noisy communication link: Parts I and II The necessity and sufficiency of anytime capacity for control over a noisy communication link: Parts I and II Anant Sahai, Sanjoy Mitter sahai@eecs.berkeley.edu, mitter@mit.edu Abstract We review how Shannon

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

for some error exponent E( R) as a function R,

for some error exponent E( R) as a function R, . Capacity-achieving codes via Forney concatenation Shannon s Noisy Channel Theorem assures us the existence of capacity-achieving codes. However, exhaustive search for the code has double-exponential

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

Lecture 11: Quantum Information III - Source Coding

Lecture 11: Quantum Information III - Source Coding CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that

More information

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Lecture 7 September 24

Lecture 7 September 24 EECS 11: Coding for Digital Communication and Beyond Fall 013 Lecture 7 September 4 Lecturer: Anant Sahai Scribe: Ankush Gupta 7.1 Overview This lecture introduces affine and linear codes. Orthogonal signalling

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

On ARQ for Packet Erasure Channels with Bernoulli Arrivals

On ARQ for Packet Erasure Channels with Bernoulli Arrivals On ARQ for Packet Erasure Channels with Bernoulli Arrivals Dinkar Vasudevan, Vijay G. Subramanian and Douglas J. Leith Hamilton Institute, National University of Ireland, Maynooth Abstract We study packet

More information

Second-Order Asymptotics in Information Theory

Second-Order Asymptotics in Information Theory Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015

More information

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

A Simple Encoding And Decoding Strategy For Stabilization Over Discrete Memoryless Channels

A Simple Encoding And Decoding Strategy For Stabilization Over Discrete Memoryless Channels A Simple Encoding And Decoding Strategy For Stabilization Over Discrete Memoryless Channels Anant Sahai and Hari Palaiyanur Dept. of Electrical Engineering and Computer Sciences University of California,

More information

On the Error Exponents of ARQ Channels with Deadlines

On the Error Exponents of ARQ Channels with Deadlines On the Error Exponents of ARQ Channels with Deadlines Praveen Kumar Gopala, Young-Han Nam and Hesham El Gamal arxiv:cs/06006v [cs.it] 8 Oct 2006 March 22, 208 Abstract We consider communication over Automatic

More information

Introduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information.

Introduction to Information Theory. Uncertainty. Entropy. Surprisal. Joint entropy. Conditional entropy. Mutual information. L65 Dept. of Linguistics, Indiana University Fall 205 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission rate

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 28 Information theory answers two fundamental questions in communication theory: What is the ultimate data compression? What is the transmission

More information

LECTURE 13. Last time: Lecture outline

LECTURE 13. Last time: Lecture outline LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to

More information

Lecture 12. Block Diagram

Lecture 12. Block Diagram Lecture 12 Goals Be able to encode using a linear block code Be able to decode a linear block code received over a binary symmetric channel or an additive white Gaussian channel XII-1 Block Diagram Data

More information

On Bit Error Rate Performance of Polar Codes in Finite Regime

On Bit Error Rate Performance of Polar Codes in Finite Regime On Bit Error Rate Performance of Polar Codes in Finite Regime A. Eslami and H. Pishro-Nik Abstract Polar codes have been recently proposed as the first low complexity class of codes that can provably achieve

More information

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus Turbo Compression Andrej Rikovsky, Advisor: Pavol Hanus Abstract Turbo codes which performs very close to channel capacity in channel coding can be also used to obtain very efficient source coding schemes.

More information

Chapter 1 Elements of Information Theory for Networked Control Systems

Chapter 1 Elements of Information Theory for Networked Control Systems Chapter 1 Elements of Information Theory for Networked Control Systems Massimo Franceschetti and Paolo Minero 1.1 Introduction Next generation cyber-physical systems [35] will integrate computing, communication,

More information

Introduction to Convolutional Codes, Part 1

Introduction to Convolutional Codes, Part 1 Introduction to Convolutional Codes, Part 1 Frans M.J. Willems, Eindhoven University of Technology September 29, 2009 Elias, Father of Coding Theory Textbook Encoder Encoder Properties Systematic Codes

More information

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1

An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if. 2 l i. i=1 Kraft s inequality An instantaneous code (prefix code, tree code) with the codeword lengths l 1,..., l N exists if and only if N 2 l i 1 Proof: Suppose that we have a tree code. Let l max = max{l 1,...,

More information

Source Coding with Lists and Rényi Entropy or The Honey-Do Problem

Source Coding with Lists and Rényi Entropy or The Honey-Do Problem Source Coding with Lists and Rényi Entropy or The Honey-Do Problem Amos Lapidoth ETH Zurich October 8, 2013 Joint work with Christoph Bunte. A Task from your Spouse Using a fixed number of bits, your spouse

More information

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122 Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122 M Definitions and Problem Setup 2 X n Y n Encoder p(y x) Decoder ˆM Message Channel Estimate Definition 11. Discrete Memoryless Channel

More information

Coding into a source: a direct inverse Rate-Distortion theorem

Coding into a source: a direct inverse Rate-Distortion theorem Coding into a source: a direct inverse Rate-Distortion theorem Mukul Agarwal, Anant Sahai, and Sanjoy Mitter Abstract Shannon proved that if we can transmit bits reliably at rates larger than the rate

More information

The Compound Capacity of Polar Codes

The Compound Capacity of Polar Codes The Compound Capacity of Polar Codes S. Hamed Hassani, Satish Babu Korada and Rüdiger Urbanke arxiv:97.329v [cs.it] 9 Jul 29 Abstract We consider the compound capacity of polar codes under successive cancellation

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology

More information

6.02 Fall 2012 Lecture #1

6.02 Fall 2012 Lecture #1 6.02 Fall 2012 Lecture #1 Digital vs. analog communication The birth of modern digital communication Information and entropy Codes, Huffman coding 6.02 Fall 2012 Lecture 1, Slide #1 6.02 Fall 2012 Lecture

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Arimoto Channel Coding Converse and Rényi Divergence

Arimoto Channel Coding Converse and Rényi Divergence Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Multimedia Systems WS 2010/2011

Multimedia Systems WS 2010/2011 Multimedia Systems WS 2010/2011 15.11.2010 M. Rahamatullah Khondoker (Room # 36/410 ) University of Kaiserslautern Department of Computer Science Integrated Communication Systems ICSY http://www.icsy.de

More information

Introduction to Wireless & Mobile Systems. Chapter 4. Channel Coding and Error Control Cengage Learning Engineering. All Rights Reserved.

Introduction to Wireless & Mobile Systems. Chapter 4. Channel Coding and Error Control Cengage Learning Engineering. All Rights Reserved. Introduction to Wireless & Mobile Systems Chapter 4 Channel Coding and Error Control 1 Outline Introduction Block Codes Cyclic Codes CRC (Cyclic Redundancy Check) Convolutional Codes Interleaving Information

More information

Computing and Communications 2. Information Theory -Entropy

Computing and Communications 2. Information Theory -Entropy 1896 1920 1987 2006 Computing and Communications 2. Information Theory -Entropy Ying Cui Department of Electronic Engineering Shanghai Jiao Tong University, China 2017, Autumn 1 Outline Entropy Joint entropy

More information

Energy State Amplification in an Energy Harvesting Communication System

Energy State Amplification in an Energy Harvesting Communication System Energy State Amplification in an Energy Harvesting Communication System Omur Ozel Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland College Park, MD 20742 omur@umd.edu

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Source coding and channel requirements for unstable processes. Anant Sahai, Sanjoy Mitter

Source coding and channel requirements for unstable processes. Anant Sahai, Sanjoy Mitter Source coding and channel requirements for unstable processes Anant Sahai, Sanjoy Mitter sahai@eecs.berkeley.edu, mitter@mit.edu Abstract Our understanding of information in systems has been based on the

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

One Lesson of Information Theory

One Lesson of Information Theory Institut für One Lesson of Information Theory Prof. Dr.-Ing. Volker Kühn Institute of Communications Engineering University of Rostock, Germany Email: volker.kuehn@uni-rostock.de http://www.int.uni-rostock.de/

More information

Multimedia Communications. Mathematical Preliminaries for Lossless Compression

Multimedia Communications. Mathematical Preliminaries for Lossless Compression Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Lecture 14 February 28

Lecture 14 February 28 EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables

More information

Estimating a linear process using phone calls

Estimating a linear process using phone calls Estimating a linear process using phone calls Mohammad Javad Khojasteh, Massimo Franceschetti, Gireeja Ranade Abstract We consider the problem of estimating an undisturbed, scalar, linear process over

More information

Massachusetts Institute of Technology

Massachusetts Institute of Technology Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Department of Mechanical Engineering 6.050J/2.0J Information and Entropy Spring 2005 Issued: March 7, 2005

More information

Error Exponent Regions for Gaussian Broadcast and Multiple Access Channels

Error Exponent Regions for Gaussian Broadcast and Multiple Access Channels Error Exponent Regions for Gaussian Broadcast and Multiple Access Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Submitted: December, 5 Abstract In modern communication systems,

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information

Appendix B Information theory from first principles

Appendix B Information theory from first principles Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE General e Image Coder Structure Motion Video x(s 1,s 2,t) or x(s 1,s 2 ) Natural Image Sampling A form of data compression; usually lossless, but can be lossy Redundancy Removal Lossless compression: predictive

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information

Intermittent Communication

Intermittent Communication Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication

More information

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014 Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities

More information

Lecture 3. Mathematical methods in communication I. REMINDER. A. Convex Set. A set R is a convex set iff, x 1,x 2 R, θ, 0 θ 1, θx 1 + θx 2 R, (1)

Lecture 3. Mathematical methods in communication I. REMINDER. A. Convex Set. A set R is a convex set iff, x 1,x 2 R, θ, 0 θ 1, θx 1 + θx 2 R, (1) 3- Mathematical methods in communication Lecture 3 Lecturer: Haim Permuter Scribe: Yuval Carmel, Dima Khaykin, Ziv Goldfeld I. REMINDER A. Convex Set A set R is a convex set iff, x,x 2 R, θ, θ, θx + θx

More information

Principles of Communications

Principles of Communications Principles of Communications Weiyao Lin Shanghai Jiao Tong University Chapter 10: Information Theory Textbook: Chapter 12 Communication Systems Engineering: Ch 6.1, Ch 9.1~ 9. 92 2009/2010 Meixia Tao @

More information

Error Correcting Codes: Combinatorics, Algorithms and Applications Spring Homework Due Monday March 23, 2009 in class

Error Correcting Codes: Combinatorics, Algorithms and Applications Spring Homework Due Monday March 23, 2009 in class Error Correcting Codes: Combinatorics, Algorithms and Applications Spring 2009 Homework Due Monday March 23, 2009 in class You can collaborate in groups of up to 3. However, the write-ups must be done

More information

On Third-Order Asymptotics for DMCs

On Third-Order Asymptotics for DMCs On Third-Order Asymptotics for DMCs Vincent Y. F. Tan Institute for Infocomm Research (I R) National University of Singapore (NUS) January 0, 013 Vincent Tan (I R and NUS) Third-Order Asymptotics for DMCs

More information

Polar Coding. Part 1 - Background. Erdal Arıkan. Electrical-Electronics Engineering Department, Bilkent University, Ankara, Turkey

Polar Coding. Part 1 - Background. Erdal Arıkan. Electrical-Electronics Engineering Department, Bilkent University, Ankara, Turkey Polar Coding Part 1 - Background Erdal Arıkan Electrical-Electronics Engineering Department, Bilkent University, Ankara, Turkey Algorithmic Coding Theory Workshop June 13-17, 2016 ICERM, Providence, RI

More information

Shannon and Poisson. sergio verdú

Shannon and Poisson. sergio verdú Shannon and Poisson sergio verdú P λ (k) = e λλk k! deaths from horse kicks in the Prussian cavalry. photons arriving at photodetector packets arriving at a router DNA mutations Poisson entropy 3.5 3.0

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information