Feedback and Side-Information in Information Theory


Slide 1: Feedback and Side-Information in Information Theory
Anant Sahai and Sekhar Tatikonda (UC Berkeley and Yale)
ISIT 2007 Tutorial T2, Nice, France, June 24, 2007

Slide 2: Our Collaborators
Vivek Borkar (TIFR), Cheng Chang (Berkeley), Giacomo Como (Torino), Stark Draper (Wisconsin), Nicola Elia (Iowa), Alek Kavcic (Hawaii), Jialing Liu (Iowa), Sanjoy Mitter (MIT), Hari Palaiyanur (Berkeley), Tunc Simsek (Mathworks), Shaohua Yang (Marvell), Serdar Yuksel (Queen's)

Slide 3: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Idealized cases with magical feedback
   - Unreliable/noisy/limited feedback
3. Feedback and memory
   - The power of Markov models
   - State vs. output feedback
4. Conclusions and open problems

Slides 4-11: Motivation
We want to engineer reliable and robust communication systems that appropriately share limited communication resources to deliver high performance at reasonable cost. What is the role of information theory?
- Determine fundamental limits to performance.
- Develop metrics that reflect the goals.
- Give computable targets to aim for (e.g. asymptotic capacity and scaling).
- Identify bottlenecks and key constraints.
- Guide overall system architecture.
- Develop nonasymptotic algorithms that are robustly implementable.

Slides 12-18: Feedback and side-information
Communication is idealized as a one-way flow of information, but the medium usually supports a path in reverse. How should it be used?
- Left alone for other users.
- To reduce communication needs (Yao '79; Orlitsky/El-Gamal '84, '90, '92).
- For learning/adaptation/universality.
- To improve QoS: delay, $P_e$, rate.
- To simplify implementation.

Slides 19-23: Reason for hope: the Schalkwijk/Kailath ('68) scheme
Channel: $Y_t = X_t + V_t$ with $\{V_t\}$ i.i.d. $N(0,1)$, power constraint $E[X_t^2] \le P$, and encoder $X_t = f_t(m, Y_1^{t-1})$.
- Encode message $m$ using $2^{nR}$-level PAM into the first channel input $X_1$.
- Use feedback to capture the first noise Gaussian $V_1$.
- Recursively MMSE-estimate that Gaussian. Each channel input is a linear scaling of the current MMSE error, giving the optimal factor reduction in error variance with each use.
- Subtract the noise estimate from the original $Y_1$ and quantize. An error occurs only if the MMSE error is larger than the quantization bin, with probability exponentially small in the ratio of bin size to MMSE-error standard deviation.
- Double-exponential decay of $P_e$, and a very simple scheme!
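
To make the recursion concrete, here is a minimal Monte-Carlo sketch of the scheme over a unit-variance AWGN channel with perfect output feedback. The power $P$, blocklength $n$, and rate $R$ are illustrative choices (any $R < \frac{1}{2}\log_2(1+P)$ works), not values from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)
P, n, R = 1.0, 30, 0.4            # power, channel uses, rate; R < 0.5*log2(1+P)
M = int(2 ** (R * n))             # number of PAM messages

def send(m):
    theta = (m + 0.5) / M - 0.5   # PAM point in [-1/2, 1/2)
    c = np.sqrt(12 * P)           # Var(theta) = 1/12, so X1 meets the power constraint
    Y1 = c * theta + rng.normal()
    V1 = Y1 - c * theta           # transmitter learns V1 from the fed-back Y1
    Vhat, var = 0.0, 1.0          # receiver's MMSE estimate of V1 and its error variance
    for _ in range(n - 1):
        Yt = np.sqrt(P / var) * (V1 - Vhat) + rng.normal()  # scaled estimation error
        Vhat += np.sqrt(P * var) / (P + 1) * Yt             # scalar MMSE update
        var /= 1 + P              # error variance shrinks by a factor (1+P) per use
    theta_hat = (Y1 - Vhat) / c   # subtract the noise estimate, then quantize
    return int(np.clip(np.round((theta_hat + 0.5) * M - 0.5), 0, M - 1)) == m

errors = sum(not send(rng.integers(M)) for _ in range(2000))
print(f"{M} messages over {n} uses: empirical P_e = {errors / 2000}")
```

With these numbers the MMSE standard deviation after $n-1$ uses is several orders of magnitude below the quantization bin, so no errors should appear in a short run: the double-exponential regime.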

Slides 24-29: Reasons for despair: memoryless channels
- Even the S/K scheme does not beat $\frac{1}{2}\log(1 + P/N)$.
- Shannon proved that capacity does not increase for general memoryless channels, even with perfect feedback.
- Intuition: consider the BEC case.
- Proof trick: consider the mutual information between a uniformly-drawn message and the channel outputs.
- What is wrong with $I(X_1^n; Y_1^n)$? Consider $X$ and $Y$ disconnected, so $Y$ is just random noise and $C = 0$. Use $X_i = Y_{i-1}$ as the encoding: then $I(X_1^n; Y_1^n) = (n-1)H(Y) > 0$.
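
A one-line check of why raw mutual information overcounts with feedback, written out from the slide's disconnected-channel example:

```latex
% X_i = Y_{i-1} on a disconnected channel with i.i.d. outputs Y_i:
% given X_1^n we already know Y_1^{n-1}, so only Y_n remains uncertain.
\begin{align*}
I(X_1^n; Y_1^n) &= H(Y_1^n) - H(Y_1^n \mid X_1^n)
                 = nH(Y) - H(Y_n \mid X_1^n) \\
                &= nH(Y) - H(Y_n) = (n-1)H(Y) > 0,
\end{align*}
% even though nothing crosses the channel: feedback makes the inputs
% depend on past outputs, which breaks the usual converse step.
```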

Slide 30: It gets worse
"Feedback communications was an area of intense activity in 1968... A number of authors had shown constructive, even simple, schemes using noiseless feedback to achieve Shannon-like behavior... The situation in 1973 is dramatically different... The subject itself seems to be a burned out case... In extending the simple noiseless feedback model to allow for more realistic situations, such as noisy feedback channels, bandlimited channels, and peak power constraints, theorists discovered a certain brittleness or sensitivity in their previous results..." -- Robert Lucky (1973)

Slide 31: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Idealized cases with magical feedback
     - Block-oriented results
     - Fixed-delay with hard deadlines
     - Fixed-delay with soft deadlines
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slide 32: Shannon's prophecy: feedback delay is the most basic price of reliability
"[The duality between source and channel coding] can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past and cannot control it; we may control the future but have no knowledge of it." -- Claude Shannon ('59)

Slides 33-38: Review of block coding
Long block codes are the traditional information-theory approach.
Source coding: $X_1^n \to B_1^{Rn}$. Channel coding: $B_1^{Rn} \to X_1^n \to Y_1^n \to \hat B_1^{Rn}$.
- No real sense of time, except a trivial interpretation: in source coding the randomness comes before the encoding; in channel coding it comes after.
- Block error exponents: $P_e \approx \exp(-nE(R))$.
- Source coding exponent: $E_b(R) = \min_{Q: H(Q) \ge R} D(Q\|P) = \sup_{\rho \ge 0} [\rho R - E_0(\rho)]$, where $E_0(\rho) = (1+\rho)\ln\big[\sum_x P(x)^{1/(1+\rho)}\big]$.
- Generic channels (Haroutunian '77): $E^+(R) = \inf_{G: C(G) < R} \max_q D(G\|P\,|\,q)$.
- Channel sphere-packing bound (Haroutunian '68, Blahut '74): $E_{sp}(R) = \max_q \min_{G: I(q,G) \le R} D(G\|P\,|\,q) = \sup_{\rho \ge 0} [E_0(\rho) - \rho R]$, with $E_0(\rho) = \max_q -\ln \sum_y \big[\sum_x q_x\, p(y|x)^{1/(1+\rho)}\big]^{1+\rho}$.
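
As a numerical companion, here is a small sketch that evaluates the Gallager function and the resulting random-coding and sphere-packing exponents for a DMC; the BSC(0.1) with uniform inputs is an illustrative choice, not a channel from the tutorial.

```python
import numpy as np

def E0(rho, q, W):
    # Gallager function: -ln sum_y [ sum_x q_x W[x,y]^{1/(1+rho)} ]^{1+rho}  (nats)
    inner = (q[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log((inner ** (1.0 + rho)).sum())

p = 0.1                                    # illustrative BSC crossover
W = np.array([[1 - p, p], [p, 1 - p]])     # W[x, y] = p(y|x)
q = np.array([0.5, 0.5])                   # uniform inputs (optimal by symmetry)
C = np.log(2) + p * np.log(p) + (1 - p) * np.log(1 - p)   # capacity in nats

rhos = np.linspace(1e-3, 10, 2000)
e0 = np.array([E0(r, q, W) for r in rhos])
for frac in (0.3, 0.6, 0.9):
    R = frac * C
    Er = np.max((e0 - rhos * R)[rhos <= 1])   # random-coding exponent (rho in [0,1])
    Esp = np.max(e0 - rhos * R)               # sphere-packing exponent (rho >= 0)
    print(f"R = {frac:.1f} C: E_r = {Er:.4f}, E_sp = {Esp:.4f} nats")
```

The two exponents agree above the critical rate and separate below it, exactly the gap the next slides exploit.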

Slides 39-41: My favorite example: the BEC
[Figure: binary erasure channel with erasure probability $\delta$; plot of error exponent (base 2) vs. rate (in bits) for the classical bounds.]
- Simple capacity: $1-\delta$ bits per channel use.
- With perfect feedback, simple to achieve: retransmit until it gets through. Time till success is Geometric$(1-\delta)$; expected time to get through is $1/(1-\delta)$.
- Classical bounds: the sphere-packing bound $D(1-R\,\|\,\delta)$ and the random-coding bound $\max_{\rho \in [0,1]} E_0(\rho) - \rho R$.
- What happens with feedback?

Slides 42-45: BEC with feedback and fixed blocks
- At rate $R < 1$ we have $Rn$ bits to transmit in $n$ channel uses; typically $(1-\delta)n$ code bits will be received.
- Block errors are caused by atypical channel behavior: we are doomed if fewer than $Rn$ bits arrive intact.
- Feedback cannot save us: the exponent is still $D(1-R\,\|\,\delta)$.
- Dobrushin '62 showed that this type of behavior is common: $E^+(R) = E_{sp}(R)$ for symmetric channels.
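
The "doomed" event is just a binomial tail, so the exponent can be checked exactly; $\delta = 0.4$ and $R = 1/2$ below are illustrative values matching the running example.

```python
from math import comb, log

def D(a, b):   # binary KL divergence D(a || b), in nats
    return a * log(a / b) + (1 - a) * log((1 - a) / (1 - b))

def P_doom(n, delta, R):   # P[# unerased uses < R n], exact binomial tail
    s = 1 - delta
    return sum(comb(n, k) * s**k * (1 - s)**(n - k) for k in range(int(R * n)))

delta, R = 0.4, 0.5
for n in (100, 200, 400, 800):
    p = P_doom(n, delta, R)
    print(f"n={n:3d}: P[doom] = {p:.2e}, -ln(p)/n = {-log(p)/n:.4f} "
          f"(D(1-R||delta) = {D(1 - R, delta):.4f})")
```

The empirical exponent $-\ln(p)/n$ approaches $D(1-R\,\|\,\delta)$ from above as $n$ grows; the gap is the usual polynomial prefactor.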

Slides 46-50: Review: fixed blocks and soft deadlines
[Figure: codewords with hard decision regions covering the space; with soft deadlines, the hard decisions are additionally checked against hash signatures.]
- With fixed blocks and hard decision regions covering the space, feedback is pointless.
- Soft deadlines: keep the hard decision regions but check hash signatures. One bit of feedback can request retransmissions; the result can be interpreted in terms of expected block-length.
- Run close to capacity and use $n(C-R)$ bits for signatures.
- This gives a linear slope for the error exponent.

Slides 51-53: Review: fixed blocks, soft deadlines: Forney '68
[Figure: decision regions that catch only the typical sets; the decoder refuses to decide when ambiguous.]
- One bit of feedback can request retransmissions; the result can be interpreted in terms of expected block-length.
- Refusing to decide when ambiguous gives better error exponents at lower rates.

Slides 54-57: Review: fixed blocks, soft deadlines: Burnashev '76
Is more feedback helpful? Burnashev said yes:
- Considered the expected stopping time and used martingale arguments.
- Showed that $C_1(1 - R/C)$ is an upper bound, where $C_1 = \max_{i,j} D(p_i \| p_j)$.
[Figure: the Burnashev bound, the Forney exponent, the sphere-packing bound, and the random signature exponent vs. average communication rate (bits/channel use).]
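
A quick evaluation of the bound for a BSC; the crossover $p = 0.1$ is an illustrative choice. For the BSC, $C_1 = D(p \| 1-p)$.

```python
from math import log

def kl(a, b):   # D(Bern(a) || Bern(b)) in nats
    return a * log(a / b) + (1 - a) * log((1 - a) / (1 - b))

p = 0.1                                         # illustrative BSC crossover
C = log(2) + p * log(p) + (1 - p) * log(1 - p)  # capacity (nats/use)
C1 = kl(p, 1 - p)                               # max_{i,j} D(p_i || p_j) for the BSC
print(f"C = {C:.4f} nats, C1 = {C1:.4f} nats")
for frac in (0.2, 0.4, 0.6, 0.8):
    print(f"R = {frac:.1f} C: Burnashev exponent C1(1 - R/C) = {C1 * (1 - frac):.4f}")
```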

Slides 58-63: Review: the Yamamoto-Itoh '79 strategy attains the Burnashev bound
[Figure: a block of length $n$ split into a data-transmission phase of $\lambda n$ uses and an ACK/NAK phase of $(1-\lambda)n$ uses, with possible retransmissions.]
- Data transmission: $\lambda n$ channel uses for a block code at rate below $C$.
- Decision feedback: $\hat m$ sent back.
- Confirm/deny: $(1-\lambda)n$ channel uses to ACK or NAK.
- If confirmed, decode to $\hat m$; otherwise erase.
- $\Pr[\text{err}] = \Pr[\text{NAK} \to \text{ACK}] \approx 2^{-(1-\lambda)nC_1} \approx 2^{-nC_1(1 - R/C)}$.
- Moral: collective reward/punishment is good for reliability.
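
The confirm/deny phase is a biased binary hypothesis test. A small exact-binomial sketch for a BSC (the crossover $p$ and threshold fraction $a$ are illustrative assumptions): send all-ones for NAK and all-zeros for ACK, and declare ACK only when the received weight is at most $an$. As $a \downarrow p$ the fatal miss exponent tends to $D(p \| 1-p) = C_1$ while the harmless false-NAK event stays rare.

```python
from math import comb, log

def kl(a, b):
    return a * log(a / b) + (1 - a) * log((1 - a) / (1 - b))

def binom_cdf(n, q, k):   # P[Binomial(n, q) <= k]
    return sum(comb(n, i) * q**i * (1 - q)**(n - i) for i in range(k + 1))

p, a = 0.1, 0.2           # BSC crossover; ACK threshold fraction, p < a < 1 - p
for n in (20, 40, 80):
    k = int(a * n)
    p_err = binom_cdf(n, 1 - p, k)     # NAK (all-ones) read as ACK: the fatal event
    p_erase = 1 - binom_cdf(n, p, k)   # ACK (all-zeros) read as NAK: just a retransmission
    print(f"n={n}: P[NAK->ACK] = {p_err:.2e} (-ln/n = {-log(p_err)/n:.3f}), "
          f"P[ACK->NAK] = {p_erase:.2e}; D(a||1-p) = {kl(a, 1 - p):.3f}")
```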

Slide 64: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Block-oriented results
   - Idealized case: fixed-delay with hard deadlines
     - The BEC example
     - Without feedback
     - The focusing bound
     - The source-coding analogy
     - Approaching the focusing bound (with help)
     - No help and universality
   - Idealized case: fixed-delay with soft deadlines
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slide 65: What do we mean by fixed delay?
[Figure: a stream of bits $B_1, B_2, \ldots$ enters a causal encoder producing channel inputs $X_1, X_2, \ldots$; from the outputs $Y_1, Y_2, \ldots$ the decoder must emit each estimate $\hat B_i$ a fixed delay $d = 7$ after the bit arrives.]
- Hard deadlines: must commit to each bit.
- Soft deadlines: allowed to say "I don't know."

Slides 66-69: BEC with feedback and fixed delay
$R = 1/2$ example: over each pair of channel uses, the queue of untransmitted bits goes up with probability $\delta^2$ and down with probability $(1-\delta)^2$. This birth-death chain is positive recurrent if $\delta < 1/2$.
- The delay exponent is easy to see: $P(D \ge d) = P(L > d/2) \approx K\,(\delta/(1-\delta))^d$.
- 0.585 vs. 0.0294 for block coding with $\delta = 0.4$.
- Block coding is misleading!
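
A minimal simulation of this queuing picture, assuming the simple retransmit-until-success scheme at rate $1/2$; the erasure probability and run length are illustrative.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)
delta, T = 0.4, 4_000_000        # erasure probability; channel uses to simulate

# Rate-1/2 retransmission scheme: a new bit arrives every 2 channel uses,
# and the head-of-line bit departs on each unerased use.
fifo, delays = deque(), []
for t in range(T):
    if t % 2 == 0:
        fifo.append(t)                    # arrival time of a fresh bit
    if fifo and rng.random() > delta:     # channel use not erased
        delays.append(t - fifo.popleft())

D = np.array(delays)
ds = np.arange(5, 26)
tail = np.array([(D >= d).mean() for d in ds])
mask = tail > 0
slope = -np.polyfit(ds[mask], np.log2(tail[mask]), 1)[0]
print(f"empirical delay exponent ~ {slope:.3f} bits/use; "
      f"theory log2((1-delta)/delta) = {np.log2((1 - delta) / delta):.3f}")
```

The fitted slope should land near 0.585 bits per channel use, an order of magnitude above the block-coding exponent 0.0294 at the same rate.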

Slides 70-72: The BEC: understanding why?
- With perfect feedback, simple to achieve: retransmit until success.
- Without feedback: send random parities and solve the equations.
- Queuing perspective: deterministic arrivals with independent geometric service times.
- A scheme would behave at least as well if the service times were independent and dominated by a geometric.

Slide 73: So where is this boost coming from?
[Figure: progress (in bits) vs. time (in BEC uses).]

Slides 74-75: Is it feedback, delay, or the combination?
[Figure: the fixed-delay timeline from before, with $d = 7$.]
- Can generally achieve $E_r(R)$ with delay using convolutional codes.
- Pinsker (1967, PPI) claimed that the block exponents continue to govern the non-block case, with and without feedback.

Slides 76-79: Nonblock codes without feedback
[Figure: an infinite binary tree unrolled in time.]
Infinite binary tree with i.i.d. random labels: choose a path through the tree based on the data bits and transmit the path labels through the channel.
- ML decoding: disjoint paths are pairwise independent of the true path, so the $E_r(R)$ analysis applies and future events dominate.
- Can implement with a time-varying random convolutional code.
- Achieves $P_e(d) \le K \exp(-E_r(R)\,d)$ for every delay $d$, for all $R < C$.

Slides 80-85: Convolutional codes, feedback, and computation
- At $R < E_0(1)$, sequential decoding expands only a finite number of nodes on average, but each expansion costs the (growing) constraint length.
- Idea: run a copy of the decoder at the encoder.
- Convolve against $(B_1 + \hat B_1(n)), (B_2 + \hat B_2(n)), \ldots, (B_{n-1} + \hat B_{n-1}(n)), B_n$, where $\hat B_i(n)$ is the decoder's estimate at time $n$.
- Identical distance properties, but only a finite expected number of nonzero terms: infinite-constraint-length performance at a finite price!
- Same probability of error with delay: achieves $E_r(R)$.

Slides 86-88: Pinsker's bounding construction explained
Without feedback, $E^+(R)$ continues to be a bound:
- Consider a code with target delay $d$ and use it to construct a block code with blocksize $n \gg d$.
- Genie-aided decoder: it has the truth of all bits before bit $i$.
- Error events for the genie-aided system depend only on the last $d$ channel uses.
- Apply a change-of-measure argument.
[Figure: a fixed-delay decoder fed the genie bits through a feedforward delay and XORs, driven by a causal encoder over the DMC.]

Slides 89-91: Using $E^+$ to bound $\alpha$ in general
[Figure: a block of $n$ channel uses split into past behavior ($\lambda n$ uses) and future ($(1-\lambda)n = d$ uses); the $\lambda R n$ bits within the deadline count, the rest are ignored.]
- The block error probability is like $e^{-\alpha(1-\lambda)n}$, which cannot exceed the Haroutunian bound $e^{-E^+(\lambda R)n}$.
- Hence $\alpha(R) \le E^+(\lambda R)/(1-\lambda)$.
- The error events involve both the past and the future.

Slides 92-93: Uncertainty-focusing bound for symmetric DMCs
Minimize over $\lambda$ for symmetric DMCs; the frontier is swept out by varying $\rho > 0$. Using the Gallager function $E_0(\rho) = \max_q -\ln \sum_j \big(\sum_i q_i\, p_{ij}^{1/(1+\rho)}\big)^{1+\rho}$:
$R(\rho) = E_0(\rho)/\rho, \qquad E_a^+(\rho) = E_0(\rho).$
Same form as Viterbi's convolutional-coding bound for constraint lengths, but a lot more fundamental!
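
A sketch comparing the two frontiers for a BSC with uniform inputs ($p = 0.1$ is an assumed example): at each rate on the focusing curve $R(\rho) = E_0(\rho)/\rho$, the fixed-delay exponent $E_0(\rho)$ strictly dominates the block sphere-packing exponent at the same rate.

```python
import numpy as np

def E0(rho, p=0.1):   # Gallager E0 for the BSC(p) with uniform inputs (nats)
    inner = 0.5 * (p ** (1 / (1 + rho)) + (1 - p) ** (1 / (1 + rho)))
    return -np.log(2.0 * inner ** (1 + rho))

rho_grid = np.linspace(1e-3, 10, 4000)
e0_grid = np.array([E0(r) for r in rho_grid])
for rho in (0.5, 1.0, 2.0, 4.0):
    R = E0(rho) / rho                       # rate on the focusing-bound frontier
    E_focus = E0(rho)                       # fixed-delay exponent with feedback
    E_sp = np.max(e0_grid - rho_grid * R)   # block sphere-packing at the same rate
    print(f"R = {R:.3f} nats: focusing bound {E_focus:.3f} vs sphere-packing {E_sp:.3f}")
```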

Slide 94: Upper bound tight for the BEC with feedback
[Figure: error exponent (base 2) vs. rate (in bits).]

Slide 95: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Block-oriented results
   - Idealized case: fixed-delay with hard deadlines
     - The BEC example
     - Without feedback
     - The focusing bound
     - The source-coding analogy
     - Approaching the focusing bound (with help)
     - No help and universality
   - Idealized case: fixed-delay with soft deadlines
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slide 96: The source coding problem
[Figure: source $\{X_t\}$ → encoder → encoded bitstream at fixed rate $R$ → decoder → $\{\hat X_t\}$.]
- Assume $\{X_t\}$ i.i.d.
- Application-level interface: symbol error probability $P_e = P(X_t \ne \hat X_t)$ and end-to-end latency $d$ (measured on the source timescale).
- Channel-code interface: fixed rate $R$ (assumed noiseless).

Slides 97-98: Using $E_b$ to bound $E_s$ in general
[Figure: the stream of source symbols split into past behavior and future; the symbols within the deadline count, the rest are ignored.]
- The error probability is bounded by $K \exp(-d\,E_s(R))$, which cannot beat the block-coding bound of the induced block code; optimizing the window split gives $E_s(R) \le \inf_{\beta > 0} E_b((1+\beta)R)/\beta$.
- Only the past matters!

Slides 99-103: Achieving the focusing bound
$R(\rho) = E_0(\rho)/\rho, \qquad E_s(\rho) = E_0(\rho).$
[Figure: fixed-rate source → variable-length code → FIFO queue → fixed-rate bitstream.]
- Pick a miniblock length $n$ large enough, but small relative to $d$.
- Variable-length codes turn into variable delay at the receiver.
- Queuing delay dominates.
- $1/R$ translates queue length into delay.

Slide 104: A simple example, related to Jelinek '68
Unfair coin tosses.
[Figure: reliability vs. rate.]
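
The two curves are easy to trace numerically. The slide's $P(H)$ did not survive transcription, so $p = 0.1$ below is an assumed stand-in purely for illustration; $E_0$ here is Gallager's source function.

```python
import numpy as np

p = 0.1                      # assumed P(H); the slide's value was lost
P = np.array([p, 1 - p])

def E0(rho):   # Gallager source function: (1+rho) ln sum_x P(x)^{1/(1+rho)}
    return (1 + rho) * np.log(np.sum(P ** (1 / (1 + rho))))

H = float(-np.sum(P * np.log(P)))
s_grid = np.linspace(1e-3, 25, 5000)
e0_grid = np.array([E0(s) for s in s_grid])
for rho in (0.5, 1.0, 2.0, 4.0):
    R = E0(rho) / rho                       # rate on the delay frontier (> H)
    E_delay = E0(rho)                       # fixed-delay exponent E_s
    E_block = np.max(s_grid * R - e0_grid)  # block exponent E_b(R) at the same rate
    print(f"R = {R:.3f} nats (H = {H:.3f}): E_delay = {E_delay:.3f}, E_block = {E_block:.3f}")
```

At every rate above entropy, the delay exponent sits strictly above the block exponent: the source-coding analog of the channel focusing gain.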

Slide 105: How far till Asymptopia?
[Figure: symbol error probability vs. end-to-end delay for block coding, causal random coding, and simple prefix coding.]
A simple example with a ternary source and a specific code ($n = 2$).

Slides 106-108: Side-information at the decoder
[Figure: $X_1, X_2, \ldots$ into encoder E at rate $R$, into decoder D, which also observes $Y_1, Y_2, \ldots$, with $(X_i, Y_i) \sim p_{XY}$.]
- The encoder may be ignorant of the side-information (Slepian-Wolf).
- If $p_{X,Y}$ is symmetric with uniform marginals, one can do no better than $E_b(R)$ with delay (analogous to Pinsker's argument).
[Figure: reliability vs. rate, and the ratio of the exponents vs. rate.]

Slide 109: Intuition: long vs. large deviations
[Figure: past and future deviation slopes around the typical (entropy) slope, setting the error height at a given rate.]
- Shorter deviation periods must be larger deviations.
- Smaller deviations must be sustained over longer periods.

Slide 110: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Block-oriented results
   - Idealized case: fixed-delay with hard deadlines
     - The BEC example
     - Without feedback
     - The focusing bound
     - The source-coding analogy
     - Approaching the focusing bound (with help)
     - No help and universality
   - Idealized case: fixed-delay with soft deadlines
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slide 111: A spoonful of sugar helps the bits get across
[Figure: the original forward DMC channel uses, fortified with sparse noiseless forward side-channel uses $S_1, S_2, \ldots, S_6$; error exponent (base e) vs. rate (in nats).]

Slide 112: Harnessing the power of flow control
[Figure: forward DMC channel uses carrying one chunk after another; the noiseless forward side channel carries deny/confirm plus disambiguation of the previous block.]
1. Group bits into miniblocks of size $nR$ ($n \ll d$).
2. Transmit using an $\infty$-length random codebook.
3. Use the sugar to tell the decoder when it's done.
No decoding errors, just queuing plus transmission delays.

Slides 113-114: Why this works: an operational interpretation of $E_0(\rho)$
- The variable block transmission time $T$ can be bounded by a constant plus a geometric random variable.
[Figure: error exponent (base e) vs. rate (in nats).]
- Need to do list-decoding at low rates.

Slide 115: Reduces to the low-rate erasure case
Pick $R < R' < C$ and aim for the exponent $E^+(R') = E_0(\rho')$.
[Figure: error exponent (base e) vs. rate (in nats).]
If $n$ is large, the effective point-message rate is small.

Slide 116: But the low-rate erasure exponent is $\log(1/\delta)$
[Figure: error exponent (base 2) vs. rate (in bits).]
Here $\log(1/\delta) = E_0(\rho')$ in our context.

Slide 117: Approaching the focusing bound
[Figure: error exponent (base e) vs. rate (in nats) for several parameter choices, approaching the focusing bound.]

Slide 118: The dominant error events: past vs. future
[Figure: the ratio (in dB) of future to past in the dominant error event vs. rate (in nats): future behavior dominates at low rates, past behavior dominates at high rates.]

Slide 119: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Block-oriented results
   - Idealized case: fixed-delay with hard deadlines
     - The BEC example
     - Without feedback
     - The focusing bound
     - The source-coding analogy
     - Approaching the focusing bound (with help)
     - No help and universality
   - Idealized case: fixed-delay with soft deadlines
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slide 120: Channels with positive zero-error capacity
[Figure: channel uses split per chunk into $(1-\theta)c$ data uses (random increments) and $\theta c$ flow-control uses (deny / confirm + disambiguation), using $(l+1)$-bit zero-error feedback block codes.]
- Throw away a fraction $\theta$ of channel uses for flow-control overhead.
- Asymptotically achieves the focusing bound as $\theta \to 0$.

Slide 121: Can do well even without sugar
Time-share flow control and data, and optimize the fraction $\theta$ devoted to flow control.
[Figure: error exponent (base e) vs. rate (in nats).]

Slide 122: Optimize the fraction $\theta$ for flow control
[Figure: chunks of $(1-\theta)c$ data uses and $\theta c$ flow-control uses (random increment / deny / confirm + disambiguation).]
- Flow control encoded with an $\infty$-length convolutional code: a low-rate feedback anytime code.
- Flow-control exponent $\theta E_0(1)$; data exponent $(1-\theta)E_0(\rho)$; data rate $(1-\theta)E_0(\rho)/\rho$.
- Balancing the two exponents gives
$\theta = \frac{E_0(\rho)}{E_0(1) + E_0(\rho)}, \qquad E(\rho) = \frac{E_0(\rho)\,E_0(1)}{E_0(\rho) + E_0(1)},$
to be compared with the focusing-bound frontier $R(\rho) = E_0(\rho)/\rho$, $E(\rho) = E_0(\rho)$.
[Figure: error exponent (base e) vs. rate (in nats).]
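
A sketch evaluating the slide's formulas on an assumed BSC(0.1) with uniform inputs, to see how far the no-sugar time-shared scheme sits below the focusing frontier:

```python
import numpy as np

def E0(rho, p=0.1):   # Gallager E0 for a BSC(p) with uniform inputs (nats)
    inner = 0.5 * (p ** (1 / (1 + rho)) + (1 - p) ** (1 / (1 + rho)))
    return -np.log(2.0 * inner ** (1 + rho))

e1 = E0(1.0)
for rho in (0.5, 1.0, 2.0, 4.0):
    e = E0(rho)
    theta = e / (e1 + e)            # fraction of uses spent on flow control
    R = (1 - theta) * e / rho       # data rate of the time-shared scheme
    E_ts = e * e1 / (e + e1)        # its exponent (= theta*E0(1) = (1-theta)*E0(rho))
    print(f"rho={rho}: time-shared (R, E) = ({R:.3f}, {E_ts:.3f}); "
          f"focusing frontier (R, E) = ({e / rho:.3f}, {e:.3f})")
```

The balancing choice of $\theta$ is exactly the point where the flow-control and data exponents are equal, which is why both closed forms above agree.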

Slides 123-127: Dealing with channel uncertainty
Compound DMC: fix the input distribution and assume only $I(X;Y) \ge C$ is known; this is a non-convex set of channels.
- Block case: MMI decoding achieves $E_r(R)$ for whatever channel is there.
- But how big is this guaranteed to be?
- Universal bound (Gallager '71 and Lapidoth/Telatar '98): $E_r(R) \ge \dfrac{(C-R)^2}{8/e^2 + 4[\ln |\mathcal{Y}|]^2}$.
- This can be translated to delay and the focusing bound.
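
The universal bound is trivial to evaluate; $C$ and $|\mathcal{Y}|$ below are assumed placeholder values, not numbers from the tutorial.

```python
from math import e, log

def universal_Er_lower(C, R, Y):
    # Gallager '71 / Lapidoth-Telatar '98 bound: (C-R)^2 / (8/e^2 + 4 ln^2|Y|)
    return (C - R) ** 2 / (8 / e ** 2 + 4 * log(Y) ** 2)

C, Y = 0.5, 2     # assumed compound capacity (nats) and output alphabet size
for R in (0.1, 0.25, 0.4, 0.45):
    print(f"R = {R:.2f} nats: guaranteed E_r(R) >= {universal_Er_lower(C, R, Y):.5f} nats")
```

Note the quadratic behavior in $(C - R)$: the guarantee vanishes smoothly, not abruptly, as the rate approaches the worst-case capacity.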

Slides 128-130: Universal bounds illustrated
[Figures: error exponent (base e) vs. rate (in nats), illustrating the universal bounds.]

Slides 131-134: Hard-deadline coding: final comments
- For known channels, the computation per channel use does not depend on the probability of error, i.e., an infinite computational exponent.
- The code is "anytime" in that it is delay-universal: the application can pick whatever latency is desired.
- Queuing delay dominates at all rates.
- Even with unknown channels, we get a strictly positive slope at capacity.

Slide 135: Outline
1. Motivation and background
2. Feedback, delay, and error probability: memoryless channels
   - Block-oriented results
   - Idealized case: fixed-delay with hard deadlines
   - Idealized case: fixed-delay with soft deadlines
     - Kudryashov's scheme
     - The hallucination bound
   - Unreliable/noisy feedback
3. Feedback and memory
4. Conclusions and open problems

Slides 136-138: A streaming-data perspective on soft deadlines
[Figure: application → data stream → packetizer → queue → Tx → channel → Rx → application, with a feedback channel; the end-to-end delay is marked.]
- The queue is optional: it is needed only if retransmissions are required.
- If retransmissions are rare, the expected end-to-end delay is dominated by the Tx-to-Rx delay.

Slides 139-142: An opportunity presents itself
[Figure: data blocks with ACKs, one decoding error, and a NAK.]
- Erasures are rare, so most messages are confirmed: we are wasting channel uses.
- What if we only sent NAKs when needed? Have a special message for this purpose.

Slides 143-145: Sliding blocks with collective punishment only (Kudryashov '79)
[Figure: data blocks with a fixed-length emergency message; the final decision on each message comes one decision delay later, unless a NAK is detected.]
- Make the packet size $n$ much smaller than the soft deadline $d$.
- A NAK collectively denies the past $d/n$ packets.
- An error occurs only if $d/n$ NAKs are all missed.

Slides 146-147: Unequal error protection required in the codebook
[Figure: a typical noise sphere around the NAK codeword, protected more heavily than the data codewords.]
Hypothesis-testing threshold: is it a NAK or is it data?

Slides 148-149: Specializing to the BSC case
[Figure: data blocks and a NAK covering the previous blocks; input compositions $0$ and $q$ map under the BSC($p$) to output compositions $p$ and $q(1-p) + (1-q)p$, separated by a gap below $0.5$.]
- Use the all-zero codeword for the NAK.
- Use a composition-$q$ code for the data: $R < H(q) - H(p)$.


More information

On the Error Exponents of ARQ Channels with Deadlines

On the Error Exponents of ARQ Channels with Deadlines On the Error Exponents of ARQ Channels with Deadlines Praveen Kumar Gopala, Young-Han Nam and Hesham El Gamal arxiv:cs/06006v [cs.it] 8 Oct 2006 March 22, 208 Abstract We consider communication over Automatic

More information

Second-Order Asymptotics in Information Theory

Second-Order Asymptotics in Information Theory Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015

More information

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Sparse Regression Codes for Multi-terminal Source and Channel Coding

Sparse Regression Codes for Multi-terminal Source and Channel Coding Sparse Regression Codes for Multi-terminal Source and Channel Coding Ramji Venkataramanan Yale University Sekhar Tatikonda Allerton 2012 1 / 20 Compression with Side-Information X Encoder Rate R Decoder

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

The PPM Poisson Channel: Finite-Length Bounds and Code Design

The PPM Poisson Channel: Finite-Length Bounds and Code Design August 21, 2014 The PPM Poisson Channel: Finite-Length Bounds and Code Design Flavio Zabini DEI - University of Bologna and Institute for Communications and Navigation German Aerospace Center (DLR) Balazs

More information

Arimoto Channel Coding Converse and Rényi Divergence

Arimoto Channel Coding Converse and Rényi Divergence Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code

More information

for some error exponent E( R) as a function R,

for some error exponent E( R) as a function R, . Capacity-achieving codes via Forney concatenation Shannon s Noisy Channel Theorem assures us the existence of capacity-achieving codes. However, exhaustive search for the code has double-exponential

More information

6.02 Fall 2011 Lecture #9

6.02 Fall 2011 Lecture #9 6.02 Fall 2011 Lecture #9 Claude E. Shannon Mutual information Channel capacity Transmission at rates up to channel capacity, and with asymptotically zero error 6.02 Fall 2011 Lecture 9, Slide #1 First

More information

Chapter 1 Elements of Information Theory for Networked Control Systems

Chapter 1 Elements of Information Theory for Networked Control Systems Chapter 1 Elements of Information Theory for Networked Control Systems Massimo Franceschetti and Paolo Minero 1.1 Introduction Next generation cyber-physical systems [35] will integrate computing, communication,

More information

Universal Incremental Slepian-Wolf Coding

Universal Incremental Slepian-Wolf Coding Proceedings of the 43rd annual Allerton Conference, Monticello, IL, September 2004 Universal Incremental Slepian-Wolf Coding Stark C. Draper University of California, Berkeley Berkeley, CA, 94720 USA sdraper@eecs.berkeley.edu

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information

(Classical) Information Theory III: Noisy channel coding

(Classical) Information Theory III: Noisy channel coding (Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way

More information

Lecture 15: Conditional and Joint Typicaility

Lecture 15: Conditional and Joint Typicaility EE376A Information Theory Lecture 1-02/26/2015 Lecture 15: Conditional and Joint Typicaility Lecturer: Kartik Venkat Scribe: Max Zimet, Brian Wai, Sepehr Nezami 1 Notation We always write a sequence of

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

18.2 Continuous Alphabet (discrete-time, memoryless) Channel

18.2 Continuous Alphabet (discrete-time, memoryless) Channel 0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

LECTURE 13. Last time: Lecture outline

LECTURE 13. Last time: Lecture outline LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to

More information

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes EE229B - Final Project Capacity-Approaching Low-Density Parity-Check Codes Pierre Garrigues EECS department, UC Berkeley garrigue@eecs.berkeley.edu May 13, 2005 Abstract The class of low-density parity-check

More information

Lecture 14 February 28

Lecture 14 February 28 EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables

More information

Intermittent Communication

Intermittent Communication Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication

More information

Solutions to Homework Set #3 Channel and Source coding

Solutions to Homework Set #3 Channel and Source coding Solutions to Homework Set #3 Channel and Source coding. Rates (a) Channels coding Rate: Assuming you are sending 4 different messages using usages of a channel. What is the rate (in bits per channel use)

More information

6.02 Fall 2012 Lecture #1

6.02 Fall 2012 Lecture #1 6.02 Fall 2012 Lecture #1 Digital vs. analog communication The birth of modern digital communication Information and entropy Codes, Huffman coding 6.02 Fall 2012 Lecture 1, Slide #1 6.02 Fall 2012 Lecture

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Chapter 9 Fundamental Limits in Information Theory

Chapter 9 Fundamental Limits in Information Theory Chapter 9 Fundamental Limits in Information Theory Information Theory is the fundamental theory behind information manipulation, including data compression and data transmission. 9.1 Introduction o For

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Can Feedback Increase the Capacity of the Energy Harvesting Channel?

Can Feedback Increase the Capacity of the Energy Harvesting Channel? Can Feedback Increase the Capacity of the Energy Harvesting Channel? Dor Shaviv EE Dept., Stanford University shaviv@stanford.edu Ayfer Özgür EE Dept., Stanford University aozgur@stanford.edu Haim Permuter

More information

LECTURE 10. Last time: Lecture outline

LECTURE 10. Last time: Lecture outline LECTURE 10 Joint AEP Coding Theorem Last time: Error Exponents Lecture outline Strong Coding Theorem Reading: Gallager, Chapter 5. Review Joint AEP A ( ɛ n) (X) A ( ɛ n) (Y ) vs. A ( ɛ n) (X, Y ) 2 nh(x)

More information

Towards control over fading channels

Towards control over fading channels Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology

More information

Source coding and channel requirements for unstable processes. Anant Sahai, Sanjoy Mitter

Source coding and channel requirements for unstable processes. Anant Sahai, Sanjoy Mitter Source coding and channel requirements for unstable processes Anant Sahai, Sanjoy Mitter sahai@eecs.berkeley.edu, mitter@mit.edu Abstract Our understanding of information in systems has been based on the

More information

Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Nonstationary/Unstable Linear Systems

Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Nonstationary/Unstable Linear Systems 6332 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 58, NO 10, OCTOBER 2012 Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Nonstationary/Unstable Linear

More information

Physical Layer and Coding

Physical Layer and Coding Physical Layer and Coding Muriel Médard Professor EECS Overview A variety of physical media: copper, free space, optical fiber Unified way of addressing signals at the input and the output of these media:

More information

Introduction to Convolutional Codes, Part 1

Introduction to Convolutional Codes, Part 1 Introduction to Convolutional Codes, Part 1 Frans M.J. Willems, Eindhoven University of Technology September 29, 2009 Elias, Father of Coding Theory Textbook Encoder Encoder Properties Systematic Codes

More information

Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem

Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem Hye Won Chung, Brian M. Sadler, Lizhong Zheng and Alfred O. Hero arxiv:606.09233v2 [cs.it] 28 Sep 207 Abstract In this paper,

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR

The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract

More information

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying

A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,

More information

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

Channel combining and splitting for cutoff rate improvement

Channel combining and splitting for cutoff rate improvement Channel combining and splitting for cutoff rate improvement Erdal Arıkan Electrical-Electronics Engineering Department Bilkent University, Ankara, 68, Turkey Email: arikan@eebilkentedutr arxiv:cs/5834v

More information

ECE Information theory Final

ECE Information theory Final ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the

More information

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014 Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2

More information

Approximately achieving the feedback interference channel capacity with point-to-point codes

Approximately achieving the feedback interference channel capacity with point-to-point codes Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used

More information

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference The and The Binary Dirty Multiple Access Channel with Common Interference Dept. EE - Systems, Tel Aviv University, Tel Aviv, Israel April 25th, 2010 M.Sc. Presentation The B/G Model Compound CSI Smart

More information

Variable Length Codes for Degraded Broadcast Channels

Variable Length Codes for Degraded Broadcast Channels Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates

More information

ECEN 655: Advanced Channel Coding

ECEN 655: Advanced Channel Coding ECEN 655: Advanced Channel Coding Course Introduction Henry D. Pfister Department of Electrical and Computer Engineering Texas A&M University ECEN 655: Advanced Channel Coding 1 / 19 Outline 1 History

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Lecture 7 September 24

Lecture 7 September 24 EECS 11: Coding for Digital Communication and Beyond Fall 013 Lecture 7 September 4 Lecturer: Anant Sahai Scribe: Ankush Gupta 7.1 Overview This lecture introduces affine and linear codes. Orthogonal signalling

More information

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:

More information

ELEMENT OF INFORMATION THEORY

ELEMENT OF INFORMATION THEORY History Table of Content ELEMENT OF INFORMATION THEORY O. Le Meur olemeur@irisa.fr Univ. of Rennes 1 http://www.irisa.fr/temics/staff/lemeur/ October 2010 1 History Table of Content VERSION: 2009-2010:

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

A Simple Converse of Burnashev's Reliability Function

A Simple Converse of Burnashev's Reliability Function A Simple Converse of Burnashev's Reliability Function The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published Publisher

More information

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where

More information

AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN

AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN A Thesis Presented to The Academic Faculty by Bryan Larish In Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy

More information

Lecture 12. Block Diagram

Lecture 12. Block Diagram Lecture 12 Goals Be able to encode using a linear block code Be able to decode a linear block code received over a binary symmetric channel or an additive white Gaussian channel XII-1 Block Diagram Data

More information

Lecture 22: Final Review

Lecture 22: Final Review Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information

More information

CS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

Reliable Computation over Multiple-Access Channels

Reliable Computation over Multiple-Access Channels Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,

More information

Recent Results on Input-Constrained Erasure Channels

Recent Results on Input-Constrained Erasure Channels Recent Results on Input-Constrained Erasure Channels A Case Study for Markov Approximation July, 2017@NUS Memoryless Channels Memoryless Channels Channel transitions are characterized by time-invariant

More information

APPLICATIONS. Quantum Communications

APPLICATIONS. Quantum Communications SOFT PROCESSING TECHNIQUES FOR QUANTUM KEY DISTRIBUTION APPLICATIONS Marina Mondin January 27, 2012 Quantum Communications In the past decades, the key to improving computer performance has been the reduction

More information

EE 4TM4: Digital Communications II. Channel Capacity

EE 4TM4: Digital Communications II. Channel Capacity EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.

More information

Homework Set #2 Data Compression, Huffman code and AEP

Homework Set #2 Data Compression, Huffman code and AEP Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code

More information