The connection between information theory and networked control

1 The connection between information theory and networked control
Anant Sahai, based in part on joint work with students: Tunc Simsek, Hari Palaiyanur, and Pulkit Grover
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, University of California at Berkeley
Tutorial Seminar at the Global COE Workshop on Networked Control Systems, Kyoto University, March 26

2 Networked Control Systems
[Block diagram: controllers, systems, sensors, relays, and users connected by communication links]
Systems, sensors, and users connected with network links over noisy channels.
Signals evolve in real time, and the communication links carry ongoing and interacting streams of information.
Holistic approach: overall cost function.

3 Ho, Kastner, and Wong (1978)
"... sporadic and not too successful attempts have been made to relate Shannon's information theory with feedback control system design."

7 Shannon tells us
Separate source and channel coding. But delay is the price of reliability.
"[The duality between source and channel coding] can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past and cannot control it; we may control the future but have no knowledge of it." — Claude Shannon, 1959
What is this relationship, since delays hurt control?

8 Outline
1. A bridge to nowhere? From control to information theory.
   A simple control problem
   A connection to information theory
   Fixing information theory and filling in the gaps
2. Coming back to the control problem
   What is wrong with random coding
   The role of noiseless feedback
3. Taking control thinking to the forefront of information theory.
   The holy grail problem
   Control thinking to the rescue!

10 A simple distributed control problem
[Block diagram: unstable system driven by disturbance W_t; a designed observer O (with possible control knowledge) sends over a fortified channel to a designed controller C (with possible channel feedback), whose control signals U_t reach the system through a 1-step delay]
X_{t+1} = λ X_t + U_t + W_t
Unstable: λ > 1, with bounded initial condition and bounded disturbance W.
Goal: Performance = sup_{t>0} E[|X_t|^η] ≤ K for some target K < ∞.

11 Review: entirely noiseless channel
At time t, a window is known to contain X_t. Left alone, it will grow by a factor of λ > 1, and by Ω/2 on each side from the disturbance.
Sending R bits cuts the window by a factor of 2^R and encodes which control U_t to apply, giving a new window for X_{t+1}.
As long as R > log_2 λ, the window can stay bounded forever.
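A quick way to see the argument is to simulate it. Here is a minimal sketch, with illustrative parameter values (λ = 1.5, R = 2, Ω = 1) that are my assumptions, not the talk's: observer and controller share the current window, the observer sends the R-bit index of the sub-window containing X_t, and the controller cancels that sub-window's midpoint.

```python
# Stabilizing X_{t+1} = lam*X_t + U_t + W_t over an R-bit noiseless channel
# by window quantization. The window shrinks by 2^R each step, grows by lam,
# and widens by Omega/2 on each side, so it stays bounded iff R > log2(lam).
import random

lam, R, Omega, T = 1.5, 2, 1.0, 200      # illustrative; need R > log2(1.5) ~ 0.58
lo, hi = -1.0, 1.0                       # window known to contain X_0
x = random.uniform(lo, hi)
for t in range(T):
    bins = 2 ** R
    idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)  # the R bits sent
    width = (hi - lo) / bins
    mid = lo + (idx + 0.5) * width
    u = -lam * mid                        # cancel the sub-window's midpoint
    x = lam * x + u + random.uniform(-Omega / 2, Omega / 2)
    half = lam * width / 2 + Omega / 2    # both sides update the shared window
    lo, hi = -half, half                  # re-centered at 0 after the control
    assert lo <= x <= hi
print("final window half-width:", (hi - lo) / 2)
```

The half-width recursion is half ← (λ/2^R)·half + Ω/2, which converges to a fixed point exactly when λ/2^R < 1, i.e. R > log_2 λ.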

18 The separation-principle oriented program
Use entropy and mutual information.
  Tatikonda's insight: directed mutual information captures causality.
Write out entropic inequalities.
  Key inequality: the directed data-processing inequality.
Set up a mapping between bits and performance.
  You probably don't care about the entropy of the state.
Lower bound performance assuming nested information.
  Equivalent to estimation.
Rate-distortion theory can be developed.
Get tight upper bounds and architectures?

19 The rate-distortion part
[Plot: rate (in bits) vs squared-error distortion, comparing "sequential" rate distortion (obeys causality), the rate-distortion curve (non-causal), and its stable counterpart (non-causal)]

23 How bad can entropic bounds be?
Consider a system with λ = 2 for the dynamics and a real packet-drop channel (C = ∞):
Z_t = Y_t with probability 1/2, and Z_t = 0 (dropped) with probability 1/2.
No other constraints, so the design is obvious: Y_t = X_t and U_t = -λ Z_t. Then
X_{t+1} = W_t with probability 1/2, and X_{t+1} = 2 X_t + W_t with probability 1/2.
Under stochastic disturbances, the variance of the state is asymptotically infinite (St. Petersburg lottery style).
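The St. Petersburg flavor is easy to check numerically. A sketch, assuming W_t ~ Uniform[-1/2, 1/2] (my choice of disturbance, not the talk's): the exact recursion E[X_{t+1}^2] = 2 E[X_t^2] + E[W^2] doubles every step, while a Monte Carlo average lags far behind it because the moment is dominated by exponentially rare long runs of drops.

```python
# X_{t+1} = W_t w.p. 1/2 (packet delivered, controller cancels 2*X_t),
# else 2*X_t + W_t (packet dropped). The state is bounded in probability,
# yet its second moment diverges.
import random

def exact_second_moment(T, ew2=1.0 / 12):        # E[W^2] for Uniform[-1/2, 1/2]
    m = 0.0
    for _ in range(T):
        m = 2 * m + ew2                          # E[X_{t+1}^2] = 2 E[X_t^2] + E[W^2]
    return m

def simulated_second_moment(T, runs=20000):
    total = 0.0
    for _ in range(runs):
        x = 0.0
        for _ in range(T):
            w = random.uniform(-0.5, 0.5)
            x = w if random.random() < 0.5 else 2 * x + w
        total += x * x
    return total / runs

for T in (5, 10, 15, 20):
    print(T, exact_second_moment(T), simulated_second_moment(T))
# The empirical average increasingly underestimates the true moment: the
# rare all-drop runs that dominate E[X_T^2] are almost never sampled.
```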

27 Delay-universal (anytime) communication
[Timeline: message bits B_1, B_2, ... arrive steadily; channel inputs Y_t and outputs Z_t evolve in real time; each estimate B̂_i is due a fixed delay d = 7 after its bit arrives]
Fixed-delay reliability α is achievable if there exists a sequence of encoder/decoder pairs with increasing end-to-end delays d_j such that lim_{j→∞} -(1/d_j) ln P(B_i ≠ B̂_i) = α.
α is achievable delay-universally, or in an anytime fashion, if a single encoder works for all sufficiently large delays d.
The anytime capacity C_any(α) is the supremal rate at which reliability α is achievable in a delay-universal way.

30 Separation theorem for scalar control
Necessity: If a scalar system with parameter λ > 1 can be stabilized with finite η-moment across a noisy channel, then the channel with noiseless feedback must have C_any(η ln λ) ≥ ln λ.
  In general: if P(|X| > m) < f(m), then there exists a K such that P_error(d) < f(K λ^d) is achievable.
Sufficiency: If there is an α > η ln λ for which the channel with noiseless feedback has C_any(α) > ln λ, then the scalar system with parameter λ > 1 and a bounded disturbance can be stabilized across the noisy channel with finite η-moment.
Captures stabilization only.

34 Some easy implications
If we want P(|X_t| > m) ≤ f(m) = 0 for some finite m, we require zero-error reliability across the channel. Zero-error reliability is also required (for DMCs) if we want the controller to have finite memory.
For generic DMCs, anytime reliability with feedback is upper-bounded: f(K λ^d) ≥ ζ^d for some ζ < 1, which forces f(m) ≥ K' m^{-log_2(1/ζ)/log_2 λ}. A controlled state can have at best a power-law tail.
If we just want lim_{m→∞} f(m) = 0, then only Shannon capacity > log_2 λ is required for DMCs.
Almost-sure stabilization for W_t = 0 follows by a simple time-varying transformation.

37 Asymptotic communication problem hierarchy
The easiest level: Shannon communication. Asymptotically a single figure of merit, C. Equivalent to most estimation problems for stationary ergodic processes with bounded distortion measures. Feedback does not matter.
Intermediate families: anytime communication. Multiple figures of merit: (R, α). The feedback case is equivalent to stabilization problems, and related nonstationary estimation problems fall here also. Does feedback matter?
The hardest level: zero-error communication. A single figure of merit, C_0. Feedback matters.

40 My favorite example: the BEC
[Channel diagram: inputs 0 and 1 are erased to e with probability δ and received correctly with probability 1 - δ]
Simple capacity: 1 - δ bits per channel use.
With perfect feedback, simple to achieve: retransmit until it gets through. The time until success is Geometric(1 - δ), so the expected time to get through is 1/(1 - δ).
[Plot: error exponent (base 2) vs rate (in bits)] Classical bounds: the sphere-packing bound D(1 - R ‖ δ) and the random-coding bound max_{ρ∈[0,1]} E_0(ρ) - ρR.
What happens with feedback?
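Both classical bounds are easy to evaluate for the BEC. A numerical sketch with a uniform input and an illustrative operating point (δ = 0.4, R = 1/2, my choices); the last line previews the fixed-delay feedback exponent derived on the next slides.

```python
# Fixed-block exponents for the BEC: sphere-packing D(1-R || delta) and
# random coding max_rho E0(rho) - rho*R, with E0 specialized to the BEC.
import math

def d_bin(p, q):                                  # binary KL divergence, bits
    return p * math.log2(p / q) + (1 - p) * math.log2((1 - p) / (1 - q))

def e0_bec(rho, delta):                           # Gallager E0, uniform input
    return -math.log2(2 ** (-rho) * (1 - delta) + delta)

def random_coding(R, delta, grid=10000):
    return max(e0_bec(k / grid, delta) - (k / grid) * R for k in range(1, grid + 1))

delta, R = 0.4, 0.5                               # capacity is 1 - delta = 0.6
print("sphere-packing :", d_bin(1 - R, delta))    # ~0.029 bits per use
print("random coding  :", random_coding(R, delta))
print("fixed delay, fb:", math.log2((1 - delta) / delta))   # ~0.585: far larger
```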

44 BEC with feedback and fixed blocks
At rate R < 1, we have Rn bits to transmit in n channel uses; typically (1 - δ)n code bits will be received.
Block errors are caused by atypical channel behavior: we are doomed if fewer than Rn bits arrive intact, and feedback cannot save us. The exponent is D(1 - R ‖ δ).
Dobrushin (1962) showed that this type of behavior is common: E_+(R) = E_sp(R) for symmetric channels.

48 BEC with feedback and fixed delay
R = 1/2 example: [Birth-death chain on the queue length, moving up with probability δ² and down with probability (1 - δ)² per two channel uses] The chain is positive recurrent if δ < 1/2.
The delay exponent is easy to see: P(D ≥ d) = P(L > d/2) ≈ K (δ/(1 - δ))^d, an exponent of log_2((1 - δ)/δ) ≈ 0.58, versus the sphere-packing exponent D(1/2 ‖ δ) ≈ 0.03 for block-coding with δ = 0.4.
Block-coding is misleading!
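The birth-death picture can be checked by simulating the retransmission scheme as a FIFO queue: one fresh bit joins every two channel uses, and each use delivers the head-of-line bit with probability 1 - δ. A sketch; the run length and delay thresholds are arbitrary choices.

```python
# Estimating the fixed-delay exponent of the R = 1/2 repeat-until-delivered
# scheme; it should match log2((1-delta)/delta) from the birth-death chain.
import math, random
from collections import deque

delta, uses = 0.4, 4_000_000
queue, delays = deque(), []
for t in range(uses):
    if t % 2 == 0:
        queue.append(t)                       # one new bit per 2 uses: R = 1/2
    if queue and random.random() > delta:     # erased w.p. delta, else delivered
        delays.append(t - queue.popleft())
d1, d2 = 10, 20
p1 = sum(d >= d1 for d in delays) / len(delays)
p2 = sum(d >= d2 for d in delays) / len(delays)
print("empirical exponent:", -math.log2(p2 / p1) / (d2 - d1))
print("predicted exponent:", math.log2((1 - delta) / delta))   # ~0.585
```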

51 Pinsker's bounding construction explained
Without feedback, E_+(R) continues to be a bound:
Consider a code with target delay d.
Use it to construct a block code with blocksize n ≫ d.
Genie-aided decoder: it is handed the truth of all bits before bit i, so error events for the genie-aided system depend only on the last ~d channel uses.
Apply a change-of-measure argument.
[Block diagram: causal encoder into a DMC; fixed-delay decoders with genie feedforward of past bits, XORed together so that bit B_{(t-d)R} is recovered at time t]

54 Using E_sp to bound α in general
[Timeline: the first λn channel uses are "past" behavior; the remaining (1 - λ)n ≈ d uses are the future; λRn bits must be decoded within the deadline, and later bits are ignored]
The block error probability is like exp(-α(1 - λ)n), and its exponent cannot exceed the Haroutunian bound exp(-E_sp(λR)n). Hence
α(R) ≤ E_sp(λR) / (1 - λ).
The error events involve both the past and the future.

56 Uncertainty-focusing bound for symmetric DMCs
For symmetric DMCs, minimize over λ to sweep out the frontier by varying ρ > 0:
R(ρ) = E_0(ρ)/ρ,  E_a^+(ρ) = E_0(ρ),
using the Gallager function E_0(ρ) = max_q -ln Σ_j (Σ_i q_i p_{ij}^{1/(1+ρ)})^{1+ρ}.
Same form as Viterbi's convolutional-coding bound for constraint lengths, but a lot more fundamental!
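For the BEC the frontier can be swept numerically. A sketch with a uniform input (optimal for this symmetric channel); note that at R = 1/2 the frontier exponent reproduces the birth-death value log_2((1 - δ)/δ) from the fixed-delay slide, which is why the bound turns out to be tight.

```python
# Sweeping the uncertainty-focusing frontier (R, E_a) = (E0(rho)/rho, E0(rho)).
import math

def e0_bec(rho, delta):                       # bits, uniform input
    return -math.log2(2 ** (-rho) * (1 - delta) + delta)

def frontier_exponent(R, delta):
    lo, hi = 1e-6, 50.0                       # E0(rho)/rho is decreasing in rho
    for _ in range(100):                      # bisect for E0(rho)/rho = R
        mid = (lo + hi) / 2
        if e0_bec(mid, delta) / mid > R:
            lo = mid
        else:
            hi = mid
    return e0_bec(lo, delta)

delta = 0.4
for rho in (0.25, 0.5, 1.0, 2.0, 4.0):
    e0 = e0_bec(rho, delta)
    print(f"rho={rho}: R={e0 / rho:.3f} bits, anytime exponent={e0:.3f}")
print(frontier_exponent(0.5, delta), math.log2((1 - delta) / delta))  # both ~0.585
```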

57 Upper bound tight for the BEC with feedback
[Plot: error exponent (base 2) vs rate (in bits), showing the feedback scheme meeting the uncertainty-focusing bound]

58 Implications for scalar moment stabilization
[Plot: error exponent (base e) vs rate (in nats)]

59 Implications for scalar moment stabilization
[Plot: moments stabilized vs open-loop unstable gain]

60 Outline
1. A bridge to nowhere?
   A simple control problem
   A connection to information theory
   Fixing information theory and filling in the gaps
2. Coming back to control
   What is wrong with random coding
   The role of noiseless feedback
3. Taking control thinking to the forefront of information theory.
   The holy grail problem
   Control thinking to the rescue!

64 Random coding bound is relatively easy to achieve
Randomly label the uniformly quantized state!
[Figure: uniform quantization bins of the state at t = 0, 1, 2, 3, 4 with random labels; R = log_2 3]
A stable system state renews itself; it diverges locally whenever the channel misbehaves.
Semi-reasonable implementation complexity.

72 Controller and computations
All false disjoint paths through the trellis are pairwise independent of the true path. Bound the number of distinct paths by assuming no remerging; Gallager's E_r(R_branch) emerges as the governing exponent.
Apply the control based on the current ML state. Computational nightmare: effort grows exponentially with time.
Use a stack-based greedy search algorithm instead (see the sketch below). Log-likelihoods are additive, so the score of a path is a random walk with drift; bias it so that the true path goes up and false ones go down.
Classical results tell us that with the appropriate bias we achieve E_r(R_branch) for the error probability, and hence a power law in the state, at the cost of only finite expected computation.
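A minimal sketch of such a stack decoder, under assumptions that are mine rather than the talk's: a rate-1/2 random tree code (branch labels drawn by hashing the entire bit prefix) over a BSC instead of the control trellis, with a Fano-style bias of R bits per channel use so the true path drifts up and false paths drift down.

```python
# Stack-algorithm (sequential) decoding of a random tree code over a BSC.
import hashlib, heapq, math, random

p, T, R = 0.05, 40, 0.5                      # crossover, message bits, rate

def branch(prefix):                          # 2 pseudo-random code bits per bit,
    h = hashlib.sha256(bytes(prefix)).digest()   # determined by the whole prefix
    return [h[0] & 1, (h[0] >> 1) & 1]

random.seed(1)
msg = [random.randint(0, 1) for _ in range(T)]
rx = [c ^ (random.random() < p) for t in range(T) for c in branch(msg[: t + 1])]

def metric(path):                            # biased log-likelihood (Fano metric)
    m = 0.0
    for t in range(len(path)):
        for i, c in enumerate(branch(path[: t + 1])):
            m += math.log2(p if c != rx[2 * t + i] else 1 - p) + 1 - R
    return m

stack, expansions = [(-0.0, [])], 0          # max-heap via negated scores
while True:
    score, path = heapq.heappop(stack)       # always extend the best path so far
    if len(path) == T:
        break
    expansions += 1
    for b in (0, 1):
        child = path + [b]
        heapq.heappush(stack, (-metric(child), child))
print("decoded correctly:", path == msg, "| expansions:", expansions)
```

With these numbers the true path climbs at about +0.21 per channel use while false paths fall at about -1.7, which is why the expected number of expansions stays finite.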

73 Catch-up all-at-once phenomenon
[Simulation trace: decoded bin vs actual bin over block number. Parameters: λ = 1.1, ε = 0.05, Ω = 2.0, bias = 0.55, capacity = 0.71; 17 seconds to run]

77 Truth in advertising: computation revisited
Although we are doing better than exponential growth, we still have power laws on both sides. What if we needed a finite-speed computer in the controller?
Bad news: assume zero control is applied if we cannot decode yet. A power law for computation implies a power law for waiting: exponentially rare, doubly-exponentially bad states!

78 How to hit the higher bound?
[Plot: error exponent (base e) vs rate (in nats)]

79 How to hit the higher bound?
[Plot: moments stabilized vs open-loop unstable gain]

81 Fortified channels
[Figure: slots S_1 ... S_6 mixing noisy forward channel uses with a few fortification noiseless forward channel uses]
Some mix of noisy and noiseless channels. Is it all or nothing?

87 Noiseless channel can enable event-based sampling
Need to allow for gradual progress during bad periods.
Use the noiseless channel for supervisory information: have the observer do event-based sampling of the state. The quantization net grows as needed, but has only e^{nR} boxes, and the noiseless channel tells the controller when the observer has resampled.
Use the noisy channel for variable-length block coding.
[Figure: an outer net quantizes and encodes the state; an inner catchment area triggers resampling]
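A toy sketch of the supervisory protocol. The modeling choices here are mine, not the talk's: the variable-length block code is treated as completing after n uses plus a geometric tail, and the observer ships the exact sample instead of one of the ~e^{nR} bin indices; the catchment radius and all constants are illustrative.

```python
# Event-based sampling: when the state leaves the inner catchment area, the
# observer resamples it and flags this on the noiseless channel; the sample's
# description arrives over the noisy channel after a random (geometric-tailed)
# number of uses, and the controller then cancels the propagated sample.
import random

lam, Omega, n, succ = 1.2, 1.0, 4, 0.5
random.seed(3)
x, sample, age, peak = 0.0, None, 0, 0.0
for t in range(200_000):
    u = 0.0
    if sample is not None:
        age += 1
        if age >= n and random.random() < succ:   # the block finally decodes
            u = -lam ** (age + 1) * sample        # cancel lam^(age+1) * sample
            sample = None
    elif abs(x) > 10.0:                           # event: left catchment area
        sample, age = x, 0                        # noiseless flag: "resampled"
    x = lam * x + u + random.uniform(-Omega / 2, Omega / 2)
    peak = max(peak, abs(x))
print("peak |X_t| over the run:", peak)
```

Because the decoding delay has a geometric tail while the state grows by λ per step of waiting, |X_t| inherits exactly the power-law tail discussed earlier.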

88 Why gradual progress is better: intuition
[Plot: progress (in bits) vs time (in BEC uses)]

96 Why this works: proof strategy
Lift the problem by using large nR; very few noiseless channel uses are required.
The stopping time for the variable-length channel code is like n + T, where T is geometric with parameter exp(-E_0(ρ)).
Interpret with ln λ < R = E_0(ρ)/ρ < E_0(η+ε)/(η+ε).
The scheme behaves like a virtual packet-erasure channel: each packet carries n(R - ln λ) nats, disturbances grow by a factor O(λ^n), and the erasure probability is ≈ exp(-E_0(ρ)).
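The condition ln λ < E_0(ρ)/ρ with ρ = η + ε can be read off numerically. A sketch for the BEC with a uniform input; this computes the moments guaranteed by the scheme's condition as stated, not necessarily the exact stabilizability boundary.

```python
# Largest eta with ln(lam) < E0(eta)/eta for the BEC (E0 in nats).
import math

def e0_bec_nats(rho, delta):
    return -math.log(2 ** (-rho) * (1 - delta) + delta)

def max_moment(lam, delta):
    if math.log(lam) >= e0_bec_nats(1e-9, delta) / 1e-9:   # above capacity
        return 0.0
    lo, hi = 1e-9, 200.0                  # E0(rho)/rho is decreasing in rho
    for _ in range(200):
        mid = (lo + hi) / 2
        if e0_bec_nats(mid, delta) / mid > math.log(lam):
            lo = mid
        else:
            hi = mid
    return lo

delta = 0.4
for lam in (1.05, 1.1, 1.2, 1.4):
    print(f"lam={lam}: moments up to eta ~ {max_moment(lam, delta):.2f}")
```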

97 Outline
1. A bridge to nowhere?
   A simple control problem
   A connection to information theory
   Fixing information theory and filling in the gaps
2. Coming back to control
   What is wrong with random coding
   The role of noiseless feedback
3. Taking control thinking to the forefront of information theory.
   The holy grail problem
   Control thinking to the rescue!

101 The holy grail: understanding complexity
Classical goal: arbitrarily low probability of error. Classical assumption: not delay-sensitive at all. New twist: minimize total power consumption.
Important technology trends:
Moore's law allows billions of transistors, but only mildly reduces the power consumption per transistor.
New short-range applications: swarm behavior, in-home networks, dense meshes, personal-area networks, UWB, between-chip communication, etc.

102 Decoding power vs communication range
[Plot: γ (dB) vs distance, from 1 mm to 10 km]

103 Dense linear codes with brute-force decoding
[Plot: log_10(P_e) vs power, showing total power, optimal transmit power, decoding power, and the Shannon waterfall]
Decoding power ≈ nR·2^{nR}, error probability ≈ 2^{-E_sp(R,P)n}.

104 Convolutional codes with Viterbi decoding
[Plot: log_10(P_e) vs power, showing total power, optimal transmit power, decoding power, and the Shannon waterfall]
Decoding power ≈ L_c R·2^{L_c R}, error probability ≈ 2^{-E_conv(R,P)L_c}.

105 Convolutional codes with magical sequential decoding
[Plot: log_10(P_e) vs power, showing the Shannon waterfall, optimal transmit power, decoding power, and total power]
Decoding power ≈ L_c R, error probability ≈ 2^{-E_conv(R,P)L_c}.

106 Dense linear codes with magical syndrome decoding
[Plot: log_10(P_e) vs power, showing the Shannon waterfall, optimal transmit power, total power, and decoding power]
Decoding power ≈ (1 - R)nR, error probability ≈ 2^{-E_sp(R,P)n}.
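To compare the four scalings above at a common target error probability, here is a rough sketch; the exponent values E_sp and E_conv are stand-in constants (my assumption), so only the growth shapes are meaningful, not the absolute numbers.

```python
# Decoder power at target Pe: brute force ~ n*R*2^{nR}, Viterbi ~ Lc*R*2^{Lc*R},
# magical sequential ~ Lc*R, magical syndrome ~ (1-R)*n*R, where n or Lc is
# sized so that the code's exponent meets the target: E * n = log2(1/Pe).
import math

R, Esp, Econv = 0.5, 0.05, 0.10          # rate and stand-in exponent values
for k in (3, 6, 9):                      # target Pe = 10^-k
    bits = k * math.log2(10)
    n, Lc = bits / Esp, bits / Econv
    print(f"Pe=1e-{k}:",
          f"brute ~ {n * R * 2 ** (n * R):.1e},",
          f"Viterbi ~ {Lc * R * 2 ** (Lc * R):.1e},",
          f"sequential ~ {Lc * R:.0f},",
          f"syndrome ~ {(1 - R) * n * R:.0f}")
```

The exponential blowup of the first two columns against the polynomial growth of the last two is why "magical" decoders change the waterslide curves so dramatically.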

112 A new hope: iterative decoding
Make assumptions about the decoder implementation rather than the code:
Massively parallel computational nodes, each connected to at most α + 1 other nodes.
Each node consumes E_node energy per iteration and can send arbitrary messages to its neighbors.
Some nodes are initialized with a single received codeword symbol; some nodes are responsible for decoding a single message bit.
Run for a fixed number of iterations i.
Rich enough to capture LDPC, RA, Turbo, etc. codes. Power consumption: i·E_node per received sample.

116 How to lower-bound the number of iterations?
[Figure: the decoding neighborhood of a message bit B_i — the received samples Y_j whose values can influence its decision after i iterations]
Key concept: decoding neighborhoods = information patterns.
The decoding neighborhood size is n ≤ 1 + (α + 1)(α^i - 1)/(α - 1), so n grows at most like α^i.
We need to lower-bound the average probability of bit error in terms of n.
Key insight: n is playing a role analogous to delay.
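A small sketch of the neighborhood-growth computation: with node degree at most α + 1, the samples reachable within i hops number at most 1 + (α + 1)(α^i - 1)/(α - 1), so the iterations needed before a decision can even depend on n samples grow only like log_α n.

```python
# Decoding-neighborhood size after i iterations, and the minimum number of
# iterations needed before a bit decision can depend on n received samples.
def neighborhood(alpha, i):
    return 1 + (alpha + 1) * (alpha ** i - 1) // (alpha - 1)

def min_iterations(alpha, n_target):
    i = 0
    while neighborhood(alpha, i) < n_target:
        i += 1
    return i

alpha = 3
for n in (10, 100, 1000, 10_000):
    print(f"n={n}: at least {min_iterations(alpha, n)} iterations")
```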

119 A local sphere-packing bound for the AWGN
Decoding neighborhood size n ≤ 1 + (α + 1)(α^i - 1)/(α - 1).
P_e ≥ sup_{σ_G² > σ_P² : C(G) < R} μ(n) h_b^{-1}(δ(G)) exp_2( -n [ D(σ_G² ‖ σ_P²) + (1/2) φ(n, h_b^{-1}(δ(G))) (σ_G²/σ_P² - 1) ] )
where C(G) = (1/2) log_2(1 + P_T/σ_G²), δ(G) = 1 - C(G)/R,
μ(n) = (1/2)(1 + 1/(T(n)+1) + (4T(n)+2)/(nT(n)(1+T(n)))),
T(n) = W_L(-exp(-1)(1/4)^{1/n}), where W_L(x) solves x = W_L(x) exp(W_L(x)),
φ(n, y) = n(W_L(-exp(-1)(y/2)^{2/n}) + 1).
Double-exponential potential return on investments in decoding power!

120 Waterslide curves for the general AWGN case
[Plot: log_10(P_e) vs power for γ = 0.4, 0.3, 0.2, against the Shannon limit]


More information

Time-division multiplexing for green broadcasting

Time-division multiplexing for green broadcasting Time-division multiplexing for green broadcasting Pulkit Grover and Anant Sahai Wireless Foundations, Department of EECS University of California at Berkeley Email: {pulkit, sahai} @ eecs.berkeley.edu

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels

An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels An Improved Sphere-Packing Bound for Finite-Length Codes over Symmetric Memoryless Channels Gil Wiechman Igal Sason Department of Electrical Engineering Technion, Haifa 3000, Israel {igillw@tx,sason@ee}.technion.ac.il

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Towards a Theory of Information Flow in the Finitary Process Soup

Towards a Theory of Information Flow in the Finitary Process Soup Towards a Theory of in the Finitary Process Department of Computer Science and Complexity Sciences Center University of California at Davis June 1, 2010 Goals Analyze model of evolutionary self-organization

More information

Code design: Computer search

Code design: Computer search Code design: Computer search Low rate codes Represent the code by its generator matrix Find one representative for each equivalence class of codes Permutation equivalences? Do NOT try several generator

More information

Information Theory - Entropy. Figure 3

Information Theory - Entropy. Figure 3 Concept of Information Information Theory - Entropy Figure 3 A typical binary coded digital communication system is shown in Figure 3. What is involved in the transmission of information? - The system

More information

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus Turbo Compression Andrej Rikovsky, Advisor: Pavol Hanus Abstract Turbo codes which performs very close to channel capacity in channel coding can be also used to obtain very efficient source coding schemes.

More information

Lecture 1: Entropy, convexity, and matrix scaling CSE 599S: Entropy optimality, Winter 2016 Instructor: James R. Lee Last updated: January 24, 2016

Lecture 1: Entropy, convexity, and matrix scaling CSE 599S: Entropy optimality, Winter 2016 Instructor: James R. Lee Last updated: January 24, 2016 Lecture 1: Entropy, convexity, and matrix scaling CSE 599S: Entropy optimality, Winter 2016 Instructor: James R. Lee Last updated: January 24, 2016 1 Entropy Since this course is about entropy maximization,

More information

Codes on graphs and iterative decoding

Codes on graphs and iterative decoding Codes on graphs and iterative decoding Bane Vasić Error Correction Coding Laboratory University of Arizona Funded by: National Science Foundation (NSF) Seagate Technology Defense Advanced Research Projects

More information

Chapter 7: Channel coding:convolutional codes

Chapter 7: Channel coding:convolutional codes Chapter 7: : Convolutional codes University of Limoges meghdadi@ensil.unilim.fr Reference : Digital communications by John Proakis; Wireless communication by Andreas Goldsmith Encoder representation Communication

More information

Lecture 11: Quantum Information III - Source Coding

Lecture 11: Quantum Information III - Source Coding CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that

More information

2. Transience and Recurrence

2. Transience and Recurrence Virtual Laboratories > 15. Markov Chains > 1 2 3 4 5 6 7 8 9 10 11 12 2. Transience and Recurrence The study of Markov chains, particularly the limiting behavior, depends critically on the random times

More information

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic

More information

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H.

1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H. Problem sheet Ex. Verify that the function H(p,..., p n ) = k p k log p k satisfies all 8 axioms on H. Ex. (Not to be handed in). looking at the notes). List as many of the 8 axioms as you can, (without

More information

5. Density evolution. Density evolution 5-1

5. Density evolution. Density evolution 5-1 5. Density evolution Density evolution 5-1 Probabilistic analysis of message passing algorithms variable nodes factor nodes x1 a x i x2 a(x i ; x j ; x k ) x3 b x4 consider factor graph model G = (V ;

More information

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada

More information

Upper Bounds on the Capacity of Binary Intermittent Communication

Upper Bounds on the Capacity of Binary Intermittent Communication Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,

More information

Noisy channel communication

Noisy channel communication Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University

More information

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design Attaining maimal reliability with minimal feedback via joint channel-code and hash-function design Stark C. Draper, Kannan Ramchandran, Biio Rimoldi, Anant Sahai, and David N. C. Tse Department of EECS,

More information

CHAPTER 8 Viterbi Decoding of Convolutional Codes

CHAPTER 8 Viterbi Decoding of Convolutional Codes MIT 6.02 DRAFT Lecture Notes Fall 2011 (Last update: October 9, 2011) Comments, questions or bug reports? Please contact hari at mit.edu CHAPTER 8 Viterbi Decoding of Convolutional Codes This chapter describes

More information

Spatially Coupled LDPC Codes

Spatially Coupled LDPC Codes Spatially Coupled LDPC Codes Kenta Kasai Tokyo Institute of Technology 30 Aug, 2013 We already have very good codes. Efficiently-decodable asymptotically capacity-approaching codes Irregular LDPC Codes

More information

ON BEAMFORMING WITH FINITE RATE FEEDBACK IN MULTIPLE ANTENNA SYSTEMS

ON BEAMFORMING WITH FINITE RATE FEEDBACK IN MULTIPLE ANTENNA SYSTEMS ON BEAMFORMING WITH FINITE RATE FEEDBACK IN MULTIPLE ANTENNA SYSTEMS KRISHNA KIRAN MUKKAVILLI ASHUTOSH SABHARWAL ELZA ERKIP BEHNAAM AAZHANG Abstract In this paper, we study a multiple antenna system where

More information

Lecture 12. Block Diagram

Lecture 12. Block Diagram Lecture 12 Goals Be able to encode using a linear block code Be able to decode a linear block code received over a binary symmetric channel or an additive white Gaussian channel XII-1 Block Diagram Data

More information

Low-density parity-check codes

Low-density parity-check codes Low-density parity-check codes From principles to practice Dr. Steve Weller steven.weller@newcastle.edu.au School of Electrical Engineering and Computer Science The University of Newcastle, Callaghan,

More information

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference The and The Binary Dirty Multiple Access Channel with Common Interference Dept. EE - Systems, Tel Aviv University, Tel Aviv, Israel April 25th, 2010 M.Sc. Presentation The B/G Model Compound CSI Smart

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Cooperative Communication with Feedback via Stochastic Approximation

Cooperative Communication with Feedback via Stochastic Approximation Cooperative Communication with Feedback via Stochastic Approximation Utsaw Kumar J Nicholas Laneman and Vijay Gupta Department of Electrical Engineering University of Notre Dame Email: {ukumar jnl vgupta}@ndedu

More information

Control Capacity. Gireeja Ranade and Anant Sahai UC Berkeley EECS

Control Capacity. Gireeja Ranade and Anant Sahai UC Berkeley EECS Control Capacity Gireeja Ranade and Anant Sahai UC Berkeley EECS gireeja@eecs.berkeley.edu, sahai@eecs.berkeley.edu Abstract This paper presents a notion of control capacity that gives a fundamental limit

More information

Lecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity

Lecture 8: Channel and source-channel coding theorems; BEC & linear codes. 1 Intuitive justification for upper bound on channel capacity 5-859: Information Theory and Applications in TCS CMU: Spring 23 Lecture 8: Channel and source-channel coding theorems; BEC & linear codes February 7, 23 Lecturer: Venkatesan Guruswami Scribe: Dan Stahlke

More information

Iterative Quantization. Using Codes On Graphs

Iterative Quantization. Using Codes On Graphs Iterative Quantization Using Codes On Graphs Emin Martinian and Jonathan S. Yedidia 2 Massachusetts Institute of Technology 2 Mitsubishi Electric Research Labs Lossy Data Compression: Encoding: Map source

More information

EVALUATION OF PACKET ERROR RATE IN WIRELESS NETWORKS

EVALUATION OF PACKET ERROR RATE IN WIRELESS NETWORKS EVALUATION OF PACKET ERROR RATE IN WIRELESS NETWORKS Ramin Khalili, Kavé Salamatian LIP6-CNRS, Université Pierre et Marie Curie. Paris, France. Ramin.khalili, kave.salamatian@lip6.fr Abstract Bit Error

More information

LDPC Codes. Intracom Telecom, Peania

LDPC Codes. Intracom Telecom, Peania LDPC Codes Alexios Balatsoukas-Stimming and Athanasios P. Liavas Technical University of Crete Dept. of Electronic and Computer Engineering Telecommunications Laboratory December 16, 2011 Intracom Telecom,

More information

Control Over Noisy Channels

Control Over Noisy Channels IEEE RANSACIONS ON AUOMAIC CONROL, VOL??, NO??, MONH?? 004 Control Over Noisy Channels Sekhar atikonda, Member, IEEE, and Sanjoy Mitter, Fellow, IEEE, Abstract Communication is an important component of

More information

Introduction to Convolutional Codes, Part 1

Introduction to Convolutional Codes, Part 1 Introduction to Convolutional Codes, Part 1 Frans M.J. Willems, Eindhoven University of Technology September 29, 2009 Elias, Father of Coding Theory Textbook Encoder Encoder Properties Systematic Codes

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

LOW-density parity-check (LDPC) codes were invented

LOW-density parity-check (LDPC) codes were invented IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 54, NO 1, JANUARY 2008 51 Extremal Problems of Information Combining Yibo Jiang, Alexei Ashikhmin, Member, IEEE, Ralf Koetter, Senior Member, IEEE, and Andrew

More information

Second-Order Asymptotics in Information Theory

Second-Order Asymptotics in Information Theory Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015

More information

Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels

Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels IEEE TRANSACTIONS ON AUTOMATIC CONTROL 1 Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels Lei Bao, Member, IEEE, Mikael Skoglund, Senior Member, IEEE, and Karl Henrik Johansson,

More information

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback Vincent Y. F. Tan (NUS) Joint work with Silas L. Fong (Toronto) 2017 Information Theory Workshop, Kaohsiung,

More information

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser

More information

Codes on graphs and iterative decoding

Codes on graphs and iterative decoding Codes on graphs and iterative decoding Bane Vasić Error Correction Coding Laboratory University of Arizona Prelude Information transmission 0 0 0 0 0 0 Channel Information transmission signal 0 0 threshold

More information

The price of certainty: "waterslide curves" and the gap to capacity

The price of certainty: waterslide curves and the gap to capacity The price of certainty: "waterslide curves" and the to capacity Anant Sahai ulkit Grover Electrical Engineering and Computer Sciences University of California at Berkeley Technical Report No. UCB/EECS-008-

More information

Graph-based codes for flash memory

Graph-based codes for flash memory 1/28 Graph-based codes for flash memory Discrete Mathematics Seminar September 3, 2013 Katie Haymaker Joint work with Professor Christine Kelley University of Nebraska-Lincoln 2/28 Outline 1 Background

More information

Channel Coding for Secure Transmissions

Channel Coding for Secure Transmissions Channel Coding for Secure Transmissions March 27, 2017 1 / 51 McEliece Cryptosystem Coding Approach: Noiseless Main Channel Coding Approach: Noisy Main Channel 2 / 51 Outline We present an overiew of linear

More information

Coding on a Trellis: Convolutional Codes

Coding on a Trellis: Convolutional Codes .... Coding on a Trellis: Convolutional Codes Telecommunications Laboratory Alex Balatsoukas-Stimming Technical University of Crete November 6th, 2008 Telecommunications Laboratory (TUC) Coding on a Trellis:

More information

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:

More information

Estimating a linear process using phone calls

Estimating a linear process using phone calls Estimating a linear process using phone calls Mohammad Javad Khojasteh, Massimo Franceschetti, Gireeja Ranade Abstract We consider the problem of estimating an undisturbed, scalar, linear process over

More information