Universal Anytime Codes: An approach to uncertain channels in control


Paper by Stark Draper and Anant Sahai; presented by Sekhar Tatikonda
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, UC Berkeley
Mitsubishi Electric Research Labs, Cambridge, MA, USA
April 2, 2007
Draper, Sahai (UC Berkeley), Universal Anytime Codes, ConCom 2007

Outline
1. Problem setup: channel uncertainty and stabilization; review of past results
2. New result: universal anytime codes
3. Sufficient condition for stabilization
4. Conclusion

Our simple distributed control problem

[Block diagram: unstable system X_t driven by control U_t and disturbance D_t through unit-step delays; a designed observer O (with possible control knowledge) sends over the uncertain channel W (with possible channel feedback) to a designed controller C, which issues the control signals U_t.]

X_{t+1} = λ X_t + U_t + D_t, unstable with λ > 1, bounded initial condition and bounded disturbance D.

Goal: η-stabilization: sup_{t>0} E[|X_t|^η] ≤ K for some K < ∞.
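The dynamics above are easy to simulate. The following is a minimal sketch of ours, not from the talk: the perfect-information controller and the uniform disturbance model are illustrative assumptions. Open loop the state blows up like λ^t, while even a crude controller that cancels λX_t keeps the state inside the disturbance bound.

```python
# Toy simulation of the plant X_{t+1} = lam*X_t + U_t + D_t with lam > 1.
# The perfect-information controller u = -lam*x is illustrative only; the talk
# is about what happens when x must be communicated over a noisy channel.
import random

def peak_state(lam, steps, control, omega=1.0, seed=0):
    rng = random.Random(seed)
    x, peak = 0.0, 0.0
    for _ in range(steps):
        d = rng.uniform(-omega / 2, omega / 2)  # bounded disturbance |D_t| <= omega/2
        x = lam * x + control(x) + d
        peak = max(peak, abs(x))
    return peak

lam = 1.5
open_loop = peak_state(lam, 40, control=lambda x: 0.0)
closed_loop = peak_state(lam, 40, control=lambda x: -lam * x)
print(open_loop, closed_loop)  # closed loop stays within omega/2
```

With the cancelling controller the closed-loop state is exactly the last disturbance, so its η-th moments are trivially bounded; the whole difficulty enters once the observer only sees the channel.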

Model of Channel Uncertainty: Compound Channels

The channel W is known to be memoryless across time.
The input and output alphabets (Y, Z) are known and finite.
Set-valued uncertainty: W ∈ W.
Nature can choose any particular W ∈ W; the choice remains fixed for all time.
Capacity is well understood: C = sup_Q inf_{W∈W} I(Q, W).

Review: Entirely noiseless channel

A window known to contain X_t will grow by a factor of λ > 1 each step.
Encode which control U_t to apply: sending R bits cuts the window by a factor of 2^R.
The bounded disturbance then grows the window by Ω/2 on each side, giving a new window for X_{t+1}.
As long as R > log_2 λ, the window can stay bounded forever.
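The window argument can be checked numerically. A sketch under our own naming (w for the window width, omega for the per-step disturbance growth): the width recursion is w ← λ·w·2^(−R) + Ω, which stays bounded exactly when λ/2^R < 1, i.e. R > log2 λ.

```python
# Window-width recursion for the noiseless-channel argument:
# each step the window is regrown by lam, cut by 2^R by the R transmitted bits,
# and widened by omega from the bounded disturbance.
def window_width(lam, R, omega, steps, w0=1.0):
    w = w0
    for _ in range(steps):
        w = lam * w / 2 ** R + omega
    return w

lam, omega = 3.0, 1.0  # log2(lam) ~ 1.585
print(window_width(lam, R=2, omega=omega, steps=100))  # converges to omega/(1 - lam/2**R) = 4
print(window_width(lam, R=1, omega=omega, steps=100))  # R < log2(lam): diverges
```

The fixed point Ω/(1 − λ·2^{−R}) also shows the tradeoff: as R approaches log2 λ from above, the achievable window blows up.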

Review: Delay-universal (anytime) communication

[Figure: a bit stream B_1, B_2, ..., B_13 is causally encoded into channel inputs Y_1, ..., Y_26 (two channel uses per bit); the channel outputs are Z_1, ..., Z_26; the decoder emits estimates B̂_1, ..., B̂_9, each with fixed delay d = 7.]

Review: Delay-universal (anytime) communication

Fixed-delay reliability α is achievable if there exists a sequence of encoder/decoder pairs with increasing end-to-end delays d_j such that lim_{j→∞} −(1/d_j) ln P(B_i ≠ B̂_i^{(j)}) = α.

Review: Delay-universal (anytime) communication

Reliability α is achievable delay-universally, or in an anytime fashion, if a single encoder works for all sufficiently large delays d.

Review: Delay-universal (anytime) communication

The anytime capacity C_any(α) is the supremal rate at which reliability α is achievable in a delay-universal way.

Review: Separation theorem for scalar control

Necessity: if a scalar system with parameter λ > 1 can be stabilized with finite η-moment across a noisy channel, then the channel with noiseless feedback must have C_any(η ln λ) ≥ ln λ.
In general: if P(|X| > m) < f(m), then there exists a K such that P_error(d) < f(Kλ^d).
Sufficiency: if there is an α > η ln λ for which the channel with noiseless feedback has C_any(α) > ln λ, then the scalar system with parameter λ and a bounded disturbance can be stabilized across the noisy channel with finite η-moment.
Proved using a direct equivalence, so it also holds for compound channels.

From Rate/Reliability to Gain/Moments

[Plot: fixed-delay error exponent (base e) versus rate (in nats).]

From Rate/Reliability to Gain/Moments

[Plot: moments stabilized (η) versus open-loop unstable gain (λ).]

Why the random-coding bound works: tree codes

Tree with iid random labels: the data chooses a path through the tree; the path labels are transmitted through the channel; feedback is not used.
ML path decoding: log-likelihoods add along a path; disjoint segments are pairwise independent of the true path; the E_r(R) analysis applies at the suffix, and the shortest suffix dominates.
Achieves P_e(d) ≤ K exp(−E_r(R) d) for every d, for all R < C.
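A toy version of such a code is easy to write down. This is our own illustrative sketch, not the paper's construction: iid random binary labels on every branch of a binary tree, a binary symmetric channel, and brute-force ML path decoding (minimum Hamming distance, which is ML for crossover p < 1/2). All names and parameters are ours.

```python
# Toy random tree code over a BSC with brute-force ML path decoding.
# Each prefix of data bits owns C_USES iid random label bits (rate 1/C_USES bits/use).
import itertools
import random

C_USES = 5  # channel uses per data bit

def labels(prefix, seed=42):
    # iid random label bits for this branch, derived deterministically from the prefix
    r = random.Random(f"{seed}:" + "".join(map(str, prefix)))
    return [r.randrange(2) for _ in range(C_USES)]

def encode(bits):
    out = []
    for t in range(1, len(bits) + 1):
        out += labels(bits[:t])
    return out

def bsc(x, p, rng):
    # flip each transmitted bit independently with probability p
    return [b ^ (rng.random() < p) for b in x]

def ml_decode(z, n):
    # search all 2^n paths; smallest Hamming distance to z wins (ML for p < 1/2)
    def dist(path):
        return sum(a != b for a, b in zip(encode(list(path)), z))
    return list(min(itertools.product([0, 1], repeat=n), key=dist))

bits = [1, 0, 1, 1, 0, 1]
z = bsc(encode(bits), p=0.05, rng=random.Random(7))
decoded = ml_decode(z, len(bits))
print(decoded)
```

The exhaustive search is exponential in the number of data bits, so this only illustrates the code structure; the point of the slide is the exponent analysis, not the decoder's complexity.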

Relevant aspects of block and anytime codes

Block coding: source bits b are grouped into blocks; the blocks are encoded independently, sent as y through the channel W(z|y), and decoded independently. Rare blocks are in error, and erroneous blocks are never recovered, which eventually results in exponential instability.

Anytime coding: source bits b are causally encoded with a tree code, y_k = f_k(b_1, b_2, ..., b_k), sent through the channel W(z|y), and fed to a streaming decoder. The decoder can revisit earlier bit estimates, whose reliabilities increase with delay, so any incorrect controls are eventually detected and compensated for.

Outline
1. Problem setup: channel uncertainty and stabilization; review of past results without uncertainty
2. New result: universal anytime codes
3. Sufficient condition for stabilization
4. Conclusion

Review: universal block codes for compound channels

A single input distribution Q must be chosen for all W ∈ W.
Look at the empirical mutual information (EMI) between Z and the candidate codewords Y_m; choose the one with the highest EMI.

Why does this work?
The true codeword gives rise to an empirical channel that is like the true channel.
There are only a polynomial number, (n + 1)^{|Y||Z|}, of joint types.
The random-coding error exponent E_r(R, Q, W) is achieved.

What are the difficulties in generalizing to trees?
The EMI is not additive. Ex: the pairs (0,0) and (1,1) each have zero EMI alone, but concatenated they give EMI 1.
The polynomial term grows with n, not with the delay d, so delays must become longer as time goes on.
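Maximum-EMI decoding is simple to state in code. A toy sketch of ours (all names are illustrative): score each candidate codeword by the mutual information of its joint empirical type with the received word and pick the maximizer; no channel knowledge is used anywhere.

```python
# Empirical mutual information (EMI) of a (codeword, received word) pair, and
# max-EMI decoding over a list of candidate codewords.
import math
from collections import Counter

def emi(y, z):
    # mutual information of the joint empirical type of (y, z), in bits
    n = len(y)
    joint, py, pz = Counter(zip(y, z)), Counter(y), Counter(z)
    return sum((c / n) * math.log2(c * n / (py[a] * pz[b]))
               for (a, b), c in joint.items())

def emi_decode(codewords, z):
    # return the index of the candidate with the highest EMI against z
    return max(range(len(codewords)), key=lambda m: emi(codewords[m], z))

c0 = [0, 1, 0, 1, 0, 1, 0, 1]   # candidate whose halves are independent of z: EMI 0 here
c1 = [0, 0, 1, 1, 0, 0, 1, 1]   # the true codeword
print(emi_decode([c0, c1], z=c1))  # -> 1
```

With z equal to c1, the joint type of (c1, z) is a noiseless channel and the EMI equals the full 1 bit of input entropy, while (c0, z) has a product joint type and EMI 0.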

Sequential Suffix-EMI Decoding

Decode sequentially, using maximum empirical mutual information (EMI) comparisons at each stage. At each stage, the subtree of codeword suffixes consistent with one value of the current bit is compared against the rest:
If a codeword from the first-bit-1 subtree has the max EMI, decode the first bit to 1; else to 0.
If a codeword suffix from the second-bit-1 subtree has the max EMI, decode the second bit to 1; else to 0.
Likewise for the third bit, and so on.

The universal anytime coding theorem

Given a rate R > 0 and a compound discrete memoryless channel W, there exists a random anytime code such that for all E < E_any,univ(R) there is a constant K > 0 with
Pr[B_1^{n−d} ≠ B̂_1^{n−d}] ≤ K 2^{−dE} for all n, d,
where
E_any,univ(R) = sup_Q inf_{W∈W} inf_{P,V} [ D(P × V ‖ Q × W) + max{0, I(P, V) − R} ] = sup_Q inf_{W∈W} E_r(R, Q, W).

No feedback is needed.
Does as well as one could hope for: essentially hits E_r(R).
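The compound exponent sup_Q inf_W E_r(R, Q, W) can be evaluated numerically. The following sketch is our illustration, not code from the paper: a toy compound family of two BSCs, Q fixed to the uniform input (which is optimal for a BSC), and Gallager's E_0 maximized over ρ on a grid, with rates in nats.

```python
# Random-coding exponent E_r(R, Q, W) = max_{0<=rho<=1} [E_0(rho, Q, W) - rho*R]
# (rates in nats), evaluated at the worst channel of a small compound family.
import math

def E0(rho, q, W):
    # Gallager's E_0 function; W[y][z] is the channel matrix, q the input distribution
    s = sum(sum(q[y] * W[y][z] ** (1 / (1 + rho)) for y in range(len(q))) ** (1 + rho)
            for z in range(len(W[0])))
    return -math.log(s)

def Er(R, q, W, grid=1000):
    # maximize E_0(rho) - rho*R over a grid of rho in [0, 1]
    return max(E0(i / grid, q, W) - (i / grid) * R for i in range(grid + 1))

def bsc(p):
    return [[1 - p, p], [p, 1 - p]]

q = [0.5, 0.5]
family = [bsc(0.05), bsc(0.1)]           # compound uncertainty set
R = 0.2                                   # nats per use, below the worst-case capacity
print(min(Er(R, q, W) for W in family))   # worst channel dominates; positive here
```

The worst channel in the family (here the noisier BSC) sets the compound exponent, matching the sup-inf form of E_any,univ(R) above; above the worst-case capacity the exponent drops to zero.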

Back to the stabilization problem

Can use sup_Q inf_{W∈W} E_r(R, Q, W) to evaluate compound channels W.
Picking Q is unavoidable: it determines both the codebook and the capacity.
But evaluating this bound can be difficult.
Is there a simpler condition that depends only on the capacity C(W)?
Gallager, Exercise 5.23, tells us: E_r(R, Q, W) ≥ (I(Q, W) − R)² / (8/e² + 4(ln|Z|)²).
Translated to η-stabilization (taking R near ln λ in the bound above): it suffices that
C(W) − ln λ > √( (8/e² + 4(ln|Z|)²) η ln λ ).
Says that O(√(η log λ)) extra capacity suffices to get the η-th moment stabilized.

Visualizing the new universal sufficient condition

[Plot: error exponent (base e) versus rate (in nats).]

Visualizing the new universal sufficient condition

[Plot: moments stabilized (η) versus open-loop unstable gain (λ).]

Conclusion

Random anytime codes exist for compound channels; systems can thus be stabilized over uncertain channels.
Can operate with only a coarse description of the channel uncertainty: a channel input distribution Q, the resulting capacity C, and a set-size proxy, the output alphabet size |Z|.
There is a performance loss for coarse channel uncertainty: the bounds should be improved, and feedback should be used.


Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

Introduction to Convolutional Codes, Part 1

Introduction to Convolutional Codes, Part 1 Introduction to Convolutional Codes, Part 1 Frans M.J. Willems, Eindhoven University of Technology September 29, 2009 Elias, Father of Coding Theory Textbook Encoder Encoder Properties Systematic Codes

More information

LECTURE 10. Last time: Lecture outline

LECTURE 10. Last time: Lecture outline LECTURE 10 Joint AEP Coding Theorem Last time: Error Exponents Lecture outline Strong Coding Theorem Reading: Gallager, Chapter 5. Review Joint AEP A ( ɛ n) (X) A ( ɛ n) (Y ) vs. A ( ɛ n) (X, Y ) 2 nh(x)

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

Arimoto Channel Coding Converse and Rényi Divergence

Arimoto Channel Coding Converse and Rényi Divergence Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Control Capacity. Gireeja Ranade and Anant Sahai UC Berkeley EECS

Control Capacity. Gireeja Ranade and Anant Sahai UC Berkeley EECS Control Capacity Gireeja Ranade and Anant Sahai UC Berkeley EECS gireeja@eecs.berkeley.edu, sahai@eecs.berkeley.edu Abstract This paper presents a notion of control capacity that gives a fundamental limit

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

The error exponent with delay for lossless source coding

The error exponent with delay for lossless source coding The error eponent with delay for lossless source coding Cheng Chang and Anant Sahai Wireless Foundations, University of California at Berkeley cchang@eecs.berkeley.edu, sahai@eecs.berkeley.edu Abstract

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet. EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on

More information

A General Formula for Compound Channel Capacity

A General Formula for Compound Channel Capacity A General Formula for Compound Channel Capacity Sergey Loyka, Charalambos D. Charalambous University of Ottawa, University of Cyprus ETH Zurich (May 2015), ISIT-15 1/32 Outline 1 Introduction 2 Channel

More information

UNIT I INFORMATION THEORY. I k log 2

UNIT I INFORMATION THEORY. I k log 2 UNIT I INFORMATION THEORY Claude Shannon 1916-2001 Creator of Information Theory, lays the foundation for implementing logic in digital circuits as part of his Masters Thesis! (1939) and published a paper

More information

Sparse Regression Codes for Multi-terminal Source and Channel Coding

Sparse Regression Codes for Multi-terminal Source and Channel Coding Sparse Regression Codes for Multi-terminal Source and Channel Coding Ramji Venkataramanan Yale University Sekhar Tatikonda Allerton 2012 1 / 20 Compression with Side-Information X Encoder Rate R Decoder

More information

Universal Incremental Slepian-Wolf Coding

Universal Incremental Slepian-Wolf Coding Proceedings of the 43rd annual Allerton Conference, Monticello, IL, September 2004 Universal Incremental Slepian-Wolf Coding Stark C. Draper University of California, Berkeley Berkeley, CA, 94720 USA sdraper@eecs.berkeley.edu

More information

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018

EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 Please submit the solutions on Gradescope. EE376A: Homework #3 Due by 11:59pm Saturday, February 10th, 2018 1. Optimal codeword lengths. Although the codeword lengths of an optimal variable length code

More information

IN this paper, we consider the capacity of sticky channels, a

IN this paper, we consider the capacity of sticky channels, a 72 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 1, JANUARY 2008 Capacity Bounds for Sticky Channels Michael Mitzenmacher, Member, IEEE Abstract The capacity of sticky channels, a subclass of insertion

More information

Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel

Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel Encoder Decoder Design for Feedback Control over the Binary Symmetric Channel Lei Bao, Mikael Skoglund and Karl Henrik Johansson School of Electrical Engineering, Royal Institute of Technology, Stockholm,

More information

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design

Attaining maximal reliability with minimal feedback via joint channel-code and hash-function design Attaining maimal reliability with minimal feedback via joint channel-code and hash-function design Stark C. Draper, Kannan Ramchandran, Biio Rimoldi, Anant Sahai, and David N. C. Tse Department of EECS,

More information

A Comparison of Superposition Coding Schemes

A Comparison of Superposition Coding Schemes A Comparison of Superposition Coding Schemes Lele Wang, Eren Şaşoğlu, Bernd Bandemer, and Young-Han Kim Department of Electrical and Computer Engineering University of California, San Diego La Jolla, CA

More information

Noisy channel communication

Noisy channel communication Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 6 Communication channels and Information Some notes on the noisy channel setup: Iain Murray, 2012 School of Informatics, University

More information

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser

More information

On Scalable Source Coding for Multiple Decoders with Side Information

On Scalable Source Coding for Multiple Decoders with Side Information On Scalable Source Coding for Multiple Decoders with Side Information Chao Tian School of Computer and Communication Sciences Laboratory for Information and Communication Systems (LICOS), EPFL, Lausanne,

More information

Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels

Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels IEEE TRANSACTIONS ON AUTOMATIC CONTROL 1 Iterative Encoder-Controller Design for Feedback Control Over Noisy Channels Lei Bao, Member, IEEE, Mikael Skoglund, Senior Member, IEEE, and Karl Henrik Johansson,

More information

Anytime Capacity of the AWGN+Erasure Channel with Feedback. Qing Xu. B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000

Anytime Capacity of the AWGN+Erasure Channel with Feedback. Qing Xu. B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000 Anytime Capacity of the AWGN+Erasure Channel with Feedback by Qing Xu B.S. (Beijing University) 1997 M.S. (University of California at Berkeley) 2000 A dissertation submitted in partial satisfaction of

More information

Multimedia Communications. Mathematical Preliminaries for Lossless Compression

Multimedia Communications. Mathematical Preliminaries for Lossless Compression Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when

More information

Variable Length Codes for Degraded Broadcast Channels

Variable Length Codes for Degraded Broadcast Channels Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

18.2 Continuous Alphabet (discrete-time, memoryless) Channel

18.2 Continuous Alphabet (discrete-time, memoryless) Channel 0-704: Information Processing and Learning Spring 0 Lecture 8: Gaussian channel, Parallel channels and Rate-distortion theory Lecturer: Aarti Singh Scribe: Danai Koutra Disclaimer: These notes have not

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Shannon s Noisy-Channel Coding Theorem

Shannon s Noisy-Channel Coding Theorem Shannon s Noisy-Channel Coding Theorem Lucas Slot Sebastian Zur February 13, 2015 Lucas Slot, Sebastian Zur Shannon s Noisy-Channel Coding Theorem February 13, 2015 1 / 29 Outline 1 Definitions and Terminology

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada

More information

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International

More information

Entropy, Inference, and Channel Coding

Entropy, Inference, and Channel Coding Entropy, Inference, and Channel Coding Sean Meyn Department of Electrical and Computer Engineering University of Illinois and the Coordinated Science Laboratory NSF support: ECS 02-17836, ITR 00-85929

More information

X 1 : X Table 1: Y = X X 2

X 1 : X Table 1: Y = X X 2 ECE 534: Elements of Information Theory, Fall 200 Homework 3 Solutions (ALL DUE to Kenneth S. Palacio Baus) December, 200. Problem 5.20. Multiple access (a) Find the capacity region for the multiple-access

More information

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel

Notes 3: Stochastic channels and noisy coding theorem bound. 1 Model of information communication and noisy channel Introduction to Coding Theory CMU: Spring 2010 Notes 3: Stochastic channels and noisy coding theorem bound January 2010 Lecturer: Venkatesan Guruswami Scribe: Venkatesan Guruswami We now turn to the basic

More information

New communication strategies for broadcast and interference networks

New communication strategies for broadcast and interference networks New communication strategies for broadcast and interference networks S. Sandeep Pradhan (Joint work with Arun Padakandla and Aria Sahebi) University of Michigan, Ann Arbor Distributed Information Coding

More information

Decoding the Tail-Biting Convolutional Codes with Pre-Decoding Circular Shift

Decoding the Tail-Biting Convolutional Codes with Pre-Decoding Circular Shift Decoding the Tail-Biting Convolutional Codes with Pre-Decoding Circular Shift Ching-Yao Su Directed by: Prof. Po-Ning Chen Department of Communications Engineering, National Chiao-Tung University July

More information

Coding and Control over Discrete Noisy Forward and Feedback Channels

Coding and Control over Discrete Noisy Forward and Feedback Channels Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference 2005 Seville, Spain, December 12-15, 2005 TuA14.1 Coding and Control over Discrete Noisy Forward and

More information

Finding the best mismatched detector for channel coding and hypothesis testing

Finding the best mismatched detector for channel coding and hypothesis testing Finding the best mismatched detector for channel coding and hypothesis testing Sean Meyn Department of Electrical and Computer Engineering University of Illinois and the Coordinated Science Laboratory

More information

Intermittent Communication

Intermittent Communication Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication

More information

9. Distance measures. 9.1 Classical information measures. Head Tail. How similar/close are two probability distributions? Trace distance.

9. Distance measures. 9.1 Classical information measures. Head Tail. How similar/close are two probability distributions? Trace distance. 9. Distance measures 9.1 Classical information measures How similar/close are two probability distributions? Trace distance Fidelity Example: Flipping two coins, one fair one biased Head Tail Trace distance

More information

Second-Order Asymptotics in Information Theory

Second-Order Asymptotics in Information Theory Second-Order Asymptotics in Information Theory Vincent Y. F. Tan (vtan@nus.edu.sg) Dept. of ECE and Dept. of Mathematics National University of Singapore (NUS) National Taiwan University November 2015

More information

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010

5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora

More information

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus

Turbo Compression. Andrej Rikovsky, Advisor: Pavol Hanus Turbo Compression Andrej Rikovsky, Advisor: Pavol Hanus Abstract Turbo codes which performs very close to channel capacity in channel coding can be also used to obtain very efficient source coding schemes.

More information

Channel combining and splitting for cutoff rate improvement

Channel combining and splitting for cutoff rate improvement Channel combining and splitting for cutoff rate improvement Erdal Arıkan Electrical-Electronics Engineering Department Bilkent University, Ankara, 68, Turkey Email: arikan@eebilkentedutr arxiv:cs/5834v

More information

MARKOV CHAINS A finite state Markov chain is a sequence of discrete cv s from a finite alphabet where is a pmf on and for

MARKOV CHAINS A finite state Markov chain is a sequence of discrete cv s from a finite alphabet where is a pmf on and for MARKOV CHAINS A finite state Markov chain is a sequence S 0,S 1,... of discrete cv s from a finite alphabet S where q 0 (s) is a pmf on S 0 and for n 1, Q(s s ) = Pr(S n =s S n 1 =s ) = Pr(S n =s S n 1

More information

On Source-Channel Communication in Networks

On Source-Channel Communication in Networks On Source-Channel Communication in Networks Michael Gastpar Department of EECS University of California, Berkeley gastpar@eecs.berkeley.edu DIMACS: March 17, 2003. Outline 1. Source-Channel Communication

More information