Universal Anytime Codes: An approach to uncertain channels in control


Universal Anytime Codes: An approach to uncertain channels in control
Paper by Stark Draper and Anant Sahai, presented by Sekhar Tatikonda
Wireless Foundations, Department of Electrical Engineering and Computer Sciences, UC Berkeley
Mitsubishi Electric Research Labs, Cambridge, MA, USA
ConCom 2007, April 2, 2007

Outline
1 Problem setup: channel uncertainty and stabilization; review of past results
2 New result: universal anytime codes
3 Sufficient condition for stabilization
4 Conclusion

Our simple distributed control problem
[Block diagram: disturbance D_t and one-step-delayed control U_t drive the unstable system X_t; a designed observer O, with possible control knowledge, sends over the uncertain channel W to a designed controller C, with possible channel feedback.]
System: X_{t+1} = λ X_t + U_t + D_t
Unstable: λ > 1, bounded initial condition and disturbance D
Goal: η-stabilization: sup_{t>0} E[|X_t|^η] ≤ K for some K < ∞
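The unstable dynamics above are easy to see numerically. Below is a minimal sketch (not from the talk) that simulates X_{t+1} = λX_t + U_t + D_t with a bounded disturbance, comparing no control against an idealized controller that observes X_t perfectly; the function names and the choice U_t = −λX_t are illustrative.

```python
import random

def simulate(lam, omega, steps, controlled, seed=0):
    # Simulate X_{t+1} = lam*X_t + U_t + D_t with |D_t| <= omega.
    # With controlled=True we apply the idealized (noiseless-observation)
    # control U_t = -lam*X_t, which keeps |X_t| within the disturbance bound.
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        d = rng.uniform(-omega, omega)
        u = -lam * x if controlled else 0.0
        x = lam * x + u + d
    return abs(x)

print(simulate(2.0, 1.0, 50, controlled=False))  # blows up roughly like lam**t
print(simulate(2.0, 1.0, 50, controlled=True))   # stays within the bound omega
```

Any channel between observer and controller degrades the idealized U_t = −λX_t; the rest of the talk quantifies how much channel reliability the η-moment goal demands.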

Model of channel uncertainty: compound channels
The channel W is known to be memoryless across time
The input and output alphabets (Y, Z) are known and finite
Set-valued uncertainty: W ∈ W
Nature can choose any particular W ∈ W; the choice remains fixed for all time
Capacity is well understood: C = sup_Q inf_{W ∈ W} I(Q, W)
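The compound capacity formula can be evaluated numerically for simple families. The sketch below (illustrative, not from the talk) grid-searches sup_Q inf_{W ∈ W} I(Q, W) for a set of binary symmetric channels, where the worst (min-capacity) member dominates.

```python
import numpy as np

def mutual_information(Q, W):
    # I(Q, W) in nats for input distribution Q and channel matrix W[y][z]
    Pz = Q @ W
    mask = W > 0
    return float((Q[:, None] * W * np.log(np.where(mask, W / Pz, 1.0)))[mask].sum())

def compound_capacity(channels, grid=501):
    # C = sup_Q inf_{W in W} I(Q, W): grid search over binary-input Q
    best = 0.0
    for q in np.linspace(0.0, 1.0, grid):
        Q = np.array([q, 1.0 - q])
        best = max(best, min(mutual_information(Q, W) for W in channels))
    return best

# Family of BSCs with crossover 0.05, 0.1, 0.2: the p=0.2 member is worst,
# so the compound capacity is ln 2 - H(0.2) nats (about 0.193 nats)
bscs = [np.array([[1 - p, p], [p, 1 - p]]) for p in (0.05, 0.1, 0.2)]
print(compound_capacity(bscs))
```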

Review: entirely noiseless channel
At time t, a window is known to contain X_t; left alone it will grow by a factor of λ > 1
Encode which control U_t to apply: sending R bits cuts the window by a factor of 2^R; the window then grows by Ω on each side (Ω bounds the disturbance), giving a new window for X_{t+1}
As long as R > log₂ λ, the window can stay bounded forever
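The window recursion implied by this scheme is ℓ_{t+1} = λℓ_t/2^R + 2Ω: shrink 2^R-fold by quantization, grow λ-fold with the dynamics, and widen by Ω on each side from the disturbance. A tiny sketch of that recursion (my code, under the stated assumptions):

```python
def window_length(lam, R, omega, steps):
    # l_{t+1} = lam * l_t / 2**R + 2*omega: quantize with R bits, apply the
    # dynamics, widen by the disturbance bound omega on each side.
    # Bounded (fixed point 2*omega / (1 - lam/2**R)) iff R > log2(lam).
    l = 2 * omega
    for _ in range(steps):
        l = lam * l / 2**R + 2 * omega
    return l

print(window_length(2.0, 2, 1.0, 200))  # converges to 4.0 since R=2 > log2(2)
print(window_length(3.0, 1, 1.0, 50))   # diverges since R=1 < log2(3)
```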

Review: delay-universal (anytime) communication
[Timeline: message bits B_1, B_2, ... enter a causal encoder producing channel inputs Y_1, Y_2, ...; the channel outputs Z_1, Z_2, ... feed a decoder that emits estimates B̂_1, B̂_2, ... after a fixed delay d = 7.]
Fixed-delay reliability α is achievable if there exists a sequence of encoder/decoder pairs with increasing end-to-end delays d_j such that lim_{j→∞} −(1/d_j) ln P(B_i ≠ B̂_i) = α
Reliability α is achievable delay-universally, or in an anytime fashion, if a single encoder works for all sufficiently large delays d
The anytime capacity C_any(α) is the supremal rate at which reliability α is achievable in a delay-universal way

Review: separation theorem for scalar control
Necessity: if a scalar system with parameter λ > 1 can be stabilized with finite η-moment across a noisy channel, then the channel with noiseless feedback must have C_any(η ln λ) ≥ ln λ
In general: if P(|X| > m) < f(m), then there is a K such that P_error(d) < f(Kλ^d)
Sufficiency: if there is an α > η ln λ for which the channel with noiseless feedback has C_any(α) > ln λ, then the scalar system with parameter λ and a bounded disturbance can be stabilized across the noisy channel with finite η-moment
Proved using a direct equivalence, so it also holds for compound channels

From Rate/Reliability to Gain/Moments
[Figure: error exponent (base e) versus rate (in nats).]
[Figure: moments stabilized versus open-loop unstable gain.]

Why the random-coding bound works: tree codes
[Figure: binary code tree growing over time steps 1 through 8, with iid random channel-input labels on the branches.]
Tree with iid random labels: the data chooses a path through the tree; the path labels are transmitted through the channel; feedback is not used
ML path decoding: log-likelihoods add along a path; segments disjoint from the true path are pairwise independent of it; the E_r(R) analysis applies at the suffix; the shortest erroneous suffix dominates
Achieves P_e(d) ≤ K exp(−E_r(R) d) for every d, for all R < C
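The exponent E_r(R) above is Gallager's random-coding exponent, E_r(R) = max_{0≤ρ≤1} E_0(ρ, Q) − ρR. A sketch of its computation for a binary symmetric channel (illustrative; the function names are mine):

```python
import numpy as np

def gallager_E0(rho, Q, W):
    # E_0(rho, Q) = -ln sum_z ( sum_y Q(y) * W(z|y)^(1/(1+rho)) )^(1+rho)
    inner = (Q[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log((inner ** (1.0 + rho)).sum())

def random_coding_exponent(R, Q, W, grid=1001):
    # E_r(R) = max over 0 <= rho <= 1 of E_0(rho, Q) - rho*R  (rates in nats)
    return max(gallager_E0(rho, Q, W) - rho * R
               for rho in np.linspace(0.0, 1.0, grid))

# BSC with crossover 0.05, uniform input; capacity is about 0.49 nats,
# so the exponent is positive at rates below that and shrinks toward capacity
p = 0.05
W = np.array([[1 - p, p], [p, 1 - p]])
Q = np.array([0.5, 0.5])
print(random_coding_exponent(0.1, Q, W), random_coding_exponent(0.3, Q, W))
```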

Relevant aspects of block and anytime codes
Block coding: source bits b are grouped into blocks, the blocks are encoded independently, y passes through the channel W(z|y) to z, and the blocks are decoded independently
Rare blocks are in error, and erroneous blocks are never recovered; this eventually results in exponential instability
Anytime coding: source bits b are causally encoded with a tree code, y_k = f_k(b_1, b_2, ..., b_k); y passes through the channel W(z|y) to a streaming decoder
The decoder can revisit earlier bit estimates; estimate reliabilities increase with delay; any incorrect controls are eventually detected and compensated for
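One way to realize a causal tree code with iid-looking random labels that the encoder and decoder can share without feedback is to derive each branch label from a hash of the bit prefix. A minimal sketch (my construction for illustration, not the paper's):

```python
import hashlib

def tree_label(prefix_bits, t, alphabet=(0, 1)):
    # Pseudo-random label for the tree branch reached by prefix_bits at time t,
    # derived from a hash so encoder and decoder agree on the same random tree.
    h = hashlib.sha256(bytes(prefix_bits) + t.to_bytes(4, "big")).digest()
    return alphabet[h[0] % len(alphabet)]

def causal_encode(bits):
    # Anytime-style causal encoder: the channel symbol at time k depends
    # only on the bits b_1..b_k seen so far.
    return [tree_label(bits[: k + 1], k) for k in range(len(bits))]

print(causal_encode([1, 0, 1, 1]))
```

The causal property is the point: extending the bit stream never changes symbols already sent, so the encoder never commits to a block boundary.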


Review: universal block codes for compound channels
A single input distribution Q must be chosen for all W ∈ W
Look at the empirical mutual information (EMI) between Z and the candidate codewords Y_m, and choose the codeword with the highest EMI
Why does this work?
The true codeword gives rise to an empirical channel that is like the true channel
There are only polynomially many, (n + 1)^{|Y||Z|}, joint types
The random-coding error exponent E_r(R, Q, W) is achieved
What are the difficulties in generalizing to trees?
The EMI is not additive: the EMI of concatenated segments can differ from the sum of the segments' EMIs
The polynomial term grows with n, not with the delay d, so delays would have to grow longer as time goes on
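The non-additivity of EMI is easy to see concretely. In the sketch below (illustrative code), each half of a sequence pair has zero empirical mutual information, while their concatenation has EMI of a full ln 2 nats.

```python
import numpy as np
from collections import Counter

def empirical_mi(xs, ys):
    # Empirical mutual information (in nats) of the joint type of (xs, ys)
    n = len(xs)
    joint, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * np.log((c / n) * n * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

# Each half is constant, hence has zero EMI, yet the whole has EMI = ln 2:
print(empirical_mi([0, 0], [0, 0]), empirical_mi([1, 1], [1, 1]))
print(empirical_mi([0, 0, 1, 1], [0, 0, 1, 1]))
```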

Sequential suffix-EMI decoding
[Figure: code tree with, at each stage, the codeword suffixes corresponding to one bit value highlighted in blue.]
Decode sequentially, using maximum empirical mutual information (EMI) comparisons at each stage:
If a blue codeword has the max EMI, decode the first bit to 1, else to 0
If a blue codeword suffix has the max EMI, decode the second bit to 1, else to 0
If a blue codeword suffix has the max EMI, decode the third bit to 1, else to 0

The universal anytime coding theorem
Given a rate R > 0 and a compound discrete memoryless channel W, there exists a random anytime code such that for all E < E_any,univ(R) there is a constant K > 0 such that Pr[B_{n−d} ≠ B̂_{n−d}] ≤ K 2^{−dE} for all n, d, where
E_any,univ(R) = sup_Q inf_{W ∈ W} inf_{P,V} [ D(P × V ‖ Q × W) + max{0, I(P, V) − R} ] = sup_Q inf_{W ∈ W} E_r(R, Q, W)
No feedback is needed
Does as well as could be hoped for: essentially hits E_r(R)


Back to the stabilization problem
Can use sup_Q inf_{W ∈ W} E_r(R, Q, W) to evaluate compound channels W
Picking Q is unavoidable: it determines both the codebook and the capacity
But evaluating this bound can be difficult
Is there a simpler condition that depends only on the capacity C(W)?
Gallager's Exercise 5.23 tells us: E_r(R, Q, W) ≥ (I(Q, W) − R)² / (8/e² + 4 (ln|Z|)²)
Translated to η-stabilization (in nats): C(W) − ln λ > √( (8/e² + 4 (ln|Z|)²) η ln λ )
Says that O(√(η ln λ)) extra capacity suffices to get the η-th moment stabilized
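Under the quadratic lower bound on E_r, the sufficient capacity margin can be computed directly: requiring E_r(ln λ) > η ln λ with E_r(R) ≥ (C − R)²/K gives C − ln λ > √(K η ln λ). A sketch (assuming the nats form of the bound; the function name is mine):

```python
import math

def extra_capacity_needed(eta, lam, out_alphabet_size):
    # Capacity margin (in nats) sufficient for eta-moment stabilization,
    # from E_r(R) >= (C - R)^2 / K with K = 8/e**2 + 4*(ln|Z|)**2
    # (the Gallager Exercise 5.23 bound, as used on the slide above).
    K = 8.0 / math.e**2 + 4.0 * math.log(out_alphabet_size) ** 2
    return math.sqrt(K * eta * math.log(lam))

# e.g. second moment (eta=2), lambda=2, binary output alphabet
print(extra_capacity_needed(2.0, 2.0, 2))
```

The margin grows only like the square root of η ln λ, which is the point of the sufficient condition.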

Visualizing the new universal sufficient condition
[Figure: error exponent (base e) versus rate (in nats).]
[Figure: moments stabilized versus open-loop unstable gain.]

Conclusion
Random anytime codes exist for compound channels, so systems can be stabilized over uncertain channels
Can operate with a coarse description of the channel uncertainty: the channel input distribution Q, the resulting capacity C, and a set-size proxy, the output alphabet size |Z|
There is a performance loss for coarse channel uncertainty: the bounds should be improved, and feedback should be used