Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory

Vincent Y. F. Tan (joint work with Silas L. Fong)
National University of Singapore (NUS)
2016 International Zurich Seminar on Communications

Gaussian Poincaré Inequality

Theorem. For $Z^n \sim$ i.i.d. $\mathcal{N}(0,1)$ and any differentiable mapping $f$ such that $\mathbb{E}[(f(Z^n))^2] < \infty$ and $\mathbb{E}[\|\nabla f(Z^n)\|^2] < \infty$, we have
$$\mathrm{var}[f(Z^n)] \le \mathbb{E}\big[\|\nabla f(Z^n)\|^2\big].$$

This controls the variance of $f(Z^n)$, a function of i.i.d. random variables, in terms of the gradient of $f$. Using the Gaussian Poincaré inequality with an appropriate choice of $f$, one obtains $\mathrm{var}[f(Z^n)] = O(n)$.
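As a quick numerical sanity check (not part of the original slides), a minimal Python sketch can compare a Monte Carlo estimate of $\mathrm{var}[f(Z^n)]$ with $\mathbb{E}[\|\nabla f(Z^n)\|^2]$; the test function $f(z) = \sum_i \sin(z_i)$ is an assumption chosen only because its gradient is available in closed form.

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 50, 200000

    # f(z) = sum_i sin(z_i); its gradient is (cos(z_1), ..., cos(z_n)).
    Z = rng.standard_normal((trials, n))
    f_vals = np.sin(Z).sum(axis=1)
    grad_sq = (np.cos(Z) ** 2).sum(axis=1)

    lhs = f_vals.var()           # Monte Carlo estimate of var[f(Z^n)]
    rhs = grad_sq.mean()         # Monte Carlo estimate of E[||grad f(Z^n)||^2]
    print(lhs, rhs, lhs <= rhs)  # Poincare inequality predicts lhs <= rhs

With this choice both sides scale linearly in $n$ (roughly $0.43n$ versus $0.57n$), illustrating the $\mathrm{var}[f(Z^n)] = O(n)$ behaviour mentioned above.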

Gaussian Poincaré Inequality in Shannon Theory

Polyanskiy and Verdú (2014) bounded the KL divergence between the empirical output distribution of AWGN channel codes, $P_{Y^n}$, and the $n$-fold product of the capacity-achieving output distribution (CAOD) $P_Y^*$, i.e., $D(P_{Y^n} \,\|\, (P_Y^*)^n)$.

Often we need to bound the variance of certain log-likelihood ratios (dispersion).

We demonstrate the utility of the Gaussian Poincaré inequality by establishing
- a strong converse for the Gaussian broadcast channel;
- properties of the empirical output distribution of delay-limited codes for quasi-static fading channels.

Gaussian Broadcast Channel

Assume $g_1 = g_2 = 1$ and $\sigma_2^2 > \sigma_1^2$.

The input $X^n$ must satisfy the power constraint
$$\|X^n\|_2^2 = \sum_{i=1}^{n} X_i^2 \le nP.$$

Capacity Region

An $(n, M_{1n}, M_{2n}, \varepsilon_n)$-code consists of
- an encoder $f: \{1,\dots,M_{1n}\} \times \{1,\dots,M_{2n}\} \to \mathbb{R}^n$ such that the power constraint is satisfied;
- two decoders $\varphi_j: \mathbb{R}^n \to \{1,\dots,M_{jn}\}$ for $j = 1, 2$;
such that the average error probability $P_e^{(n)} := \Pr(\hat{W}_1 \ne W_1 \text{ or } \hat{W}_2 \ne W_2) \le \varepsilon_n$.

$(R_1, R_2)$ is achievable if there exists a sequence of $(n, M_{1n}, M_{2n}, \varepsilon_n)$-codes such that
$$\liminf_{n\to\infty} \frac{1}{n} \log M_{jn} \ge R_j, \quad j = 1, 2, \qquad \text{and} \qquad \lim_{n\to\infty} \varepsilon_n = 0.$$

The capacity region $\mathcal{C}$ is the set of all achievable rate pairs.

Capacity Region

Cover (1972) and Bergmans (1974) showed that
$$\mathcal{C} = \mathcal{R}_{\mathrm{BC}} = \bigcup_{\alpha \in [0,1]} \mathcal{R}(\alpha)$$
where
$$\mathcal{R}(\alpha) = \Big\{ (R_1, R_2) : R_1 \le C\Big(\frac{\alpha P}{\sigma_1^2}\Big),\ R_2 \le C\Big(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\Big) \Big\}$$
and $C(x) = \frac{1}{2}\log(1+x)$.

Direct part: random coding + superposition coding.
Converse part: Fano's inequality + the entropy power inequality.
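The boundary of $\mathcal{R}_{\mathrm{BC}}$ is easy to trace numerically by sweeping the power-split parameter $\alpha$. A minimal Python sketch follows; the values of P, sigma1_sq and sigma2_sq (and the choice of base-2 logarithms) are illustrative assumptions, not taken from the slides.

    import numpy as np

    def C(x):
        # Gaussian capacity function C(x) = 0.5 * log2(1 + x), in bits per channel use
        return 0.5 * np.log2(1.0 + x)

    P, sigma1_sq, sigma2_sq = 10.0, 1.0, 4.0   # illustrative parameters
    alphas = np.linspace(0.0, 1.0, 101)

    # For each power split alpha, R(alpha) is the rectangle with corner point
    # R1 = C(alpha*P / sigma1^2), R2 = C((1-alpha)*P / (alpha*P + sigma2^2)).
    R1 = C(alphas * P / sigma1_sq)
    R2 = C((1.0 - alphas) * P / (alphas * P + sigma2_sq))

    for a, r1, r2 in zip(alphas[::20], R1[::20], R2[::20]):
        print(f"alpha={a:.1f}  R1={r1:.3f}  R2={r2:.3f}")

Sweeping $\alpha$ from 0 to 1 trades rate from the weaker user (user 2) to the stronger user (user 1), and the printed corner points trace out the boundary of the region.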

Capacity Region

[Figure: the region $\mathcal{R}_{\mathrm{BC}}$ in the $(R_1, R_2)$-plane. Rate pairs inside the region have $P_e^{(n)} \to 0$; the question is whether rate pairs outside the region necessarily have $P_e^{(n)} \to 1$.]

Strong Converse vs Weak Converse

Can we claim that if $(R_1, R_2) \notin \mathcal{C}$, then $P_e^{(n)} \to 1$? Indeed we can: there is a sharp phase transition between what is possible and what is not.

Previously, the strong converse had been established only for the degraded discrete memoryless BC:
- Ahlswede, Gács and Körner (1976) used the blowing-up lemma (BUL); the BUL does not work for continuous alphabets [but see Wu and Özgür (2015)].
- Oohama (2015) uses properties of the Rényi divergence; good bounds between the Rényi divergence $D_\alpha(P\|Q)$ and the relative entropy $D(P\|Q)$ exist for finite alphabets.

ε-capacity Region (R 1, R 2 ) is ε-achievable a sequence of (n, M 1n, M 2n, ε n )-codes s.t. lim inf n 1 n log M jn R j, j = 1, 2, and lim sup ε n ε. n Capacity region C ε is the set of all achievable rate pairs Strong converse holds iff C ε does not depend on ε. We already know that R BC = C 0 C ε Vincent Tan (NUS) Applications of the Poincaré Inequality IZS 2016 9 / 17

Strong Converse

Theorem. The Gaussian BC satisfies the strong converse property:
$$\mathcal{C}_\varepsilon = \mathcal{R}_{\mathrm{BC}}, \quad \forall\, \varepsilon \in [0, 1).$$

Key ideas in the proof:
- Derive an appropriate information spectrum converse bound.
- Use the Gaussian Poincaré inequality to bound the relevant variances.

Weak Converse for the Gaussian BC [Bergmans (1974)]

Step 1: Invoke Fano's inequality to assert that for any sequence of codes with vanishing error probability $\varepsilon_n \to 0$,
$$R_j \le \frac{1}{n} I(W_j; Y_j^n) + o(1), \quad j \in \{1, 2\}.$$

Step 2: Single-letterize and apply the entropy power inequality (EPI):
$$I(W_1; Y_1^n) \le n I(X; Y_1 \mid U) \overset{\mathrm{EPI}}{\le} n C\Big(\frac{\alpha P}{\sigma_1^2}\Big), \qquad I(W_2; Y_2^n) \le n I(U; Y_2) \overset{\mathrm{EPI}}{\le} n C\Big(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\Big).$$
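The EPI step is only asserted above; the following LaTeX sketch records the standard Bergmans-style argument as I understand it (my reconstruction, using the physically degraded representation $Y_2^n = Y_1^n + \tilde{Z}^n$ with $\tilde{Z}_i \sim \mathcal{N}(0, \sigma_2^2 - \sigma_1^2)$ independent of $Y_1^n$):
$$h(Y_1^n \mid W_2) =: \frac{n}{2} \log\big( 2\pi e (\alpha P + \sigma_1^2) \big) \quad \text{for some } \alpha \in [0,1] \text{ (guaranteed by the power constraint)},$$
$$e^{\frac{2}{n} h(Y_2^n \mid W_2)} \overset{\mathrm{EPI}}{\ge} e^{\frac{2}{n} h(Y_1^n \mid W_2)} + e^{\frac{2}{n} h(\tilde{Z}^n)} = 2\pi e (\alpha P + \sigma_1^2) + 2\pi e (\sigma_2^2 - \sigma_1^2) = 2\pi e (\alpha P + \sigma_2^2),$$
$$I(W_2; Y_2^n) = h(Y_2^n) - h(Y_2^n \mid W_2) \le \frac{n}{2} \log \frac{P + \sigma_2^2}{\alpha P + \sigma_2^2} = n\, C\Big( \frac{(1-\alpha)P}{\alpha P + \sigma_2^2} \Big).$$
The bound $I(W_1; Y_1^n) \le n C(\alpha P / \sigma_1^2)$ follows with the same $\alpha$, since $I(W_1; Y_1^n) \le h(Y_1^n \mid W_2) - h(Y_1^n \mid W_1, W_2)$ and $h(Y_1^n \mid W_1, W_2) = \frac{n}{2}\log(2\pi e \sigma_1^2)$.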

Strong Converse for the DM-BC [Ahlswede et al. (1976)]

Step 1: Invoke the blowing-up lemma to assert that for any sequence of codes with non-vanishing error probability $\varepsilon \in [0, 1)$,
$$R_j \le \frac{1}{n} I(W_j; Y_j^n) + o(1), \quad j \in \{1, 2\}.$$

Step 2: Single-letterize:
$$I(W_1; Y_1^n) \le n I(X; Y_1 \mid U), \qquad I(W_2; Y_2^n) \le n I(U; Y_2),$$
where $U_i := (W_2, Y_1^{i-1})$. One also uses the degradedness condition here: $I(W_2, Y_2^{i-1}, Y_1^{i-1}; Y_{2i}) = I(U_i; Y_{2i})$.
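A short LaTeX sketch of how this degradedness identity enters, under the identification $U_i := (W_2, Y_1^{i-1})$ above (this chain is the standard argument, not taken verbatim from the slides):
$$I(W_2; Y_2^n) = \sum_{i=1}^{n} I(W_2; Y_{2i} \mid Y_2^{i-1}) \le \sum_{i=1}^{n} I(W_2, Y_2^{i-1}, Y_1^{i-1}; Y_{2i}) = \sum_{i=1}^{n} I(U_i; Y_{2i}),$$
where the last step is exactly the stated identity: degradedness implies the Markov chain $Y_2^{i-1} - (W_2, Y_1^{i-1}) - Y_{2i}$, so the extra conditioning on $Y_2^{i-1}$ contributes nothing.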

Our Strong Converse Proof for the Gaussian BC

Convert a code defined in terms of the average error probability $\varepsilon$ into one defined in terms of the maximum error probability, without loss in rate [Telatar].

Establish an information spectrum converse bound: for every message pair $(w_1, w_2)$, every code with maximum error probability $\varepsilon$ satisfies a bound of the form
$$\varepsilon \ge \Pr\Big( \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \le n R_1 - \gamma_1(w_1, w_2) \Big) - e^{-\gamma_1(w_1, w_2)} - (\text{correction terms}),$$
where the correction terms involve $2^{n(R_1+R_2)}$, the quantity $\int P(y_1^n)\, P(w_2 \mid w_1, y_1^n)\, \mathrm{d}y_1^n$, and the indicator of an atypicality event. The indicator term is often negligible (by Markov's inequality).

Establish a bound on the coding rate:
$$n R_1 \le \mathbb{E}\Big[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \Big] + \sqrt{\frac{2}{1-\varepsilon}\, \mathrm{var}\Big[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \Big]} + 3 \log n.$$

Our Strong Converse Proof for the Gaussian BC

Use the Gaussian Poincaré inequality, with a careful identification of $f$ and the peak power constraint $\|X^n\|^2 \le nP$, to assert that
$$\mathrm{var}\Big[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \Big] = O(n).$$

Thus we conclude that
$$n R_1 \le I(W_1; Y_1^n) + O(\sqrt{n}), \quad \forall\, \varepsilon \in [0, 1).$$

The backoff from capacity is of order $O(1/\sqrt{n})$, i.e.,
$$\lambda \log M_{1n} + (1-\lambda) \log M_{2n} \le n C_\lambda + O(\sqrt{n}), \qquad \text{where } C_\lambda := \max_{\alpha \in [0,1]} \Big\{ \lambda\, C\Big(\frac{\alpha P}{\sigma_1^2}\Big) + (1-\lambda)\, C\Big(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\Big) \Big\},$$
but nailing down the constant (the dispersion) seems challenging.
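The $O(n)$ variance behaviour is easy to sanity-check numerically in a simplified point-to-point AWGN setting; this is only an illustration, not the $f$ used in the actual proof, and the reference measure is taken to be the $n$-fold CAOD $\mathcal{N}(0, 1+P)^{\otimes n}$ rather than the code-induced output distribution appearing on the slide. The power value and the all-equal codeword with $\|x^n\|^2 = nP$ are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    P, trials = 2.0, 20000

    def llr_variance(n):
        # Fixed codeword with ||x||^2 = n*P (all entries equal, for simplicity).
        x = np.full(n, np.sqrt(P))
        Z = rng.standard_normal((trials, n))
        Y = x + Z
        # log P(Y|x) - log Q(Y), with Q the n-fold CAOD N(0, 1+P).
        log_p = -0.5 * ((Y - x) ** 2).sum(axis=1) - 0.5 * n * np.log(2 * np.pi)
        log_q = -0.5 * (Y ** 2).sum(axis=1) / (1 + P) - 0.5 * n * np.log(2 * np.pi * (1 + P))
        return (log_p - log_q).var()

    for n in (50, 100, 200, 400):
        print(n, llr_variance(n) / n)   # roughly constant in n, i.e. variance = O(n)

The printed ratios stay roughly constant as $n$ grows, which is the linear-in-$n$ variance scaling that the Poincaré inequality delivers in the proof.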

Another Application: Quasi-Static Fading Channels

Consider the channel model
$$Y_i = H X_i + Z_i, \quad i = 1, \dots, n,$$
where the $Z_i$ are independent standard normal random variables and $\mathbb{E}[1/H] \in (0, \infty)$.

If the fading state is $h$ and the message is $w$, the codeword is $f_h(w) \in \mathbb{R}^n$.

Long-term power constraint:
$$\frac{1}{M_n} \sum_{w=1}^{M_n} \int_{\mathbb{R}_+} P_H(h)\, \|f_h(w)\|^2 \, \mathrm{d}h \le nP.$$

Delay-limited capacity [Hanly and Tse (1998)]: the maximum transmission rate under the constraint that the maximal error probability over all fading states $h > 0$ vanishes.
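The delay-limited power $P_{\mathrm{DL}} := P / \mathbb{E}[1/H]$ and the delay-limited capacity $C(P_{\mathrm{DL}})$, both introduced on the next slide, are easy to evaluate numerically. A minimal Python sketch, assuming for illustration a Gamma-distributed gain $H$ with finite $\mathbb{E}[1/H]$ (this fading law, the power budget, and the use of base-2 logarithms are my assumptions, not from the slides):

    import numpy as np

    rng = np.random.default_rng(2)
    P = 4.0                                 # long-term power budget (illustrative)

    # Illustrative fading law: H ~ Gamma(shape=3, scale=1/3), so E[H] = 1 and
    # E[1/H] = 1/(scale*(shape-1)) = 1.5 is finite, as required by the model.
    H = rng.gamma(shape=3.0, scale=1.0 / 3.0, size=1_000_000)

    E_inv_H = np.mean(1.0 / H)              # Monte Carlo estimate of E[1/H]
    P_dl = P / E_inv_H                      # delay-limited power P_DL = P / E[1/H]
    C_dl = 0.5 * np.log2(1.0 + P_dl)        # delay-limited capacity C(P_DL), bits/use

    print(f"E[1/H] ~ {E_inv_H:.3f}, P_DL ~ {P_dl:.3f}, C(P_DL) ~ {C_dl:.3f}")

Fading laws with $\mathbb{E}[1/H] = \infty$ (e.g., very deep fades occurring too often) would drive $P_{\mathrm{DL}}$, and hence the delay-limited capacity, to zero, which is why the finiteness assumption $\mathbb{E}[1/H] \in (0, \infty)$ matters.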

Vanishing Normalized Relative Entropy

The delay-limited capacity [Hanly and Tse (1998)] is $C(P_{\mathrm{DL}})$, where
$$P_{\mathrm{DL}} := \frac{P}{\mathbb{E}[1/H]}.$$

For any sequence of capacity-achieving codes with vanishing maximum error probability,
$$\lim_{n\to\infty} \frac{1}{n} D\big( P_{Y^n} \,\|\, (P_Y^*)^n \big) = 0,$$
where $P_Y^* = \mathcal{N}(0, 1 + P_{\mathrm{DL}})$ is the CAOD.

Every good code is such that the induced output distribution looks like the $n$-fold CAOD. This extends to the case where the error probability is non-vanishing.

Key step: control a variance term,
$$\mathrm{var}\Big[ \log \frac{P_{Y^n \mid X^n, H}(Y^n \mid X^n, h)}{P_{Y^n \mid H}(Y^n \mid h)} \Big] = O(n).$$

Concluding Remarks

The Gaussian Poincaré inequality,
$$\mathrm{var}[f(Z^n)] \le \mathbb{E}\big[\|\nabla f(Z^n)\|^2\big],$$
is useful for Shannon-theoretic problems with uncountable alphabets. It allows us to establish strong converses and properties of good codes by controlling the variance of log-likelihood ratios.

For more details, please see:
- arXiv:1509.01380 (strong converse for the Gaussian broadcast channel)
- arXiv:1510.08544 (empirical output distribution of good codes for fading channels)