Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory
1 Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory

Vincent Y. F. Tan (joint work with Silas L. Fong)
National University of Singapore (NUS)
2016 International Zurich Seminar on Communications (IZS)
2–4 Gaussian Poincaré Inequality

Theorem. For $Z^n$ i.i.d. $\mathcal{N}(0,1)$ and any differentiable mapping $f$ such that $\mathbb{E}[(f(Z^n))^2] < \infty$ and $\mathbb{E}[\|\nabla f(Z^n)\|^2] < \infty$, we have
$$\operatorname{var}[f(Z^n)] \le \mathbb{E}\bigl[\|\nabla f(Z^n)\|^2\bigr].$$

- This controls the variance of $f(Z^n)$, a function of i.i.d. random variables, in terms of the gradient of $f$.
- Using the Gaussian Poincaré inequality with an appropriate $f$, one obtains $\operatorname{var}[f(Z^n)] = O(n)$.
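As a quick numerical illustration (not from the talk), the inequality is easy to check by Monte Carlo for a smooth test function whose gradient is available in closed form, say $f(z) = \sum_i \sin(z_i)$; the sketch below assumes only NumPy, and both sides visibly scale linearly in $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 50_000

# Test function f(z) = sum_i sin(z_i), so grad f(z) = (cos(z_1), ..., cos(z_n)).
Z = rng.standard_normal((trials, n))
f_vals = np.sin(Z).sum(axis=1)
grad_sq = (np.cos(Z) ** 2).sum(axis=1)

lhs = f_vals.var()      # estimates var[f(Z^n)], about 0.432 * n for this f
rhs = grad_sq.mean()    # estimates E[||grad f(Z^n)||^2], about 0.568 * n
print(f"var[f(Z^n)] = {lhs:.1f}  <=  E[||grad f||^2] = {rhs:.1f}")
```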
5–7 Gaussian Poincaré Inequality in Shannon Theory

- Polyanskiy and Verdú (2014) bounded the KL divergence between the empirical output distribution $P_{Y^n}$ of AWGN channel codes and the $n$-fold product of the capacity-achieving output distribution (CAOD) $P_Y^*$, i.e., $D(P_{Y^n} \| (P_Y^*)^n)$.
- One often needs to bound the variance of certain log-likelihood ratios (the dispersion).
- This talk demonstrates the inequality's utility by establishing
  - a strong converse for the Gaussian broadcast channel, and
  - properties of the empirical output distribution of delay-limited codes for quasi-static fading channels.
8–10 Gaussian Broadcast Channel

(Slide figure: the two-receiver Gaussian broadcast channel.)

- Assume $g_1 = g_2 = 1$ and $\sigma_2^2 > \sigma_1^2$.
- The input $X^n$ must satisfy the power constraint
$$\|X^n\|_2^2 = \sum_{i=1}^n X_i^2 \le nP.$$
11–13 Capacity Region

An $(n, M_{1n}, M_{2n}, \varepsilon_n)$-code consists of
- an encoder $f: \{1,\dots,M_{1n}\} \times \{1,\dots,M_{2n}\} \to \mathbb{R}^n$ such that the power constraint is satisfied, and
- two decoders $\varphi_j: \mathbb{R}^n \to \{1,\dots,M_{jn}\}$ for $j = 1, 2$,
such that the average error probability satisfies
$$P_e^{(n)} := \Pr(\hat{W}_1 \ne W_1 \text{ or } \hat{W}_2 \ne W_2) \le \varepsilon_n.$$

$(R_1, R_2)$ is achievable $\iff$ there exists a sequence of $(n, M_{1n}, M_{2n}, \varepsilon_n)$-codes such that
$$\liminf_{n\to\infty} \frac{1}{n} \log M_{jn} \ge R_j, \quad j = 1, 2, \qquad \text{and} \qquad \lim_{n\to\infty} \varepsilon_n = 0.$$

The capacity region $\mathcal{C}$ is the set of all achievable rate pairs.
14–16 Capacity Region

Cover (1972) and Bergmans (1974) showed that
$$\mathcal{C} = \mathcal{R}_{\mathrm{BC}} = \bigcup_{\alpha \in [0,1]} \mathcal{R}(\alpha)$$
where
$$\mathcal{R}(\alpha) = \left\{ (R_1, R_2) : R_1 \le C\!\left(\frac{\alpha P}{\sigma_1^2}\right),\; R_2 \le C\!\left(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\right) \right\}$$
and $C(x) = \frac{1}{2}\log(1 + x)$.

- Direct part: random coding + superposition coding.
- Converse part: Fano's inequality + entropy power inequality.
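To make the region concrete, here is a minimal sketch that traces the boundary of $\mathcal{R}_{\mathrm{BC}}$ by sweeping $\alpha$; the parameter values and the choice of base-2 logarithms (rates in bits per channel use) are mine, for illustration only.

```python
import numpy as np

def C(x):
    # C(x) = 0.5 * log(1 + x); base-2 logs give rates in bits/channel use
    return 0.5 * np.log2(1.0 + x)

P, sigma1_sq, sigma2_sq = 10.0, 1.0, 4.0   # illustrative; requires sigma2^2 > sigma1^2

for alpha in np.linspace(0.0, 1.0, 11):
    R1 = C(alpha * P / sigma1_sq)                      # rate to the stronger receiver
    R2 = C((1 - alpha) * P / (alpha * P + sigma2_sq))  # rate to the weaker receiver
    print(f"alpha={alpha:.1f}  R1={R1:.3f}  R2={R2:.3f}")
```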
17–20 Capacity Region

(Slide figures: rate pairs inside the capacity region have $P_e^{(n)} \to 0$; for rate pairs outside the region, the question is whether $P_e^{(n)} \to 1$.)
21–26 Strong Converse vs Weak Converse

- Can we claim that if $(R_1, R_2) \notin \mathcal{C}$, then $P_e^{(n)} \to 1$? Indeed!
- There is a sharp phase transition between what is possible and what is not.
- Previously, the strong converse had been established only for the degraded discrete memoryless BC:
  - Ahlswede, Gács and Körner (1976) used the blowing-up lemma (BUL); the BUL does not work for continuous alphabets [but see Wu and Özgür (2015)].
  - Oohama (2015) uses properties of the Rényi divergence; good bounds between the Rényi divergence $D_\alpha(P\|Q)$ and the relative entropy $D(P\|Q)$ exist for finite alphabets.
27 ε-capacity Region (R 1, R 2 ) is ε-achievable a sequence of (n, M 1n, M 2n, ε n )-codes s.t. lim inf n 1 n log M jn R j, j = 1, 2, and lim sup ε n ε. n Capacity region C ε is the set of all achievable rate pairs Strong converse holds iff C ε does not depend on ε. We already know that R BC = C 0 C ε Vincent Tan (NUS) Applications of the Poincaré Inequality IZS / 17
28 Strong Converse

Theorem. The Gaussian BC satisfies the strong converse property:
$$\mathcal{C}_\varepsilon = \mathcal{R}_{\mathrm{BC}}, \quad \forall\, \varepsilon \in [0, 1).$$

Key ideas in the proof:
- Derive an appropriate information-spectrum converse bound.
- Use the Gaussian Poincaré inequality to bound the relevant variances.
29–30 Weak Converse for the GBC [Bergmans (1974)]

Step 1: Invoke Fano's inequality to assert that, for any sequence of codes with vanishing error probability $\varepsilon_n \to 0$,
$$R_j \le \frac{1}{n} I(W_j; Y_j^n) + o(1), \quad j \in \{1, 2\}.$$

Step 2: Single-letterize and apply the entropy power inequality (EPI):
$$I(W_1; Y_1^n) \le n I(X; Y_1 \mid U) \overset{\mathrm{EPI}}{\le} n\,C\!\left(\frac{\alpha P}{\sigma_1^2}\right), \qquad I(W_2; Y_2^n) \le n I(U; Y_2) \overset{\mathrm{EPI}}{\le} n\,C\!\left(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\right).$$
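For completeness, here is a sketch of how the EPI enters; this is my reconstruction of Bergmans' standard argument, not copied from the slides. Choose $\alpha \in [0,1]$ so that $h(Y_1^n \mid W_2) = \frac{n}{2}\log(2\pi e(\alpha P + \sigma_1^2))$, which is possible because this conditional entropy lies between $\frac{n}{2}\log(2\pi e\,\sigma_1^2)$ and $\frac{n}{2}\log(2\pi e(P + \sigma_1^2))$.

```latex
% Private rate: the noise entropy lower-bounds h(Y_1^n | W_1, W_2), so
I(W_1; Y_1^n \mid W_2)
  \le \frac{n}{2}\log\bigl(2\pi e(\alpha P + \sigma_1^2)\bigr)
      - \frac{n}{2}\log\bigl(2\pi e\,\sigma_1^2\bigr)
  = n\,C\!\Bigl(\frac{\alpha P}{\sigma_1^2}\Bigr).
% Common rate: writing Y_2^n = Y_1^n + Z'^n with Z' ~ N(0, \sigma_2^2 - \sigma_1^2)
% (the degraded representation), the conditional EPI gives
h(Y_2^n \mid W_2)
  \ge \frac{n}{2}\log\Bigl( 2^{\frac{2}{n} h(Y_1^n \mid W_2)} + 2\pi e(\sigma_2^2 - \sigma_1^2) \Bigr)
  = \frac{n}{2}\log\bigl(2\pi e(\alpha P + \sigma_2^2)\bigr),
% and hence, using the maximum-entropy bound h(Y_2^n) <= (n/2) log(2 pi e (P + \sigma_2^2)),
I(W_2; Y_2^n)
  \le \frac{n}{2}\log\bigl(2\pi e(P + \sigma_2^2)\bigr) - h(Y_2^n \mid W_2)
  \le n\,C\!\Bigl(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\Bigr).
```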
31–32 Strong Converse for the DM-BC [Ahlswede et al. (1976)]

Step 1: Invoke the blowing-up lemma to assert that, for any sequence of codes with non-vanishing error probability $\varepsilon \in [0, 1)$,
$$R_j \le \frac{1}{n} I(W_j; Y_j^n) + o(1), \quad j \in \{1, 2\}.$$

Step 2: Single-letterize:
$$I(W_1; Y_1^n) \le n I(X; Y_1 \mid U), \qquad I(W_2; Y_2^n) \le n I(U; Y_2),$$
where $U_i := (W_2, Y_1^{i-1})$. One also uses the degradedness condition here:
$$I(W_2, Y_2^{i-1}, Y_1^{i-1}; Y_{2i}) = I(U_i; Y_{2i}).$$
33–36 Our Strong Converse Proof for the Gaussian BC

- Convert a code defined in terms of the average error probability $\varepsilon$ into one defined in terms of the maximal error probability $\varepsilon' =: \varepsilon$, without loss in rate [Telatar].
- Establish an information-spectrum bound: for every $(w_1, w_2)$, every code with maximal error probability $\varepsilon$ satisfies
$$\varepsilon \ge \Pr\!\left( \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \le nR_1 - \gamma_1(w_1, w_2) \right) - n^2 e^{-\gamma_1(w_1, w_2)} - \mathbf{1}\!\left\{ 2^{n(R_1+R_2)} \int_{D_1(w_1)} P(y_1^n)\, P(w_2 \mid w_1, y_1^n)\, \mathrm{d}y_1^n > n^2 \right\}.$$
- The indicator term is often negligible (by Markov's inequality).
- Establish a bound on the coding rate:
$$nR_1 \le \mathbb{E}\!\left[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \right] + \sqrt{\frac{1}{1-\varepsilon}\operatorname{var}\!\left[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \right]} + 3 \log n.$$
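The passage from the spectrum bound to the rate bound is a Chebyshev argument; the following is my reconstruction of the intermediate step (the choice $\gamma_1 = 3\log n$ is inferred from the $3\log n$ term, and the slack in $\delta$ is suppressed).

```latex
% Write L := \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} and pick \gamma_1 = 3\log n,
% so that n^2 e^{-\gamma_1} = 1/n. If nR_1 - 3\log n were at least
% \mathbb{E}[L] + \sqrt{\operatorname{var}[L]/\delta}, Chebyshev's inequality would give
\Pr\bigl( L \le nR_1 - 3\log n \bigr)
  \ge \Pr\Bigl( L \le \mathbb{E}[L] + \sqrt{\operatorname{var}[L]/\delta} \Bigr)
  \ge 1 - \delta,
% so the spectrum bound would force \varepsilon \ge 1 - \delta - 1/n - o(1),
% a contradiction for any fixed \delta < 1 - \varepsilon and n large. Hence
nR_1 \le \mathbb{E}[L] + \sqrt{\frac{\operatorname{var}[L]}{1-\varepsilon}} + 3\log n .
```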
37 Our Strong Converse Proof for the Gaussian BC (cont.)

Use the Gaussian Poincaré inequality, with a careful identification of $f$ and the peak power constraint $\|X^n\|^2 \le nP$, to assert that
$$\operatorname{var}\!\left[ \log \frac{P(Y_1^n \mid w_1)}{P(Y_1^n)} \right] = O(n).$$
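The linear growth of such variances can be seen numerically in a simplified proxy setting (my choices, not the talk's actual construction): fix a single codeword on the power sphere and measure the information density against the $n$-fold CAOD $\mathcal{N}(0, 1+P)^n$ as the reference, rather than the code-induced $P(Y_1^n)$.

```python
import numpy as np

rng = np.random.default_rng(1)
P, trials = 5.0, 10_000   # illustrative SNR; noise variance is 1

for n in (100, 400, 1600):
    x = np.full(n, np.sqrt(P))           # codeword on the power sphere, ||x||^2 = nP
    Y = x + rng.standard_normal((trials, n))
    # information density w.r.t. the n-fold CAOD N(0, 1+P), in nats
    L = (0.5 * n * np.log(1 + P)
         - 0.5 * ((Y - x) ** 2).sum(axis=1)
         + 0.5 * (Y ** 2).sum(axis=1) / (1 + P))
    print(f"n={n:5d}  var[L]/n = {L.var() / n:.4f}")
# The ratio var[L]/n settles near P(P+2)/(2(P+1)^2) (nats^2), i.e., the variance is O(n).
```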
38–39 Our Strong Converse Proof for the Gaussian BC (cont.)

Thus we conclude that
$$nR_1 \le I(W_1; Y_1^n) + O(\sqrt{n}), \quad \forall\, \varepsilon \in [0, 1).$$

The backoff term in the rate is of order $O(1/\sqrt{n})$, i.e.,
$$\lambda \log M_{1n} + (1-\lambda) \log M_{2n} \le n C_\lambda + O(\sqrt{n})$$
where
$$C_\lambda := \max_{\alpha \in [0,1]} \left\{ \lambda\, C\!\left(\frac{\alpha P}{\sigma_1^2}\right) + (1-\lambda)\, C\!\left(\frac{(1-\alpha)P}{\alpha P + \sigma_2^2}\right) \right\},$$
but nailing down the constant (the dispersion) seems challenging.
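The quantity $C_\lambda$ is a one-dimensional maximization and is straightforward to evaluate on a grid; a minimal sketch with illustrative parameters (mine, not from the talk):

```python
import numpy as np

def C(x):
    return 0.5 * np.log2(1.0 + x)   # bits per channel use

def C_lambda(lam, P, s1_sq, s2_sq, grid=10_001):
    # C_lambda = max over alpha of lam*C(alpha*P/s1^2) + (1-lam)*C((1-alpha)*P/(alpha*P + s2^2))
    alpha = np.linspace(0.0, 1.0, grid)
    vals = lam * C(alpha * P / s1_sq) + (1 - lam) * C((1 - alpha) * P / (alpha * P + s2_sq))
    return vals.max()

print(f"C_0.5 = {C_lambda(0.5, P=10.0, s1_sq=1.0, s2_sq=4.0):.4f} bits/use")
```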
40–43 Another Application: Quasi-Static Fading Channels

Consider the channel model
$$Y_i = H X_i + Z_i, \quad i = 1, \dots, n,$$
where the $Z_i$ are independent standard normal random variables and $\mathbb{E}[1/H] \in (0, \infty)$.

- If the fading state is $h$ and the message is $w$, the codeword is $f_h(w) \in \mathbb{R}^n$.
- Long-term power constraint:
$$\frac{1}{M_n} \sum_{w=1}^{M_n} \int_{\mathbb{R}_+} P_H(h)\, \|f_h(w)\|^2 \,\mathrm{d}h \le nP.$$
- We study the delay-limited capacity [Hanly and Tse (1998)], i.e., the maximum transmission rate under the constraint that the maximal error probability over all fading states $h > 0$ vanishes.
44–47 Vanishing Normalized Relative Entropy

The delay-limited capacity [Hanly and Tse (1998)] is $C(P_{\mathrm{DL}})$, where
$$P_{\mathrm{DL}} := \frac{P}{\mathbb{E}[1/H]}.$$

For any sequence of capacity-achieving codes with vanishing maximal error probability,
$$\lim_{n\to\infty} \frac{1}{n} D\bigl( P_{Y^n} \,\big\|\, (P_Y^*)^n \bigr) = 0$$
where $P_Y^* = \mathcal{N}(0, 1 + P_{\mathrm{DL}})$.

- Every good code is such that the induced output distribution looks like the $n$-fold CAOD.
- We extend this to the case where the error probability is non-vanishing.
- The key is to control a variance term:
$$\operatorname{var}\!\left[ \log \frac{P_{Y^n \mid X^n, H}(Y^n \mid X^n, h)}{P_{Y^n \mid H}(Y^n \mid h)} \right] = O(n).$$
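The delay-limited operating point is easy to compute once the fading distribution is fixed; a minimal sketch with a two-state fading law of my choosing (illustrative only):

```python
import numpy as np

# Two-state fading example: H = 0.5 or 2.0 with equal probability.
h_vals = np.array([0.5, 2.0])
probs  = np.array([0.5, 0.5])
P = 10.0

E_inv_H = (probs / h_vals).sum()      # E[1/H] = 1.25 here
P_DL = P / E_inv_H                    # delay-limited SNR, P / E[1/H]
C_DL = 0.5 * np.log2(1.0 + P_DL)      # delay-limited capacity C(P_DL), bits/use
print(f"E[1/H] = {E_inv_H:.3f}, P_DL = {P_DL:.3f}, C(P_DL) = {C_DL:.3f}")
# Note: the condition E[1/H] < infinity is essential; when 1/H is not integrable,
# P_DL = 0 and the delay-limited capacity vanishes.
```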
48–50 Concluding Remarks

- The Gaussian Poincaré inequality,
$$\operatorname{var}[f(Z^n)] \le \mathbb{E}\bigl[\|\nabla f(Z^n)\|^2\bigr],$$
is useful for Shannon-theoretic problems with uncountable alphabets.
- It allows us to establish strong converses and properties of good codes by controlling the variance of log-likelihood ratios.
- For more details, please see the arXiv preprints on the strong converse for the Gaussian broadcast channel and on the empirical output distribution of good codes for fading channels.