Refined Bounds on the Empirical Distribution of Good Channel Codes via Concentration Inequalities
Slide 1: Refined Bounds on the Empirical Distribution of Good Channel Codes via Concentration Inequalities

Maxim Raginsky and Igal Sason
ISIT 2013, Istanbul, Turkey
Slide 2: Capacity-Achieving Channel Codes

The set-up:
- DMC $T : \mathcal{X} \to \mathcal{Y}$ with capacity $C = C(T) = \max_{P_X} I(X; Y)$.
- An $(n, M)$-code is a pair $\mathcal{C} = (f, g)$ with encoder $f : \{1, \dots, M\} \to \mathcal{X}^n$ and decoder $g : \mathcal{Y}^n \to \{1, \dots, M\}$.
- Capacity-achieving codes: a sequence $\{\mathcal{C}_n\}_{n=1}^{\infty}$, where each $\mathcal{C}_n$ is an $(n, M_n)$-code, is capacity-achieving if $\lim_{n \to \infty} \frac{1}{n} \log M_n = C$.
Slide 3: Capacity-Achieving Channel Codes

Capacity-achieving input and output distributions:
- $P_X^* \in \arg\max_{P_X} I(X; Y)$ (may not be unique)
- $P_X^* \to T \to P_Y^*$ (always unique)

Theorem (Shamai–Verdú, 1997). Let $\{\mathcal{C}_n\}$ be any capacity-achieving code sequence with vanishing error probability. Then
$$\lim_{n \to \infty} \frac{1}{n} D\big(P_{Y^n}^{(\mathcal{C}_n)} \,\big\|\, P_{Y^n}^*\big) = 0,$$
where $P_{Y^n}^{(\mathcal{C}_n)}$ is the output distribution induced by the code $\mathcal{C}_n$ when the messages in $\{1, \dots, M_n\}$ are equiprobable.
Slide 4: Capacity-Achieving Channel Codes

$$\lim_{n \to \infty} \frac{1}{n} D\big(P_{Y^n}^{(\mathcal{C}_n)} \,\big\|\, P_{Y^n}^*\big) = 0$$

Main message: channel output sequences induced by good codes resemble i.i.d. sequences drawn from the capacity-achieving output distribution (CAOD) $P_Y^*$.

Useful implications:
- estimate performance characteristics of good channel codes by their expectations w.r.t. $P_{Y^n}^* = (P_Y^*)^{\otimes n}$, which are often much easier to compute explicitly;
- bound the estimation accuracy using large-deviations theory (e.g., Sanov's theorem).

Question: what about good codes with nonvanishing error probability?
Slide 5: Codes with Nonvanishing Error Probability

Y. Polyanskiy and S. Verdú, "Empirical distribution of good channel codes with non-vanishing error probability" (2012):

1. Let $\mathcal{C} = (f, g)$ be any $(n, M, \varepsilon)$-code for $T$, i.e.,
$$\max_{1 \le j \le M} \Pr\big(g(Y^n) \ne j \,\big|\, X^n = f(j)\big) \le \varepsilon.$$
Then
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + o(n).$$
2. If $\{\mathcal{C}_n\}_{n=1}^{\infty}$ is a capacity-achieving sequence, where each $\mathcal{C}_n$ is an $(n, M_n, \varepsilon)$-code for some fixed $\varepsilon > 0$, then
$$\lim_{n \to \infty} \frac{1}{n} D\big(P_{Y^n}^{(\mathcal{C}_n)} \,\big\|\, P_{Y^n}^*\big) = 0.$$

In some cases, the $o(n)$ term can be improved to $O(\sqrt{n})$.
Slide 6: Codes with Nonvanishing Error Probability

$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + o(n)$$

The same message: channel output sequences induced by good codes resemble i.i.d. sequences drawn from $P_Y^*$.

Main technical tool: concentration of measure.

Our contribution: a sharpening of the Polyanskiy–Verdú bounds by identifying explicit expressions for the $o(n)$ term.
Slide 7: Preliminaries on Concentration of Measure

Let $Z_1, \dots, Z_n \in \mathcal{Z}$ be independent random variables. We seek tight bounds on the deviation probabilities $\Pr(f(Z^n) \ge r)$ for $r > 0$, where $f : \mathcal{Z}^n \to \mathbb{R}$ is some function with $\mathbb{E}[f(Z^n)] = 0$.

Subgaussian tails:
$$\log \mathbb{E}\big[e^{t f(Z^n)}\big] \le \frac{\kappa t^2}{2}, \;\; \forall\, t > 0 \quad\Longrightarrow\quad \Pr\big(f(Z^n) \ge r\big) \le \exp\Big(-\frac{r^2}{2\kappa}\Big), \;\; \forall\, r > 0.$$
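For completeness, the implication is the standard Chernoff argument (a textbook step, not specific to this talk): for every $t > 0$,
$$\Pr\big(f(Z^n) \ge r\big) \le e^{-tr}\, \mathbb{E}\big[e^{t f(Z^n)}\big] \le \exp\Big(\frac{\kappa t^2}{2} - t r\Big),$$
and optimizing over $t$ (namely $t = r/\kappa$) gives the stated tail $\exp(-r^2/(2\kappa))$.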
Slide 8: Preliminaries on Concentration of Measure

Suppose that $\mathcal{Z}^n$ is a metric space with metric $d(\cdot, \cdot)$.

$L^1$ Wasserstein distance: for any $\mu, \nu \in \mathcal{P}(\mathcal{Z}^n)$,
$$W_1(\mu, \nu) \triangleq \inf_{Z^n \sim \mu,\; \bar{Z}^n \sim \nu} \mathbb{E}\big[d(Z^n, \bar{Z}^n)\big].$$

$L^1$ transportation-cost inequalities (Marton): $\mu \in \mathcal{P}(\mathcal{Z}^n)$ satisfies a $T_1(c)$ inequality if
$$W_1(\mu, \nu) \le \sqrt{2 c\, D(\nu \,\|\, \mu)}, \quad \forall\, \nu.$$

$T_1(c)$ implies concentration!
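A familiar single-letter special case (a standard fact, added here for orientation): with the trivial metric $d(z, \bar{z}) = \mathbf{1}\{z \ne \bar{z}\}$, the $W_1$ distance reduces to total variation, and $T_1(1/4)$ is exactly Pinsker's inequality:
$$W_1(\mu, \nu) = \|\mu - \nu\|_{\mathrm{TV}} \le \sqrt{\tfrac{1}{2}\, D(\nu \,\|\, \mu)}.$$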
Slide 9: Preliminaries on Concentration of Measure

$L^1$ transportation-cost inequalities (Marton): $\mu \in \mathcal{P}(\mathcal{Z}^n)$ satisfies a $T_1(c)$ inequality if
$$W_1(\mu, \nu) \le \sqrt{2 c\, D(\nu \,\|\, \mu)}, \quad \forall\, \nu.$$

Theorem (Bobkov–Götze, 1999). A probability measure $\mu \in \mathcal{P}(\mathcal{Z}^n)$ satisfies $T_1(c)$ if and only if
$$\log \mathbb{E}_\mu\big[e^{t f(Z^n)}\big] \le \frac{c t^2}{2}, \quad \forall\, t,$$
for all $f$ with $\mathbb{E}_\mu[f(Z^n)] = 0$ and
$$\|f\|_{\mathrm{Lip}} \triangleq \sup_{z^n \ne \bar{z}^n} \frac{|f(z^n) - f(\bar{z}^n)|}{d(z^n, \bar{z}^n)} \le 1.$$
Slide 10: Preliminaries on Concentration of Measure

Endow $\mathcal{Z}^n$ with the weighted Hamming metric
$$d(z^n, \bar{z}^n) = \sum_{i=1}^n c_i\, \mathbf{1}\{z_i \ne \bar{z}_i\}$$
for some fixed $c_1, \dots, c_n > 0$.

Marton's coupling argument: any product measure $\mu = \mu_1 \otimes \dots \otimes \mu_n \in \mathcal{P}(\mathcal{Z}^n)$ satisfies $T_1(c)$ (relative to $d$) with
$$c = \frac{1}{4} \sum_{i=1}^n c_i^2.$$

By Bobkov–Götze, this is equivalent to the subgaussian property
$$\log \mathbb{E}_\mu\big[e^{t f(Z^n)}\big] \le \frac{t^2}{8} \sum_{i=1}^n c_i^2$$
for any $f : \mathcal{Z}^n \to \mathbb{R}$ with $\mathbb{E}_\mu[f] = 0$ and $\|f\|_{\mathrm{Lip}} \le 1$ (another way to derive McDiarmid's inequality).
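As a quick numerical sanity check of the resulting McDiarmid-type tail, the following sketch (my own illustration, with hypothetical parameters, not code from the talk) compares observed tail frequencies of an empirical mean with the bound $\exp(-2nr^2)$:

```python
import numpy as np

# Numerical check of the McDiarmid-type bound: for f(Z^n) = (1/n) * sum_i Z_i
# with Z_i ~ Bernoulli(1/2) i.i.d., f has bounded differences c_i = 1/n, so
# the subgaussian property above gives
#   P(f - E[f] >= r) <= exp(-2 r^2 / sum_i c_i^2) = exp(-2 n r^2).

rng = np.random.default_rng(0)
n, trials = 100, 200_000
samples = rng.integers(0, 2, size=(trials, n)).mean(axis=1)  # one f(Z^n) per row

for r in (0.05, 0.10, 0.15):
    empirical = np.mean(samples - 0.5 >= r)   # observed upper-tail frequency
    bound = np.exp(-2 * n * r**2)             # McDiarmid / subgaussian bound
    print(f"r={r:.2f}: empirical {empirical:.5f} <= bound {bound:.5f}")
```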
Slide 11: Relative Entropy at the Output of a Code

Consider a DMC $T : \mathcal{X} \to \mathcal{Y}$ with $T(\cdot\,|\,\cdot) > 0$, and let
$$c(T) \triangleq 2 \max_{x \in \mathcal{X}} \max_{y, y' \in \mathcal{Y}} \ln \frac{T(y\,|\,x)}{T(y'\,|\,x)}.$$

Theorem. Any $(n, M, \varepsilon)$-code $\mathcal{C}$ for $T$, where $\varepsilon \in (0, 1/2)$, satisfies
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + \log\frac{1}{\varepsilon} + c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}.$$

Remark: Polyanskiy and Verdú show that
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + a\sqrt{n}$$
for some constant $a = a(\varepsilon)$.
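To see the bound in action, here is a sketch of my own (not the authors' code) that evaluates it for a binary symmetric channel BSC($\delta$), $\delta < 1/2$, where both $c(T)$ and the capacity are available in closed form; all logarithms are natural, so quantities come out in nats:

```python
import numpy as np

# Evaluate the divergence bound of the theorem above for a BSC(delta).

def bsc_constants(delta: float):
    T = np.array([[1 - delta, delta],
                  [delta, 1 - delta]])            # T[x, y] = T(y|x)
    # c(T) = 2 max_x max_{y,y'} ln( T(y|x) / T(y'|x) )
    c_T = 2 * max(np.log(row.max() / row.min()) for row in T)
    h = -delta * np.log(delta) - (1 - delta) * np.log(1 - delta)
    C = np.log(2) - h                             # BSC capacity (nats)
    return C, c_T

def divergence_bound(n: int, ln_M: float, eps: float, delta: float) -> float:
    """Upper bound on D(P_{Y^n}^{(C)} || P*_{Y^n}) for an (n, M, eps)-code,
    valid for eps in (0, 1/2)."""
    C, c_T = bsc_constants(delta)
    return (n * C - ln_M + np.log(1 / eps)
            + c_T * np.sqrt(0.5 * n * np.log(1 / (1 - 2 * eps))))

C, c_T = bsc_constants(0.11)
n = 1000
for frac in (0.90, 0.99):                         # rate as a fraction of C
    print(f"rate={frac:.2f}*C: D <= {divergence_bound(n, frac * C * n, 0.01, 0.11):.1f} nats")
```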
Slide 12: Proof Idea, Part I

Fix $x^n \in \mathcal{X}^n$ and study the concentration of the function
$$h_{x^n}(y^n) \triangleq \log \frac{dP_{Y^n | X^n = x^n}}{dP_{Y^n}^{(\mathcal{C})}}(y^n)$$
around its expectation w.r.t. $P_{Y^n | X^n = x^n}$:
$$\mathbb{E}\big[h_{x^n}(Y^n) \,\big|\, X^n = x^n\big] = D\big(P_{Y^n | X^n = x^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})}\big).$$

Step 1: Because $T(\cdot\,|\,\cdot) > 0$, the function $h_{x^n}(y^n)$ is 1-Lipschitz w.r.t. the scaled Hamming metric
$$d(y^n, \bar{y}^n) = c(T) \sum_{i=1}^n \mathbf{1}\{y_i \ne \bar{y}_i\}.$$
Slide 13: Proof Idea, Part II

Step 1 (recap): Because $T(\cdot\,|\,\cdot) > 0$, the function $h_{x^n}(y^n)$ is 1-Lipschitz w.r.t. the scaled Hamming metric $d(y^n, \bar{y}^n) = c(T) \sum_{i=1}^n \mathbf{1}\{y_i \ne \bar{y}_i\}$.

Step 2: Any product probability measure $\mu$ on $(\mathcal{Y}^n, d)$ satisfies
$$\log \mathbb{E}_\mu\big[e^{t f(Y^n)}\big] \le \frac{n\, c(T)^2\, t^2}{8}$$
for any $f$ with $\mathbb{E}_\mu[f] = 0$ and $\|f\|_{\mathrm{Lip}} \le 1$.

Proof: tensorization of the Csiszár–Kullback–Pinsker inequality, followed by an appeal to Bobkov–Götze.
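Step 2 can be verified by brute force on a toy alphabet; the sketch below (my illustration, with made-up constants, and `c` standing in for $c(T)$) computes the exact log-MGF of a centered 1-Lipschitz function under a product measure and checks it against $n c^2 t^2 / 8$:

```python
import itertools
import numpy as np

# Brute-force check of Step 2: for a product measure mu on Y^n with the
# scaled Hamming metric d(y, ybar) = c * #{i : y_i != ybar_i}, every
# centered 1-Lipschitz f obeys  log E_mu[exp(t f)] <= n c^2 t^2 / 8.

c, n = 1.7, 3                               # c stands in for c(T)
p = np.array([0.3, 0.7])                    # per-letter marginal on Y = {0, 1}
ys = list(itertools.product([0, 1], repeat=n))

rng = np.random.default_rng(1)
a = rng.uniform(-1, 1, size=n)              # |a_i| <= 1 keeps f 1-Lipschitz in d
f = np.array([c * sum(a[i] * y[i] for i in range(n)) for y in ys])
mu = np.array([np.prod([p[yi] for yi in y]) for y in ys])   # product measure
f = f - mu @ f                              # center so that E_mu[f] = 0

for t in (0.5, 1.0, 2.0):
    lhs = np.log(mu @ np.exp(t * f))        # exact log-MGF (2^n atoms)
    rhs = n * c**2 * t**2 / 8
    print(f"t={t}: log-MGF {lhs:.4f} <= bound {rhs:.4f}")
```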
Slide 14: Proof Idea, Part III

Recall
$$h_{x^n}(y^n) = \log \frac{dP_{Y^n | X^n = x^n}}{dP_{Y^n}^{(\mathcal{C})}}(y^n), \qquad \mathbb{E}\big[h_{x^n}(Y^n) \,\big|\, X^n = x^n\big] = D\big(P_{Y^n | X^n = x^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})}\big).$$

Step 3: For any $x^n$, $\mu = P_{Y^n | X^n = x^n}$ is a product measure, so
$$\Pr\Big(h_{x^n}(Y^n) \ge D\big(P_{Y^n | X^n = x^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})}\big) + r\Big) \le \exp\Big(-\frac{2 r^2}{n\, c(T)^2}\Big).$$
Use this with $r = c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}$:
$$\Pr\Big(h_{x^n}(Y^n) \ge D\big(P_{Y^n | X^n = x^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})}\big) + c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}\Big) \le 1 - 2\varepsilon.$$

Remark: Polyanskiy and Verdú show $\mathrm{Var}\big[h_{x^n}(Y^n) \,\big|\, X^n = x^n\big] = O(n)$.
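The choice of $r$ in Step 3 is simply the radius at which the subgaussian tail equals $1 - 2\varepsilon$:
$$\exp\Big(-\frac{2 r^2}{n\, c(T)^2}\Big) = 1 - 2\varepsilon \quad\Longleftrightarrow\quad r = c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}.$$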
Slide 15: Proof Idea, Part IV

Recall:
$$\Pr\Big(h_{x^n}(Y^n) \ge D\big(P_{Y^n | X^n = x^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})}\big) + c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}\Big) \le 1 - 2\varepsilon.$$

Step 4: As in Polyanskiy–Verdú, appeal to Augustin's strong converse to get
$$\log M \le \log\frac{1}{\varepsilon} + D\big(P_{Y^n | X^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})} \,\big|\, P_{X^n}^{(\mathcal{C})}\big) + c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}}.$$
Then
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) = D\big(P_{Y^n | X^n} \,\big\|\, P_{Y^n}^* \,\big|\, P_{X^n}^{(\mathcal{C})}\big) - D\big(P_{Y^n | X^n} \,\big\|\, P_{Y^n}^{(\mathcal{C})} \,\big|\, P_{X^n}^{(\mathcal{C})}\big) \le nC - \log M + \log\frac{1}{\varepsilon} + c(T) \sqrt{\frac{n}{2} \log\frac{1}{1 - 2\varepsilon}},$$
where the last step uses $D\big(P_{Y^n | X^n} \,\big\|\, P_{Y^n}^* \,\big|\, P_{X^n}^{(\mathcal{C})}\big) \le nC$, since $D\big(T(\cdot\,|\,x) \,\big\|\, P_Y^*\big) \le C$ for every $x \in \mathcal{X}$.
Slide 16: Relative Entropy at the Output of a Code

Theorem. Let $T : \mathcal{X} \to \mathcal{Y}$ be a DMC with $C > 0$. Then, for any $0 < \varepsilon < 1$, any $(n, M, \varepsilon)$-code $\mathcal{C}$ for $T$ satisfies
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + \sqrt{2n}\,(\log n)^{3/2} \left(\log|\mathcal{Y}| + \frac{\log\big(2\,|\mathcal{X}|\,|\mathcal{Y}|^2\big)}{\log n} + \frac{1}{\log n} \log\frac{1}{\varepsilon}\right).$$

Remark: Polyanskiy and Verdú show that
$$D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) \le nC - \log M + b\sqrt{n}\,(\log n)^{3/2}$$
for some constant $b > 0$.
Slide 17: Concentration of Lipschitz Functions

Theorem. Let $T : \mathcal{X} \to \mathcal{Y}$ be a DMC with $c(T) < \infty$. Let $d : \mathcal{Y}^n \times \mathcal{Y}^n \to \mathbb{R}^+$ be a metric, and suppose that $P_{Y^n | X^n = x^n}$, for all $x^n \in \mathcal{X}^n$, as well as $P_{Y^n}^*$, satisfy $T_1(c)$ for some $c > 0$. Then, for any $\varepsilon \in (0, 1/2)$, any $(n, M, \varepsilon)$-code $\mathcal{C}$ for $T$, and any function $f : \mathcal{Y}^n \to \mathbb{R}$, we have
$$P_{Y^n}^{(\mathcal{C})}\Big(\big|f(Y^n) - \mathbb{E}[f(\bar{Y}^n)]\big| \ge r\Big) \le \frac{4}{\varepsilon} \exp\Bigg(nC - \ln M + a\sqrt{n} - \frac{r^2}{8 c\, \|f\|_{\mathrm{Lip}}^2}\Bigg), \quad r \ge 0,$$
where $\bar{Y}^n \sim P_{Y^n}^*$ and $a \triangleq c(T) \sqrt{\frac{1}{2} \ln\frac{1}{1 - 2\varepsilon}}$.
Slide 18: Proof Idea

Step 1: For each $x^n \in \mathcal{X}^n$, let $\phi(x^n) \triangleq \mathbb{E}[f(Y^n) \,|\, X^n = x^n]$. Then, by Bobkov–Götze,
$$\Pr\Big(\big|f(Y^n) - \phi(x^n)\big| \ge r \,\Big|\, X^n = x^n\Big) \le 2 \exp\Bigg(-\frac{r^2}{2 c\, \|f\|_{\mathrm{Lip}}^2}\Bigg).$$

Step 2: By restricting to a subcode $\mathcal{C}'$ with codewords $x^n \in \mathcal{X}^n$ satisfying $\phi(x^n) \ge \mathbb{E}[f(\bar{Y}^n)] + r$, we can show that
$$\frac{r}{\|f\|_{\mathrm{Lip}}} \le \sqrt{2 c \Big(nC - \log M' + a\sqrt{n} + \log\frac{1}{\varepsilon}\Big)},$$
with $M' = M \cdot P_{X^n}^{(\mathcal{C})}\big(\phi(X^n) \ge \mathbb{E}[f(\bar{Y}^n)] + r\big)$. Solve to get
$$P_{X^n}^{(\mathcal{C})}\Big(\phi(X^n) \ge \mathbb{E}[f(\bar{Y}^n)] + r\Big) \le 2 \exp\Bigg(nC - \log M + a\sqrt{n} + \log\frac{1}{\varepsilon} - \frac{r^2}{2 c\, \|f\|_{\mathrm{Lip}}^2}\Bigg).$$

Step 3: Apply the union bound.
Slide 19: Empirical Averages at the Code Output

Equip $\mathcal{Y}^n$ with the Hamming metric
$$d(y^n, \bar{y}^n) = \sum_{i=1}^n \mathbf{1}\{y_i \ne \bar{y}_i\}.$$
Consider functions of the form
$$f(y^n) = \frac{1}{n} \sum_{i=1}^n f_i(y_i), \quad\text{where } |f_i(y_i) - f_i(\bar{y}_i)| \le L\, \mathbf{1}\{y_i \ne \bar{y}_i\} \text{ for all } i, y_i, \bar{y}_i.$$
Then $\|f\|_{\mathrm{Lip}} \le L/n$.

Since $P_{Y^n | X^n = x^n}$ for all $x^n$ and $P_{Y^n}^*$ are product measures on $\mathcal{Y}^n$, they all satisfy $T_1(n/4)$ (by tensorization).

Therefore, for any $(n, M, \varepsilon)$-code and any such $f$ we have
$$P_{Y^n}^{(\mathcal{C})}\Big(\big|f(Y^n) - \mathbb{E}[f(\bar{Y}^n)]\big| \ge r\Big) \le \frac{4}{\varepsilon} \exp\Bigg(nC - \log M + a\sqrt{n} - \frac{n r^2}{2 L^2}\Bigg).$$
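To get a feel for the numbers, one can invert this display for the smallest deviation $r$ at which the bound drops below a target $\delta$; the sketch below does this (my own illustration with hypothetical channel parameters, natural logs throughout):

```python
import numpy as np

# Invert the display above: setting
#   (4/eps) * exp(n*C - ln M + a*sqrt(n) - n r^2 / (2 L^2)) <= delta
# and solving for r gives the deviation radius below, where
#   a = c(T) * sqrt(0.5 * ln(1/(1 - 2*eps))).

def deviation_radius(n, C, ln_M, eps, c_T, L, delta):
    a = c_T * np.sqrt(0.5 * np.log(1 / (1 - 2 * eps)))
    slack = n * C - ln_M + a * np.sqrt(n) + np.log(4 / (eps * delta))
    return L * np.sqrt(2 * max(slack, 0.0) / n)

# Example: a code 1% below the capacity C = 0.5 nat of some hypothetical DMC.
n, C, eps = 10_000, 0.5, 0.01
r = deviation_radius(n, C, ln_M=0.99 * C * n, eps=eps, c_T=2.0, L=1.0, delta=1e-3)
print(f"empirical averages fall within r ~ {r:.4f} of the CAOD mean, w.p. >= 0.999")
```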
Slide 20: Operational Significance

A bound of the form
$$C(T) - \frac{\log M}{n} \ge \frac{1}{n} D\big(P_{Y^n}^{(\mathcal{C})} \,\big\|\, P_{Y^n}^*\big) - \frac{a(T, \varepsilon)}{\sqrt{n}}$$
quantifies the trade-off between the minimal blocklength required to achieve a given gap (in rate) to capacity at a fixed block error probability $\varepsilon$, and the normalized divergence between the output distribution induced by the code and the (unique) CAOD of the channel.

We have identified the precise dependence of $a(T, \varepsilon)$ on the channel $T$ and on the block error probability $\varepsilon$.

These results are similar in spirit to a lower bound on the rate loss w.r.t. fully random block codes (whose average distance spectrum is binomially distributed) in terms of the normalized divergence between the distance spectrum of a given code and the binomial distribution (Shamai–Sason, 2002).
Slide 21

Concentration of measure is a powerful tool for studying the nonasymptotic behavior of stochastic objects in information theory!

For more information, see M. Raginsky and I. Sason, "Concentration of Measure Inequalities in Information Theory, Communications, and Coding," available on arXiv.
Slide 22: That's All, Folks!