Correlation Detection and an Operational Interpretation of the Rényi Mutual Information

1 Correlation Detection and an Operational Interpretation of the Rényi Mutual Information
Masahito Hayashi¹ and Marco Tomamichel²
¹ Graduate School of Mathematics, Nagoya University, and Centre for Quantum Technologies, National University of Singapore
² School of Physics, The University of Sydney
ISIT 2015, Hong Kong (arXiv: )

2 Outline and Motivation
Rényi entropy and divergence (Rényi '61) have found various applications in information theory: e.g. error exponents for hypothesis testing and channel coding, cryptography, the Honey-Do problem, etc.
Conditional Rényi entropy and Rényi mutual information are less well understood. Mathematical properties of different proposed definitions have recently been investigated; see, e.g., Fehr-Berens (TIT '14) or Verdú (ITA '15), and many works in the quantum setting.
We want to find an operational interpretation of these measures. 2 / 21

3 Mutual Information
Two discrete random variables $(X, Y) \sim P_{XY}$. Many expressions for the mutual information are available:
$I(X : Y) = H(X) + H(Y) - H(XY)$   (1)
$= H(X) - H(X|Y)$   (2)
$= D(P_{XY} \| P_X \times P_Y)$   (3)
$= \min_{Q_Y} D(P_{XY} \| P_X \times Q_Y)$   (4)
$= \min_{Q_X, Q_Y} D(P_{XY} \| Q_X \times Q_Y)$.   (5)
Which one to generalize? 3 / 21
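
As a quick sanity check of how these expressions relate, here is a small numerical sketch (my own, not part of the slides) showing that (1), (3) and (4) coincide for a toy joint pmf; the pmf, helper names, and the grid minimization over $Q_Y$ are illustrative choices.

```python
# Minimal sketch: expressions (1), (3) and (4) agree for a toy joint pmf.
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])                  # joint pmf, rows = x, cols = y
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)  # marginals

def H(p):                                      # Shannon entropy (nats)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def D(p, q):                                   # KL divergence D(p || q)
    m = p > 0
    return np.sum(p[m] * np.log(p[m] / q[m]))

I1 = H(P_X) + H(P_Y) - H(P_XY.flatten())                  # expression (1)
I3 = D(P_XY.flatten(), np.outer(P_X, P_Y).flatten())      # expression (3)
I4 = min(D(P_XY.flatten(), np.outer(P_X, [q, 1 - q]).flatten())
         for q in np.linspace(1e-6, 1 - 1e-6, 20001))     # expression (4), grid over Q_Y
print(I1, I3, I4)                              # all agree up to grid resolution
```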

4 Rényi Mutual Information
Two discrete random variables $(X, Y) \sim P_{XY}$. Many expressions for the mutual information are available:
1. $I_\alpha(X : Y) = H_\alpha(X) + H_\alpha(Y) - H_\alpha(XY)$   (1)
2. $I_\alpha(X : Y) = H_\alpha(X) - H_\alpha(X|Y)$   (2)
3. $I_\alpha(X : Y) = D_\alpha(P_{XY} \| P_X \times P_Y)$   (3)
4. $I_\alpha(X : Y) = \min_{Q_Y} D_\alpha(P_{XY} \| P_X \times Q_Y)$   (4)
5. $I_\alpha(X : Y) = \min_{Q_X, Q_Y} D_\alpha(P_{XY} \| Q_X \times Q_Y)$.   (5)
We want the mutual information to be non-negative!
We want it to be non-increasing under local processing! 4 / 21

7 Rényi Mutual Information
Two discrete random variables $(X, Y) \sim P_{XY}$. Many expressions for the mutual information are available:
1. $I_\alpha(X : Y) = H_\alpha(X) + H_\alpha(Y) - H_\alpha(XY)$   (1)
2. $I_\alpha(X : Y) = H_\alpha(X) - H_\alpha(X|Y)$   (2)
3. $I_\alpha(X : Y) = D_\alpha(P_{XY} \| P_X \times P_Y)$   (3)
4. $I_\alpha(X : Y) = \min_{Q_Y} D_\alpha(P_{XY} \| P_X \times Q_Y)$   (4)
5. $I_\alpha(X : Y) = \min_{Q_X, Q_Y} D_\alpha(P_{XY} \| Q_X \times Q_Y)$.   (5)
We want the mutual information to be non-negative!
We want it to be non-increasing under local processing!
This is Sibson's proposal (definition (4)). 7 / 21

8 Rényi Entropy and Divergence
For two pmfs $P_X$ and $Q_X$, the Rényi divergence is defined as
$D_\alpha(P_X \| Q_X) = \frac{1}{\alpha - 1} \log \left( \sum_x P_X(x)^\alpha\, Q_X(x)^{1-\alpha} \right)$
for any $\alpha \in (0, 1) \cup (1, \infty)$, and as a limit for $\alpha \in \{0, 1, \infty\}$.
Monotonicity: for $\alpha \le \beta$, we have $D_\alpha(P_X \| Q_X) \le D_\beta(P_X \| Q_X)$.
Kullback-Leibler divergence: $\lim_{\alpha \to 1} D_\alpha(P_X \| Q_X) = D(P_X \| Q_X) = \sum_x P_X(x) \log \frac{P_X(x)}{Q_X(x)}$.
Data-processing inequality (DPI): for any channel $W$, we have $D_\alpha(P_X \| Q_X) \ge D_\alpha(P_X W \| Q_X W)$.
8 / 21
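
The following short sketch (my own, not part of the slides) checks the two stated properties, monotonicity in $\alpha$ and the DPI, on randomly drawn pmfs and a random channel; all names and the choice of test distributions are illustrative.

```python
# Sketch: numerically check monotonicity in alpha and the DPI for D_alpha.
import numpy as np

def renyi_div(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats, alpha in (0,1) or (1,inf)."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(4))                  # strictly positive pmfs
q = rng.dirichlet(np.ones(4))

# Monotonicity: D_alpha is non-decreasing in alpha.
alphas = [0.3, 0.5, 0.9, 1.5, 2.0, 5.0]
vals = [renyi_div(p, q, a) for a in alphas]
assert all(v1 <= v2 + 1e-12 for v1, v2 in zip(vals, vals[1:]))

# DPI: applying a channel W (here a random 4-input, 3-output one) cannot increase D_alpha.
W = rng.dirichlet(np.ones(3), size=4)
for a in alphas:
    assert renyi_div(p @ W, q @ W, a) <= renyi_div(p, q, a) + 1e-12
print("monotonicity and DPI hold on this example")
```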

9 Rényi Mutual Information
Recall: $I_\alpha(X : Y) = \min_{Q_Y} D_\alpha(P_{XY} \| P_X \times Q_Y)$
Inherits monotonicity and DPI from the divergence. We have $\lim_{\alpha \to 1} I_\alpha(X : Y) = I(X : Y)$.
Sibson's identity (Sibson '69): the minimizer satisfies
$Q^*_Y(y)^\alpha \propto \sum_x P_X(x)\, P_{Y|X}(y|x)^\alpha$, and
$I_\alpha(X : Y) = \frac{\alpha}{\alpha - 1} \log \sum_y \left( \sum_x P_X(x)\, P_{Y|X}(y|x)^\alpha \right)^{1/\alpha}$.
Additivity: for $(X_1, X_2, Y_1, Y_2) \sim P_{X_1 Y_1} \times P_{X_2 Y_2}$ independent:
$I_\alpha(X_1 X_2 : Y_1 Y_2) = I_\alpha(X_1 : Y_1) + I_\alpha(X_2 : Y_2)$.
9 / 21
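
Sibson's identity is easy to confirm numerically. The sketch below (my own; the pmf, grid resolution, and $\alpha = 2$ are arbitrary illustrative choices) compares the closed form against a brute-force minimization of $D_\alpha(P_{XY} \| P_X \times Q_Y)$ over $Q_Y$ for a binary $Y$.

```python
# Sketch: Sibson's closed form vs. brute-force minimization over Q_Y (binary Y).
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X = P_XY.sum(axis=1)
P_YgX = P_XY / P_X[:, None]                    # conditional pmf P_{Y|X}(y|x)
alpha = 2.0

# Closed form: alpha/(alpha-1) * log sum_y ( sum_x P_X(x) P_{Y|X}(y|x)^alpha )^{1/alpha}
A = (P_X[:, None] * P_YgX**alpha).sum(axis=0)
I_closed = alpha / (alpha - 1) * np.log(np.sum(A**(1 / alpha)))

def D_alpha(p, q, a):
    return np.log(np.sum(p**a * q**(1 - a))) / (a - 1)

I_brute = min(D_alpha(P_XY.flatten(), np.outer(P_X, [qy, 1 - qy]).flatten(), alpha)
              for qy in np.linspace(1e-6, 1 - 1e-6, 20001))
print(I_closed, I_brute)                       # agree up to the grid resolution
```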

10 Correlation Detection and One-Shot Converse
Correlation Detection: given a pmf $P_{XY}$, consider
Null Hypothesis: $(X, Y) \sim P_{XY}$
Alternative Hypothesis: $X \sim P_X$ independent of $Y$
For a test $T_{Z|XY}$ with $Z \in \{0, 1\}$ define the errors
$\alpha(T) = \Pr[Z = 1]$, where $(X, Y, Z) \sim P_{XY} \times T_{Z|XY}$,
$\beta(T) = \max_{Q_Y} \Pr[Z = 0]$, where $(X, Y, Z) \sim P_X \times Q_Y \times T_{Z|XY}$.
The one-shot (meta-)converse can be stated in terms of this composite hypothesis testing problem (Polyanskiy '13). Any code on $W_{Y|X}$ with input distribution $P_X$ using $M$ codewords and average error $\varepsilon$ satisfies (with $P_{XY} = P_X \times W_{Y|X}$):
$M \le \frac{1}{\hat\beta(\varepsilon)}, \qquad \hat\beta(\varepsilon) = \min \big\{ \beta(T) : T_{Z|XY} \text{ s.t. } \alpha(T) \le \varepsilon \big\}$.
10 / 21
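
To make the two error quantities concrete, here is a toy sketch (my own; the test threshold and pmf are arbitrary) that evaluates $\alpha(T)$ and $\beta(T)$ for a deterministic likelihood-ratio test. Since $\Pr[Z = 0]$ is linear in $Q_Y$, the maximum over $Q_Y$ is attained at a point mass, so it suffices to check each $y$.

```python
# Toy sketch: alpha(T) and beta(T) for a deterministic likelihood-ratio test.
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)
lam = 0.05                                       # arbitrary threshold

llr = np.log(P_XY / np.outer(P_X, P_Y))          # log P_XY(x,y) / (P_X(x) P_Y(y))
Z_is_1 = llr <= lam                              # test output Z as a function of (x, y)

alpha_T = P_XY[Z_is_1].sum()                     # Pr[Z = 1] under the null P_XY

# beta(T): maximize Pr[Z = 0] under P_X x Q_Y over Q_Y; linear in Q_Y,
# so the maximum is attained at a point mass on some y.
beta_T = max((P_X * (~Z_is_1)[:, y]).sum() for y in range(P_XY.shape[1]))
print(alpha_T, beta_T)
```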

11 Asymptotic Correlation Detection
Consider the asymptotics $n \to \infty$ for the sequence of problems
Null Hypothesis: $(X^n, Y^n) \sim P_{XY}^{\times n}$
Alternative Hypothesis: $X^n \sim P_X^{\times n}$ independent of $Y^n$
For a test $T^n_{Z|X^n Y^n}$ with $Z \in \{0, 1\}$ define the errors
$\alpha(T^n) = \Pr[Z = 1]$, where $(X^n, Y^n, Z) \sim P_{XY}^{\times n} \times T^n_{Z|X^n Y^n}$,
$\beta(T^n) = \max_{Q_{Y^n}} \Pr[Z = 0]$, where $(X^n, Y^n, Z) \sim P_X^{\times n} \times Q_{Y^n} \times T^n_{Z|X^n Y^n}$.
Define the minimal error for fixed rate $R > 0$:
$\hat\alpha(R; n) = \min \big\{ \alpha(T^n) : T^n_{Z|X^n Y^n} \text{ s.t. } \beta(T^n) \le \exp(-nR) \big\}$.
11 / 21

12 Error Exponents (Hoeffding)
Recall: $I_s(X : Y) = \min_{Q_Y} D_s(P_{XY} \| P_X \times Q_Y)$ and
$\hat\alpha(R; n) = \min \big\{ \alpha(T^n) : T^n_{Z|X^n Y^n} \text{ s.t. } \beta(T^n) \le \exp(-nR) \big\}$.
Result (Error Exponent)
For any $R > 0$, we have
$\lim_{n \to \infty} \left\{ -\frac{1}{n} \log \hat\alpha(R; n) \right\} = \sup_{s \in (0,1)} \left\{ \frac{1 - s}{s} \big( I_s(X : Y) - R \big) \right\}$.
If $R \ge I(X : Y)$ this evaluates to 0, else it is positive. $I(X : Y)$ is the critical rate (cf. Stein's Lemma).
If $R < I_0(X : Y)$ it diverges to $+\infty$. This is the zero-error regime.
12 / 21
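
A numeric sketch of the exponent (mine, not from the slides): evaluate $I_s$ via Sibson's closed form and take the supremum over a grid of $s \in (0, 1)$. Rates below $I(X : Y)$ give a strictly positive exponent; for $R \ge I(X : Y)$ the supremum is approached as $s \to 1$ and equals 0.

```python
# Sketch: evaluate sup_{s in (0,1)} (1-s)/s * (I_s(X:Y) - R) on a grid of s.
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)
P_YgX = P_XY / P_X[:, None]

def I_s(s):                                     # Sibson's closed form, s != 1
    A = (P_X[:, None] * P_YgX**s).sum(axis=0)
    return s / (s - 1) * np.log(np.sum(A**(1 / s)))

I_1 = np.sum(P_XY * np.log(P_XY / np.outer(P_X, P_Y)))     # ordinary mutual information

def error_exponent(R):
    s_grid = np.linspace(1e-3, 1 - 1e-3, 2000)
    # The supremum is >= 0 (take s -> 1); clip to 0 for the R >= I(X:Y) case.
    return max(0.0, max((1 - s) / s * (I_s(s) - R) for s in s_grid))

print(error_exponent(0.5 * I_1))                # positive: R below the critical rate
print(error_exponent(2.0 * I_1))                # zero: R above the critical rate
```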

13 Strong Converse Exponents (Han-Kobayashi)
Recall: $I_s(X : Y) = \min_{Q_Y} D_s(P_{XY} \| P_X \times Q_Y)$ and
$\hat\alpha(R; n) = \min \big\{ \alpha(T^n) : T^n_{Z|X^n Y^n} \text{ s.t. } \beta(T^n) \le \exp(-nR) \big\}$.
Result (Strong Converse Exponent)
For any $0 < R < I_\infty(X : Y)$, we have
$\lim_{n \to \infty} \left\{ -\frac{1}{n} \log \big( 1 - \hat\alpha(R; n) \big) \right\} = \sup_{s > 1} \left\{ \frac{s - 1}{s} \big( R - I_s(X : Y) \big) \right\}$.
If $R \le I(X : Y)$ this evaluates to 0, otherwise it is positive. This implies the strong converse to Stein's Lemma.
What if $R = I(X : Y)$?
13 / 21

14 Second Order Expansion
For small deviations $r$ from the rate $R$, define
$\hat\alpha(R, r; n) = \min \big\{ \alpha(T^n) : T^n_{Z|X^n Y^n} \text{ s.t. } \beta(T^n) \le \exp(-nR - \sqrt{n}\, r) \big\}$.
Result (Second Order Expansion)
For any $r \in \mathbb{R}$, we have
$\lim_{n \to \infty} \hat\alpha\big( I(X : Y), r; n \big) = \Phi\!\left( \frac{r}{\sqrt{V(X : Y)}} \right)$.
$\Phi$ is the cumulative distribution function of a standard Gaussian.
$V(X : Y) = V(P_{XY} \| P_X \times P_Y)$, where $V(\cdot\|\cdot)$ is the divergence variance, and
$\frac{\mathrm{d}}{\mathrm{d}s} I_s(X : Y) \Big|_{s=1} = \frac{1}{2} V(X : Y)$.
14 / 21
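
The sketch below (my own) computes the divergence variance $V(X : Y)$ for a toy pmf, evaluates the Gaussian limit $\Phi(r/\sqrt{V})$, and checks the stated derivative relation at $s = 1$ by a central difference; the pmf and the choice $r = 0.5$ are illustrative.

```python
# Sketch: divergence variance V(X:Y), Gaussian limit, and d/ds I_s at s = 1.
import numpy as np
from scipy.stats import norm

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)
P_YgX = P_XY / P_X[:, None]

llr = np.log(P_XY / np.outer(P_X, P_Y))
I = np.sum(P_XY * llr)                          # mutual information
V = np.sum(P_XY * (llr - I)**2)                 # divergence variance V(X:Y)

r = 0.5
print(norm.cdf(r / np.sqrt(V)))                 # limiting value of alpha-hat(I(X:Y), r; n)

def I_s(s):                                     # Sibson's closed form, s != 1
    A = (P_X[:, None] * P_YgX**s).sum(axis=0)
    return s / (s - 1) * np.log(np.sum(A**(1 / s)))

h = 1e-4
print((I_s(1 + h) - I_s(1 - h)) / (2 * h), V / 2)   # central difference vs. V/2
```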

15 Universal Distribution
For every $n$, consider the universal pmf (Hayashi '09)
$T^n_{Y^n}(y^n) = \sum_{\lambda \in \mathcal{P}_n(\mathcal{Y})} \frac{1}{|\mathcal{P}_n(\mathcal{Y})|}\, U_\lambda(y^n)$,
where $U_\lambda$ is the uniform distribution over the type class of $\lambda$.
Every $S_n$-invariant pmf $Q_{Y^n}$ satisfies $Q_{Y^n}(y^n) \le |\mathcal{P}_n(\mathcal{Y})|\, T^n_{Y^n}(y^n)$ for all $y^n$.
Main idea: test $P_{XY}^{\times n}$ vs. $P_X^{\times n} \times T^n_{Y^n}$.
Lemma
For any joint pmf $P_{XY}$, the universal pmf satisfies
$D_\alpha\big( P_{XY}^{\times n} \,\big\|\, P_X^{\times n} \times T^n_{Y^n} \big) = n I_\alpha(X : Y) + O(\log n)$.
15 / 21
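
For a binary alphabet the universal pmf is very concrete: a type is determined by the number of ones, and $U_\lambda$ is uniform on each type class. The sketch below (mine, not from the slides) builds $T^n_{Y^n}$ for a small $n$ and verifies the bound $Q_{Y^n}(y^n) \le |\mathcal{P}_n(\mathcal{Y})|\, T^n_{Y^n}(y^n)$ for the exchangeable choice $Q_{Y^n} = P_Y^{\times n}$.

```python
# Sketch: the universal pmf for binary Y and small n, and the invariance bound.
from itertools import product
from math import comb

n = 6
p1 = 0.3                                        # P_Y(1); P_Y(0) = 1 - p1
num_types = n + 1                               # |P_n(Y)| for a binary alphabet

def T_univ(y):                                  # T^n_{Y^n}(y^n)
    k = sum(y)                                  # the type of y^n is determined by k ones
    return 1.0 / (num_types * comb(n, k))       # (1/|P_n|) * U_lambda(y^n)

def P_iid(y):                                   # an S_n-invariant pmf: P_Y^{x n}
    k = sum(y)
    return p1**k * (1 - p1)**(n - k)

for y in product([0, 1], repeat=n):
    assert P_iid(y) <= num_types * T_univ(y) + 1e-15
print("Q(y^n) <= |P_n(Y)| T^n(y^n) verified for all", 2**n, "sequences")
```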

16 Error Exponent: Achievability (1)
Fix $s \in (0, 1)$ and a sequence $\{\lambda_n\}_n$ to be chosen later. We use Neyman-Pearson tests for $P_{XY}^{\times n}$ vs. $P_X^{\times n} \times T^n_{Y^n}$:
$Z(x^n, y^n) = 1\left\{ \log \frac{P_{XY}^{\times n}(x^n, y^n)}{P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)} \le \lambda_n \right\}$.
Then, with $(X^n, Y^n) \sim P_{XY}^{\times n}$, we have
$\Pr[Z = 1] = \sum_{x^n, y^n} P_{XY}^{\times n}(x^n, y^n)\, 1\left\{ \log \frac{P_{XY}^{\times n}(x^n, y^n)}{P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)} \le \lambda_n \right\}$
$\le \exp\big( (1 - s)\lambda_n \big) \sum_{x^n, y^n} \big( P_{XY}^{\times n}(x^n, y^n) \big)^s \big( P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n) \big)^{1 - s}$
$= \exp\Big( (1 - s)\big( \lambda_n - D_s(P_{XY}^{\times n} \| P_X^{\times n} \times T^n_{Y^n}) \big) \Big)$.
16 / 21
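
The bounding step above is a standard Chernoff-type estimate and can be checked directly. The one-shot sketch below (my own simplification: $n = 1$ and $P_X \times P_Y$ standing in for $P_X^{\times n} \times T^n_{Y^n}$) verifies $\Pr[Z = 1] \le \exp\big((1 - s)(\lambda - D_s(P \| Q))\big)$ for an arbitrary threshold.

```python
# One-shot sketch of the bound Pr[Z=1] <= exp((1-s)(lambda - D_s(P||Q))).
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)
P = P_XY.flatten()                              # null hypothesis
Q = np.outer(P_X, P_Y).flatten()                # stand-in for P_X x T_Y (illustrative)

def D_s(p, q, s):
    return np.log(np.sum(p**s * q**(1 - s))) / (s - 1)

s, lam = 0.5, -0.1                              # arbitrary choices
lhs = P[np.log(P / Q) <= lam].sum()             # Pr[Z = 1] under the null
rhs = np.exp((1 - s) * (lam - D_s(P, Q, s)))    # Chernoff-type bound
print(lhs, rhs)
assert lhs <= rhs + 1e-12
```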

17 Error Exponent: Achievability (2)
And, with $(X^n, Y^n) \sim P_X^{\times n} \times Q_{Y^n}$, we have
$\Pr[Z = 0] = \sum_{x^n, y^n} P_X^{\times n}(x^n)\, Q_{Y^n}(y^n)\, 1\left\{ \log \frac{P_{XY}^{\times n}(x^n, y^n)}{P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)} > \lambda_n \right\}$
$= \sum_{x^n, y^n} P_X^{\times n}(x^n)\, \bar{Q}_{Y^n}(y^n)\, 1\left\{ \log \frac{P_{XY}^{\times n}(x^n, y^n)}{P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)} > \lambda_n \right\}$,
where $\bar{Q}_{Y^n}(y^n) = \frac{1}{|S_n|} \sum_{\pi \in S_n} Q_{Y^n}(\pi(y^n))$ is $S_n$-invariant.
Now we can bring in the universal pmf again:
$\Pr[Z = 0] \le |\mathcal{P}_n(\mathcal{Y})| \sum_{x^n, y^n} P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)\, 1\left\{ \log \frac{P_{XY}^{\times n}(x^n, y^n)}{P_X^{\times n}(x^n)\, T^n_{Y^n}(y^n)} > \lambda_n \right\}$
$\le |\mathcal{P}_n(\mathcal{Y})| \exp\big( -s\lambda_n - (1 - s)\, D_s(P_{XY}^{\times n} \| P_X^{\times n} \times T^n_{Y^n}) \big)$.
Choose $\{\lambda_n\}$ such that $\Pr[Z = 0] \le \exp(-nR)$.
17 / 21

18 Second Order: Achievability
There exists $\{\lambda_n\}_n$ such that
$\Pr[Z = 0] \le \exp\big( -n I(X : Y) - \sqrt{n}\, r \big)$ when $(X^n, Y^n) \sim P_X^{\times n} \times Q_{Y^n}$,
$\Pr[Z = 1] = \Pr[F_n(X^n, Y^n) < r]$ when $(X^n, Y^n) \sim P_{XY}^{\times n}$,
with a new sequence of random variables
$F_n(X^n, Y^n) = \frac{1}{\sqrt{n}} \left( \log \frac{P_{XY}^{\times n}(X^n, Y^n)}{P_X^{\times n}(X^n)\, T^n_{Y^n}(Y^n)} - n I(X : Y) - \log|\mathcal{P}_n(\mathcal{Y})| \right)$.
Asymptotic cumulant generating function:
$\Lambda_F(t) = \lim_{n \to \infty} \log \mathbb{E}[\exp(t F_n)] = \lim_{n \to \infty} \frac{t}{\sqrt{n}} \Big( D_{1 + t/\sqrt{n}}\big( P_{XY}^{\times n} \,\big\|\, P_X^{\times n} \times T^n_{Y^n} \big) - n I(X : Y) \Big) = \frac{t^2}{2} V(P_{XY} \| P_X \times P_Y)$.
$F_n$ converges in distribution to a Gaussian $F$ with variance $V$ (by a variation of Lévy's continuity theorem).
18 / 21
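
A simplified numerical illustration of the cumulant argument (my own; it replaces the universal pmf by the i.i.d. product $P_Y^{\times n}$, so $F_n$ becomes a normalized i.i.d. sum of log-likelihood ratios): the log moment generating function of $F_n$ indeed tends to $t^2 V / 2$.

```python
# Sketch: log E[exp(t F_n)] -> t^2 V / 2 when F_n is a normalized i.i.d. sum.
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
P_X, P_Y = P_XY.sum(axis=1), P_XY.sum(axis=0)
llr = np.log(P_XY / np.outer(P_X, P_Y))
I = np.sum(P_XY * llr)
V = np.sum(P_XY * (llr - I)**2)

t = 1.3
for n in [10, 100, 10_000, 1_000_000]:
    # For i.i.d. summands: log E[exp(t F_n)] = n * log E[exp((t/sqrt(n)) (llr - I))]
    lmgf = n * np.log(np.sum(P_XY * np.exp(t / np.sqrt(n) * (llr - I))))
    print(n, lmgf)
print("t^2 V / 2 =", t**2 * V / 2)
```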

19 Quantum Hypothesis Testing
Given a bipartite quantum state $\rho_{AB}$, consider
Null Hypothesis: state is $\rho_{AB}$
Alternative Hypothesis: state is $\rho_A \otimes \sigma_B$ for some state $\sigma_B$
Using the same notation:
$\lim_{n \to \infty} \left\{ -\frac{1}{n} \log \hat\alpha(R; n) \right\} = \sup_{s \in (0,1)} \left\{ \frac{1 - s}{s} \big( \bar{I}_s(A : B) - R \big) \right\}$,
$\lim_{n \to \infty} \left\{ -\frac{1}{n} \log \big( 1 - \hat\alpha(R; n) \big) \right\} = \sup_{s > 1} \left\{ \frac{s - 1}{s} \big( R - \tilde{I}_s(A : B) \big) \right\}$.
The definitions are similar,
$\bar{I}_s(A : B) = \min_{\sigma_B} \bar{D}_s(\rho_{AB} \| \rho_A \otimes \sigma_B)$ and $\tilde{I}_s(A : B) = \min_{\sigma_B} \tilde{D}_s(\rho_{AB} \| \rho_A \otimes \sigma_B)$,
but $\bar{D}_s$ and $\tilde{D}_s$ are different!
19 / 21

20 Two Quantum Rényi Divergences
[Figure: the two divergences $\bar{D}_s(\rho \| \sigma)$ and $\tilde{D}_s(\rho \| \sigma)$ plotted as functions of $s$, with the relative entropy $D(\rho \| \sigma)$ marked.]
They agree with the classical quantity for commuting states.
$\bar{D}_s(\rho \| \sigma) = \frac{1}{s - 1} \log \mathrm{tr}\big( \rho^s \sigma^{1 - s} \big), \qquad \tilde{D}_s(\rho \| \sigma) = \frac{1}{s - 1} \log \mathrm{tr}\Big( \big( \sigma^{\frac{1 - s}{2s}}\, \rho\, \sigma^{\frac{1 - s}{2s}} \big)^s \Big)$.
20 / 21
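
The two definitions are straightforward to evaluate for small density matrices. The sketch below (my own; the random-state construction and dimensions are arbitrary) checks that they coincide for commuting states and that, on the examples tried, the sandwiched divergence never exceeds the Petz one, consistent with the Araki-Lieb-Thirring ordering.

```python
# Sketch: Petz vs. sandwiched Rényi divergence for 2x2 density matrices.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def petz(rho, sigma, s):
    return np.log(np.trace(mpow(rho, s) @ mpow(sigma, 1 - s)).real) / (s - 1)

def sandwiched(rho, sigma, s):
    m = mpow(sigma, (1 - s) / (2 * s))
    return np.log(np.trace(mpow(m @ rho @ m, s)).real) / (s - 1)

def rand_state(rng, d=2):                       # random full-rank density matrix
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(1)
rho, sigma = rand_state(rng), rand_state(rng)
rho_c, sigma_c = np.diag([0.7, 0.3]), np.diag([0.4, 0.6])   # a commuting pair

for s in [0.5, 2.0]:
    assert np.isclose(petz(rho_c, sigma_c, s), sandwiched(rho_c, sigma_c, s))
    assert sandwiched(rho, sigma, s) <= petz(rho, sigma, s) + 1e-10
print("commuting case agrees; sandwiched <= Petz on these examples")
```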

21 Summary and Outlook
Correlation detection gives operational meaning to
$I_\alpha(X : Y) = \min_{Q_Y} D_\alpha(P_{XY} \| P_X \times Q_Y)$.
Similarly, Arimoto's conditional Rényi entropy
$H_\alpha(X|Y) = \log|\mathcal{X}| - \min_{Q_Y} D_\alpha(P_{XY} \| U_X \times Q_Y)$
has an operational interpretation:
Null Hypothesis: $(X, Y) \sim P_{XY}$
Alternative Hypothesis: $X \sim U_X$ uniform and independent of $Y$
Does the symmetric quantity $\min_{Q_X, Q_Y} D_\alpha(P_{XY} \| Q_X \times Q_Y)$ have a natural operational interpretation?
21 / 21
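
As a last numeric sanity check (my own, under the identification of $H_\alpha(X|Y)$ with Arimoto's closed form $\frac{\alpha}{1-\alpha}\log\sum_y\big(\sum_x P_{XY}(x,y)^\alpha\big)^{1/\alpha}$), the divergence expression above matches that closed form for a toy pmf.

```python
# Sketch: Arimoto's closed form vs. log|X| - min_{Q_Y} D_alpha(P_XY || U_X x Q_Y).
import numpy as np

P_XY = np.array([[0.3, 0.1],
                 [0.1, 0.5]])
alpha = 2.0
nX = P_XY.shape[0]

# Arimoto's closed form for the conditional Rényi entropy
H_arimoto = alpha / (1 - alpha) * np.log(np.sum((P_XY**alpha).sum(axis=0)**(1 / alpha)))

def D_alpha(p, q, a):
    return np.log(np.sum(p**a * q**(1 - a))) / (a - 1)

U_X = np.full(nX, 1.0 / nX)
best = min(D_alpha(P_XY.flatten(), np.outer(U_X, [q, 1 - q]).flatten(), alpha)
           for q in np.linspace(1e-6, 1 - 1e-6, 20001))
print(H_arimoto, np.log(nX) - best)             # should agree up to grid resolution
```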
