A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks

1 A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks
Marco Tomamichel (CQT, National University of Singapore) and Masahito Hayashi (Graduate School of Mathematics, Nagoya University), arXiv:1208.1478.
Also discussing results of: Second Order Asymptotics for Quantum Hypothesis Testing, Ke Li (arXiv preprint).
Cambridge, January 2013

2 Outline
1. Hypothesis Testing and $\beta^\varepsilon$
2. Main Result: Asymptotic Expansion of $\beta^\varepsilon$ and Smooth Entropies
3. Two Applications: Randomness Extraction against Quantum Side Information, and Data Compression with Quantum Side Information
4. Finite Block Length Results
5. Main Proof Ideas
6. Conclusion and Discussion

3 Quantum Hypothesis Testing
Given one out of two quantum states, either $\rho$ or $\sigma$, we must decide which one we received. For a given test POVM $\{Q, 1-Q\}$ with $0 \le Q \le 1$, the errors of the first and second kind are
$$\alpha_\rho(Q) = \operatorname{tr}\bigl(\rho(1-Q)\bigr) \quad\text{and}\quad \beta_\sigma(Q) = \operatorname{tr}(\sigma Q),$$
respectively. We are interested in the minimal $\beta$ that can be achieved if $\alpha$ is required to be smaller than a given constant $\varepsilon$, i.e. the SDP
$$\beta^\varepsilon_{\rho,\sigma} := \min_{\substack{0 \le Q \le 1 \\ \alpha_\rho(Q) \le \varepsilon}} \beta_\sigma(Q) = \min_{\substack{0 \le Q \le 1 \\ \operatorname{tr}(\rho Q) \ge 1-\varepsilon}} \operatorname{tr}(\sigma Q).$$
Alternatively, one may consider the divergence
$$D_h^\varepsilon(\rho\|\sigma) := -\log \frac{\beta^\varepsilon_{\rho,\sigma}}{1-\varepsilon}, \qquad 0 < \varepsilon < 1.$$
(The additive normalization $\log(1-\varepsilon)$ ensures that $\rho = \sigma \Rightarrow D_h^\varepsilon(\rho\|\sigma) = 0$.)
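Since $\beta^\varepsilon_{\rho,\sigma}$ is defined by a semidefinite program, it can be evaluated directly for small systems. The following is a minimal numerical sketch, not part of the talk; it assumes base-2 logarithms, density matrices given as numpy arrays, and cvxpy with its default SDP solver.

```python
import numpy as np
import cvxpy as cp

def beta_eps(rho, sigma, eps):
    """Solve the SDP  min tr(sigma Q)  s.t.  0 <= Q <= 1  and  tr(rho Q) >= 1 - eps."""
    d = rho.shape[0]
    Q = cp.Variable((d, d), hermitian=True)
    constraints = [Q >> 0,
                   np.eye(d) - Q >> 0,
                   cp.real(cp.trace(rho @ Q)) >= 1 - eps]
    problem = cp.Problem(cp.Minimize(cp.real(cp.trace(sigma @ Q))), constraints)
    problem.solve()
    return problem.value

# Toy example: a pure qubit state tested against the maximally mixed state.
rho = np.array([[1, 0], [0, 0]], dtype=complex)
sigma = np.eye(2, dtype=complex) / 2
eps = 0.05
beta = beta_eps(rho, sigma, eps)
D_h = -np.log2(beta / (1 - eps))   # D_h^eps(rho||sigma) with the normalization above
print(beta, D_h)
```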

4 I.i.d. Asymptotic Expansion of $D_h^\varepsilon$
Given $n$ copies of either $\sigma$ or $\rho$, a quantum generalization of Stein's Lemma (Hiai & Petz '91) and its strong converse (Ogawa & Nagaoka '00) imply
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) = n D(\rho\|\sigma) + o(n).$$
This was recently improved (Audenaert, Mosonyi & Verstraete '12) to
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \le n D(\rho\|\sigma) + O\bigl(\sqrt{n}\bigr) \quad\text{and}\quad D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \ge n D(\rho\|\sigma) - O\bigl(\sqrt{n}\bigr)$$
by giving explicit upper and lower bounds. However, the terms proportional to $\sqrt{n}$ in the upper and lower bounds are different. (They do not have the same sign!)
Our goal is to investigate the second order term, $O(\sqrt{n})$.

5 Main Result
Theorem. For two states $\rho, \sigma$ with $\operatorname{supp}\{\sigma\} \supseteq \operatorname{supp}\{\rho\}$, and $0 < \varepsilon < 1$, we find
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \le n D(\rho\|\sigma) + \sqrt{n V(\rho\|\sigma)}\,\Phi^{-1}(\varepsilon) + 2\log n + O(1), \quad\text{and}$$
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \ge n D(\rho\|\sigma) + \sqrt{n V(\rho\|\sigma)}\,\Phi^{-1}(\varepsilon) - O(1).$$
$D$ and $V$ are the mean and variance of $\log\rho - \log\sigma$ under $\rho$, i.e.
$$V(\rho\|\sigma) := \operatorname{tr}\Bigl(\rho\bigl(\log\rho - \log\sigma - D(\rho\|\sigma)\bigr)^2\Bigr).$$
$\Phi$ is the cumulative normal distribution function, and $\Phi^{-1}(\varepsilon)$ is its inverse evaluated at $\varepsilon$.
The bounds given here are due to Li. A similar result was independently derived by T&H.
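Evaluating the leading terms of this expansion only requires $D(\rho\|\sigma)$, $V(\rho\|\sigma)$ and the inverse normal CDF. The following is a small illustrative sketch, not from the talk; it assumes base-2 logarithms and full-rank states so that the matrix logarithms are well defined, and the function names are my own.

```python
import numpy as np
from scipy.linalg import logm
from scipy.stats import norm

def relative_entropy_moments(rho, sigma):
    """Return D(rho||sigma) and V(rho||sigma) for full-rank density matrices (base 2)."""
    L = (logm(rho) - logm(sigma)) / np.log(2)      # log(rho) - log(sigma) in base 2
    D = np.real(np.trace(rho @ L))
    M = L - D * np.eye(rho.shape[0])
    V = np.real(np.trace(rho @ M @ M))
    return D, V

def second_order_dh(rho, sigma, n, eps):
    """n*D + sqrt(n*V)*Phi^{-1}(eps); the O(log n) and O(1) terms are dropped."""
    D, V = relative_entropy_moments(rho, sigma)
    return n * D + np.sqrt(n * V) * norm.ppf(eps)

# Example with two full-rank qubit states.
rho = np.array([[0.9, 0.1], [0.1, 0.1]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
print(second_order_dh(rho, sigma, n=1000, eps=0.05))
```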

6 Main Result
Theorem. For two states $\rho, \sigma$ with $\operatorname{supp}\{\sigma\} \supseteq \operatorname{supp}\{\rho\}$, and $0 < \varepsilon < 1$, we find
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \le n D(\rho\|\sigma) + \sqrt{n V(\rho\|\sigma)}\,\Phi^{-1}(\varepsilon) + 2\log n + O(1), \quad\text{and}$$
$$D_h^\varepsilon\bigl(\rho^{\otimes n} \big\| \sigma^{\otimes n}\bigr) \ge n D(\rho\|\sigma) + \sqrt{n V(\rho\|\sigma)}\,\Phi^{-1}(\varepsilon) - O(1).$$
We also have bounds on the constant terms, enabling us to calculate upper and lower bounds on $D_h^\varepsilon(\rho^{\otimes n}\|\sigma^{\otimes n})$ for finite $n$.
Classically, the above holds with the logarithmic terms in the upper and lower bounds both equal to $\tfrac{1}{2}\log n$ (e.g. Strassen '62; Polyanskiy, Poor & Verdú '10).
One ingredient of both proofs is the Berry-Esseen theorem, which quantifies the convergence of the distribution of a sum of i.i.d. random variables to a normal distribution. Intuitively, our results can be seen as a quantum, entropic formulation of the central limit theorem.
The bounds given here are due to Li. A similar result was independently derived by T&H.

7 Smooth Entropies
We also investigate the smooth min-entropy (Renner '05), where it was known (T, Colbeck & Renner '09; T '12) that, for $\varepsilon \in (0,1)$,
$$H_{\min}^\varepsilon(A^n|B^n)_{\rho^{\otimes n}} \le n H(A|B)_\rho + O\bigl(\sqrt{n}\bigr) \quad\text{and}\quad H_{\min}^\varepsilon(A^n|B^n)_{\rho^{\otimes n}} \ge n H(A|B)_\rho - O\bigl(\sqrt{n}\bigr).$$
We derive the following expansion:
$$H_{\min}^\varepsilon(A^n|B^n)_{\rho^{\otimes n}} \le n H(A|B)_\rho + \sqrt{n V(A|B)_\rho}\,\Phi^{-1}(\varepsilon^2) + O(\log n),$$
$$H_{\min}^\varepsilon(A^n|B^n)_{\rho^{\otimes n}} \ge n H(A|B)_\rho + \sqrt{n V(A|B)_\rho}\,\Phi^{-1}(\varepsilon^2) - O(\log n),$$
where $H(A|B)_\rho = -D(\rho_{AB} \| 1_A \otimes \rho_B)$ and $V(A|B)_\rho = V(\rho_{AB} \| 1_A \otimes \rho_B)$.
Both hypothesis testing and smooth entropies have various applications in information theory, some of which we explore next.
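The conditional quantities thus reduce to the unconditional ones used above. Here is a short sketch of this reduction, again an illustration rather than code from the talk; it assumes a full-rank $\rho_{AB}$ (rank-deficient states would need the computation restricted to the support, as in the later sketches), and the function names are my own.

```python
import numpy as np
from scipy.linalg import logm
from scipy.stats import norm

def conditional_moments(rho_AB, dA, dB):
    """H(A|B) = -D(rho_AB || 1_A (x) rho_B) and V(A|B) = V(rho_AB || 1_A (x) rho_B)."""
    # Partial trace over A: reshape to indices (a, b, a', b') and contract a with a'.
    rho_B = np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)
    tau = np.kron(np.eye(dA), rho_B)               # 1_A (x) rho_B
    L = (logm(rho_AB) - logm(tau)) / np.log(2)     # base-2 matrix logarithms
    D = np.real(np.trace(rho_AB @ L))
    M = L - D * np.eye(dA * dB)
    V = np.real(np.trace(rho_AB @ M @ M))
    return -D, V                                   # (H(A|B)_rho, V(A|B)_rho)

def second_order_hmin(rho_AB, dA, dB, n, eps):
    """n*H(A|B) + sqrt(n*V(A|B))*Phi^{-1}(eps^2), dropping the O(log n) term."""
    H, V = conditional_moments(rho_AB, dA, dB)
    return n * H + np.sqrt(n * V) * norm.ppf(eps ** 2)

# Example: a noisy classically correlated two-qubit state (full rank).
rho_AB = np.diag([0.4, 0.1, 0.1, 0.4])
print(second_order_hmin(rho_AB, dA=2, dB=2, n=1000, eps=0.01))
```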

8 Randomness Extraction against Side Information
Consider a CQ random source that outputs states $\rho_{XE} = \sum_x p_x |x\rangle\langle x| \otimes \rho^x_E$. Investigate the amount of randomness that can be extracted from $X$ such that it is independent of $E$ and the seeded randomness $S$.
A protocol $\mathcal{P}: XS \to ZS$ extracts a random number $Z$ from $X$, producing a state $\tau_{ZES}$ when applied to $\rho_{XE} \otimes \rho_S$. For any $0 \le \varepsilon < 1$ and $\rho_{XE}$ a CQ state, we define
$$\ell^\varepsilon(X|E) := \max\Bigl\{ \ell \in \mathbb{N} \,\Big|\, \exists\, \mathcal{P}, \sigma_E : \; |Z| = 2^\ell \;\wedge\; \bigl\| \tau_{ZES} - 2^{-\ell}\, 1_Z \otimes \sigma_E \otimes \tau_S \bigr\| \le \varepsilon \Bigr\}.$$
This quantity can be characterized in terms of the smooth min-entropy (Renner '05). We tighten this and show:
Theorem. Consider an i.i.d. source $\rho_{X^n E^n} = \rho_{XE}^{\otimes n}$ and $0 < \varepsilon < 1$. Then
$$\ell^\varepsilon(X^n|E^n) = n H(X|E) + \sqrt{n V(X|E)}\,\Phi^{-1}(\varepsilon^2) \pm O(\log n).$$

9 Data Compression with Side Information
Consider a CQ random source that outputs states $\rho_{XB} = \sum_x p_x |x\rangle\langle x| \otimes \rho^x_B$. Find the minimum encoding length for data reconciliation of $X$ if quantum side information $B$ is available.
A protocol $\mathcal{P}$ encodes $X$ into $M$ and then produces an estimate $\hat{X}$ of $X$ from $B$ and $M$. For any $0 \le \varepsilon < 1$ and $\rho_{XB}$ a CQ state, we define
$$m^\varepsilon(X|B)_\rho := \min\bigl\{ m \in \mathbb{N} \,\big|\, \exists\, \mathcal{P} : \; |M| = 2^m \;\wedge\; \Pr[X \neq \hat{X}] \le \varepsilon \bigr\}.$$
This quantity can be characterized using hypothesis testing (H & Nagaoka '04). We tighten this and show:
Theorem. Consider an i.i.d. source $\rho_{X^n B^n} = \rho_{XB}^{\otimes n}$ and $0 < \varepsilon < 1$. Then
$$m^\varepsilon(X^n|B^n) = n H(X|B) - \sqrt{n V(X|B)}\,\Phi^{-1}(\varepsilon) \pm O(\log n).$$

10 Example of Second Order Asymptotics
Consider transmission of $x \in \{0, 1\}$ through a Pauli channel to $B$ (independent phase and bit flips) with environment $E$. This yields the states
$$\rho_{XB} = \frac{1}{2} \sum_x |x\rangle\langle x| \otimes \bigl( (1-p)\,|x\rangle\langle x| + p\,|1-x\rangle\langle 1-x| \bigr), \qquad \rho_{XE} = \frac{1}{2} \sum_x |x\rangle\langle x| \otimes |\phi_x\rangle\langle\phi_x|,$$
$$\text{where}\quad |\phi_x\rangle = \sqrt{p}\,|0\rangle + (-1)^x \sqrt{1-p}\,|1\rangle.$$
[Plot of the first and second order asymptotic approximations of $\tfrac{1}{n}\ell^\varepsilon(X|E)$ and $\tfrac{1}{n}m^\varepsilon(X|B)$ for $p = 0.05$ and a fixed $\varepsilon$.]
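A worked numerical sketch of this example follows. It is not the original plot code: the value of $\varepsilon$ below is a hypothetical choice (the $\varepsilon$ used in the talk is not recorded in this transcription), base-2 logarithms are assumed, and the rank-deficient $\rho_{XE}$ is handled by restricting the computation to its support via the Nussbaum-Szkoła form introduced later in the talk. All function names are my own.

```python
import numpy as np
from scipy.stats import norm

def rel_entropy_moments_support(rho, sigma, tol=1e-12):
    """D(rho||sigma), V(rho||sigma) via eigendecompositions, restricted to supp(rho).
    Assumes supp(rho) is contained in supp(sigma); base-2 logarithms."""
    r, Vr = np.linalg.eigh(rho)
    s, Us = np.linalg.eigh(sigma)
    keep_r, keep_s = r > tol, s > tol
    overlap = np.abs(Vr[:, keep_r].conj().T @ Us[:, keep_s]) ** 2    # |<v_x|u_y>|^2
    P0 = r[keep_r][:, None] * overlap                                # Nussbaum-Szkola P_0
    llr = np.log2(r[keep_r])[:, None] - np.log2(s[keep_s])[None, :]  # log r_x - log s_y
    D = np.sum(P0 * llr)
    V = np.sum(P0 * (llr - D) ** 2)
    return D, V

def conditional_moments_support(rho_AB, dA, dB):
    rho_B = np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)
    D, V = rel_entropy_moments_support(rho_AB, np.kron(np.eye(dA), rho_B))
    return -D, V                                                     # (H(A|B), V(A|B))

p = 0.05
ket = lambda x: np.eye(2)[x].reshape(2, 1)
proj = lambda v: v @ v.conj().T
phi = lambda x: np.sqrt(p) * ket(0) + (-1) ** x * np.sqrt(1 - p) * ket(1)

rho_XB = 0.5 * sum(np.kron(proj(ket(x)),
                           (1 - p) * proj(ket(x)) + p * proj(ket(1 - x))) for x in (0, 1))
rho_XE = 0.5 * sum(np.kron(proj(ket(x)), proj(phi(x))) for x in (0, 1))

H_XE, V_XE = conditional_moments_support(rho_XE, 2, 2)   # randomness extraction
H_XB, V_XB = conditional_moments_support(rho_XB, 2, 2)   # data compression

eps = 1e-3                                # hypothetical; the talk's eps is not recorded
for n in (10**3, 10**4, 10**5):
    ext_rate = H_XE + np.sqrt(V_XE / n) * norm.ppf(eps ** 2)   # ~ (1/n) l^eps(X^n|E^n)
    cmp_rate = H_XB - np.sqrt(V_XB / n) * norm.ppf(eps)        # ~ (1/n) m^eps(X^n|B^n)
    print(n, ext_rate, cmp_rate)
```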

11 Example of Finite Block Length Bounds
[Plot of the rate $r$ against the block length $n$, comparing the first and second order approximations with the converse and direct (achievability) bounds.]

12 Different Layers of Approximation
Let $\varepsilon \in (0, 1)$ and consider $n$ i.i.d. repetitions of the tasks for large $n$.

Class 1: Optimal performance of the protocol; quantities $\ell^\varepsilon(X|B)_\rho$, $m^\varepsilon(X|B)_\rho$. Calculation is very difficult.
Class 2: One-shot bound for a general source; quantities $H_h^\varepsilon(A|B)_\rho$, $H_{\min}^\varepsilon(A|B)_\rho$. SDP, tractable for small systems.
Class 3: Quantum information spectrum; quantity $D_s^\varepsilon(\rho\|\sigma)$.
Class 4: Classical information spectrum; quantity $D_s^\varepsilon(P_{0,\rho,\sigma}\|P_{1,\rho,\sigma})$. Approximation possible for i.i.d.
Class 5: Second order asymptotics; $n H(X|B) + \sqrt{n}\, s(X|B)\,\Phi^{-1}(\varepsilon)$ with $s = \sqrt{V}$. Calculation is easy for large $n$.

Differences between the classes:
Classes 1-2: $O(\log n)$, via random coding and data-processing inequalities.
Classes 2-4: $O(\log n)$, via relations between entropies.
Classes 4-5: $O(1)$, via the Berry-Esseen theorem.

13 Step 1: One-Shot Bounds I
Theorem (One-Shot Randomness Extraction). Let $\rho_{XB}$ be a CQ state and $0 < \eta \le \varepsilon < 1$. Then
$$H_{\min}^{\varepsilon-\eta}(X|B)_\rho - \log\frac{1}{\eta^4} - 3 \;\le\; \ell^\varepsilon(X|B)_\rho \;\le\; H_{\min}^\varepsilon(X|B)_\rho.$$
Theorem (One-Shot Data Compression). Let $\rho_{XB}$ be a CQ state and $0 < \eta \le \varepsilon < 1$. Then
$$H_h^\varepsilon(X|B)_\rho \;\le\; m^\varepsilon(X|B)_\rho \;\le\; H_h^{\varepsilon-\eta}(X|B)_\rho + \log\frac{\varepsilon}{\eta}.$$
Converse bounds keep $\varepsilon$ intact. Achievability holds up to a loss in $\eta$, where $\eta$ can be chosen arbitrarily small. The idea is to choose $\eta \sim 1/\sqrt{n}$ for the i.i.d. analysis.

14 Step 1: One-Shot Bounds II
Theorem. Let $\rho_{XB}$ be a CQ state and $0 < \eta \le \varepsilon < 1$. Then
$$H_{\min}^{\varepsilon-\eta}(X|B)_\rho - \log\frac{1}{\eta^4} - 3 \;\le\; \ell^\varepsilon(X|B)_\rho \;\le\; H_{\min}^\varepsilon(X|B)_\rho.$$
In this sense, we have $\ell^\varepsilon(X|B)_\rho \approx H_{\min}^\varepsilon(X|B)_\rho$, i.e. the smooth min-entropy characterizes randomness extraction. The smooth min-entropy can be calculated efficiently (using an SDP) for small systems.
The quantity $\ell^\varepsilon(X|B)_\rho$ has some natural properties, e.g.:
For any function $f$: $\ell^\varepsilon(f(X)|B)_\rho \le \ell^\varepsilon(X|B)_\rho$.
For any quantum channel $\mathcal{E}$: $\ell^\varepsilon(X|\mathcal{E}(B))_\rho \ge \ell^\varepsilon(X|B)_\rho$.
These properties are mimicked by the smooth min-entropy. The converse follows solely from the first of these monotonicity properties. Achievability uses two-universal hashing.
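For context, here is a minimal sketch of a standard two-universal family (not taken from the talk): hashing with a uniformly random binary matrix, $h_A(x) = Ax \bmod 2$, for which $\Pr_A[h_A(x) = h_A(x')] = 2^{-\ell}$ whenever $x \neq x'$. The code below just checks this collision probability empirically; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_linear_hash(n_bits, l_bits):
    """Sample h_A(x) = A x mod 2 for a uniformly random binary l x n matrix A."""
    A = rng.integers(0, 2, size=(l_bits, n_bits))
    return lambda x: (A @ x) % 2

# Empirical collision probability for one fixed pair x != x'.
n_bits, l_bits, trials = 16, 4, 20_000
x = rng.integers(0, 2, size=n_bits)
x_prime = x.copy()
x_prime[0] ^= 1                               # flip the first bit
collisions = 0
for _ in range(trials):
    h = random_linear_hash(n_bits, l_bits)    # fresh hash function per trial
    collisions += np.array_equal(h(x), h(x_prime))
print(collisions / trials, 2.0 ** -l_bits)    # both close to 1/16
```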

15 Step 1: One-Shot Bounds III
Theorem. Let $\rho_{XB}$ be a CQ state and $0 < \eta \le \varepsilon < 1$. Then
$$H_h^\varepsilon(X|B)_\rho \;\le\; m^\varepsilon(X|B)_\rho \;\le\; H_h^{\varepsilon-\eta}(X|B)_\rho + \log\frac{\varepsilon}{\eta}.$$
In this sense, we have $m^\varepsilon(X|B)_\rho \approx H_h^\varepsilon(X|B)_\rho$, i.e. source coding is characterized by a conditional hypothesis testing entropy. The hypothesis testing entropy can be calculated efficiently (using an SDP) for small systems.
The quantity $m^\varepsilon(X|B)_\rho$ has some natural properties, e.g.:
For any message $M$: $m^\varepsilon(X|BM)_\rho \ge m^\varepsilon(X|B)_\rho - \log|M|$.
For any quantum channel $\mathcal{E}$: $m^\varepsilon(X|\mathcal{E}(B))_\rho \ge m^\varepsilon(X|B)_\rho$.
These properties are mimicked by $H_h^\varepsilon$. The converse follows solely from these properties. Achievability uses two-universal hashing (corresponding to random coding) and pretty good measurements.

16 Step 2: Relation to Information Spectrum I
While the one-shot entropies can be computed for small systems, they are generally intractable for large systems. Hence, we want to approximate them further. This is done as follows.
We write everything in terms of relative entropies:
$$H_h^\varepsilon(A|B)_\rho = \max_\sigma \bigl[ -D_h^\varepsilon(\rho_{AB} \| 1_A \otimes \sigma_B) \bigr] \quad\text{and}\quad H_{\min}^\varepsilon(A|B)_\rho = \max_\sigma \bigl[ -D_{\max}^\varepsilon(\rho_{AB} \| 1_A \otimes \sigma_B) \bigr].$$
Then we use relations between the relative entropies:
$$D_h^\varepsilon(\rho\|\sigma) \;\approx\; D_{\max}^{1-\varepsilon}(\rho\|\sigma) \;\approx\; D_s^\varepsilon(\rho\|\sigma) \;\approx\; D_s^\varepsilon(P_{0,\rho,\sigma} \| P_{1,\rho,\sigma}).$$
These relations hold up to correction terms of order $\log\Theta$, where $\Theta$ is at most the number of distinct eigenvalues of $\sigma$; alternatively, the correction can be bounded by $2\log\bigl(\lambda_{\max}(\sigma)/\lambda_{\min}(\sigma)\bigr)$. In particular, $\Theta$ grows only polynomially in $n$ in the i.i.d. case, so the correction is $O(\log n)$.

17 Step 2: Relation to Information Spectrum II
We focus on the last quantity, the classical information spectrum. Decompose $\rho = \sum_x r_x |v_x\rangle\langle v_x|$ and $\sigma = \sum_y s_y |u_y\rangle\langle u_y|$. Nussbaum & Szkoła '09 introduced the distributions
$$P_0(x, y) := r_x \,|\langle v_x | u_y \rangle|^2 \quad\text{and}\quad P_1(x, y) := s_y \,|\langle v_x | u_y \rangle|^2.$$
They satisfy $D(P_0\|P_1) = D(\rho\|\sigma)$ and $V(P_0\|P_1) = V(\rho\|\sigma)$, i.e. the first two moments agree with the quantum analogue. This reduces the problem to analyzing the classical quantity
$$D_s^\varepsilon(P_0\|P_1) := \max\Bigl\{ R \in \mathbb{R} \,\Big|\, \Pr_{P_0}\bigl[ \log P_0 - \log P_1 \le R \bigr] \le \varepsilon \Bigr\}.$$
So far we have not used any i.i.d. assumption, except that we assumed that the eigenvalues of $\sigma$ behave nicely.
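A quick numerical check of the moment-matching claim, purely as an illustration (the qubit states below are arbitrary full-rank examples so that the operator formula with matrix logarithms applies directly):

```python
import numpy as np
from scipy.linalg import logm

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])

# Quantum side: D(rho||sigma) and V(rho||sigma) from the operator formulas (base 2).
L = (logm(rho) - logm(sigma)) / np.log(2)
Dq = np.real(np.trace(rho @ L))
M = L - Dq * np.eye(2)
Vq = np.real(np.trace(rho @ M @ M))

# Classical side: the Nussbaum-Szkola distributions P_0 and P_1.
r, Vr = np.linalg.eigh(rho)                      # rho   = sum_x r_x |v_x><v_x|
s, Us = np.linalg.eigh(sigma)                    # sigma = sum_y s_y |u_y><u_y|
overlap = np.abs(Vr.conj().T @ Us) ** 2          # |<v_x|u_y>|^2
P0 = r[:, None] * overlap
P1 = s[None, :] * overlap
llr = np.log2(r)[:, None] - np.log2(s)[None, :]  # log P_0 - log P_1 = log r_x - log s_y
Dc = np.sum(P0 * llr)
Vc = np.sum(P0 * (llr - Dc) ** 2)

print(np.isclose(Dq, Dc), np.isclose(Vq, Vc))    # both True: the first two moments agree
```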

18 Step 3: Asymptotic Expansion
Assume i.i.d. states $\rho_{X^n B^n} = \rho_{XB}^{\otimes n}$. The corresponding distributions, $P_0^n$ and $P_1^n$, are i.i.d. as well. We write $P_0^n(x^n, y^n) = \prod_i P_0^{[i]}(x_i, y_i)$, $Z_i = \log P_0^{[i]} - \log P_1^{[i]}$, and
$$D_s^\varepsilon(P_0^n \| P_1^n) = \max\Bigl\{ R \,\Big|\, \Pr_{P_0}\bigl[ \log P_0^n - \log P_1^n \le R \bigr] \le \varepsilon \Bigr\} = n \cdot \max\Bigl\{ R \,\Big|\, \Pr_{P_0}\Bigl[ \tfrac{1}{n} \textstyle\sum_i Z_i \le R \Bigr] \le \varepsilon \Bigr\},$$
where the $Z_i$ are i.i.d. distributed. The central limit theorem states that the distribution of $\frac{1}{n}\sum_i Z_i$ converges to a Gaussian distribution with mean $\mathbb{E}[Z] = D(P_0\|P_1)$ and variance $\mathbb{E}\bigl[(Z - \mathbb{E}[Z])^2\bigr]/n = V(P_0\|P_1)/n$. This yields the expansion
$$D_s^\varepsilon(P_0^n \| P_1^n) = n D(P_0\|P_1) + \sqrt{n V(P_0\|P_1)}\,\Phi^{-1}(\varepsilon) + O(1).$$
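A Monte Carlo sketch of this last step on a small toy example (the distributions are hypothetical, not from the talk): the empirical $\varepsilon$-quantile of $\sum_i Z_i$ under $P_0^n$ is compared with $nD + \sqrt{nV}\,\Phi^{-1}(\varepsilon)$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Toy classical distributions P_0 and P_1 on a three-letter alphabet (base-2 logs).
P0 = np.array([0.5, 0.3, 0.2])
P1 = np.array([0.2, 0.3, 0.5])
z = np.log2(P0) - np.log2(P1)         # the values taken by Z_i = log P_0 - log P_1
D = np.sum(P0 * z)
V = np.sum(P0 * (z - D) ** 2)

n, eps, samples = 500, 0.05, 10_000
idx = rng.choice(len(P0), size=(samples, n), p=P0)
sums = z[idx].sum(axis=1)             # i.i.d. sums of the Z_i under P_0^n

empirical = np.quantile(sums, eps)    # ~ D_s^eps(P_0^n || P_1^n)
approx = n * D + np.sqrt(n * V) * norm.ppf(eps)
print(empirical, approx)              # the two differ only by a bounded term
```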

19 Relation to Asymptotic Information Spectrum
Our results also strengthen the relation of one-shot entropies to the sup/inf-information spectrum (Datta & Renner '09). These are generally defined as (Nagaoka & Hayashi '07)
$$\overline{D}(\varepsilon|\rho\|\sigma) := \sup\Bigl\{ R \in \mathbb{R} \,\Big|\, \limsup_{n\to\infty} \operatorname{tr}\rho_n \{\rho_n \le 2^{nR}\sigma_n\} \le \varepsilon \Bigr\},$$
$$\underline{D}(\varepsilon|\rho\|\sigma) := \inf\Bigl\{ R \in \mathbb{R} \,\Big|\, \liminf_{n\to\infty} \operatorname{tr}\rho_n \{\rho_n \le 2^{nR}\sigma_n\} \ge \varepsilon \Bigr\} = \sup\Bigl\{ R \in \mathbb{R} \,\Big|\, \liminf_{n\to\infty} \operatorname{tr}\rho_n \{\rho_n \le 2^{nR}\sigma_n\} < \varepsilon \Bigr\}.$$
We find the following relations, which are valid for all $\varepsilon \in [0, 1]$:
$$\overline{D}(\varepsilon|\rho\|\sigma) = \sup\Bigl\{ \liminf_{n\to\infty} \tfrac{1}{n} D_{\max}^{1-\epsilon_n}(\rho_n\|\sigma_n) \,\Big|\, \limsup_{n\to\infty} \epsilon_n \le \varepsilon \Bigr\},$$
$$\underline{D}(\varepsilon|\rho\|\sigma) = \sup\Bigl\{ \liminf_{n\to\infty} \tfrac{1}{n} D_{\max}^{1-\epsilon_n}(\rho_n\|\sigma_n) \,\Big|\, \liminf_{n\to\infty} \epsilon_n < \varepsilon \Bigr\},$$
where the suprema are over sequences $(\epsilon_n)$. The original relations by Datta & Renner only consider $\varepsilon \in \{0, 1\}$.

20 Conclusion, Open Questions and Discussion
We find the second order asymptotics for quantum hypothesis testing and give bounds on $\beta^\varepsilon$ for finite $n$. We use one-shot entropies and techniques developed for hypothesis testing and the information spectrum method to find a second order expansion and finite block length bounds for operational quantities.
There is a difference of $2\log n$ between the current upper and lower bounds on $D_h^\varepsilon(\rho\|\sigma)$. Is this fundamental, i.e. do there exist $\rho$ and $\sigma$ for which these bounds are tight? Or can this be further improved? Note that, classically, the upper and lower bounds only differ in the constant term.
The bounds depend on the parameter $\varepsilon$ of the operational quantity, either $\ell^\varepsilon$ or $m^\varepsilon$. This implies, in particular, that the $\varepsilon$ in the respective one-shot entropies has a clear operational meaning.
A provocative question to those of us using one-shot entropies in information theory: What does it mean to characterize a quantity? When is a result considered tight?

21 Thank you for your attention.
