Quantum Achievability Proof via Collision Relative Entropy


1 Quantum Achievability Proof via Collision Relative Entropy. Salman Beigi, Institute for Research in Fundamental Sciences (IPM), Tehran, Iran. September 8, 2014. Based on joint work with Amin Gohari, arXiv:

2–5 Technique of Yassaee et al (ISIT 2013)

Setup: a message $m$ enters the encoder $E$, the codeword $x_m$ passes through the channel $N_{X \to Y}$, and the decoder $D$ outputs $\hat m$ from $y$. The decoder is not the usual maximum-likelihood rule but a stochastic likelihood decoder.

Random encoding: choose some input distribution $p_X$ and pick i.i.d. samples $x_1, \dots, x_M$.

Stochastic likelihood decoder: sample $\hat m$ from $p(m \mid y)$, where
$$p(m, y) = \frac{1}{M}\, p(y \mid x_m), \qquad p(m \mid y) = \frac{p(y \mid x_m)}{\sum_{m'} p(y \mid x_{m'})}.$$

6–10 Technique of Yassaee et al (ISIT 2013)

Random codebook $\mathcal{C} = \{x_1, \dots, x_M\}$:
$$\mathbb{E}_{\mathcal{C}} \Pr[\mathrm{succ}] = \mathbb{E}_{\mathcal{C}}\, \frac{1}{M} \sum_{m,y} p(y \mid x_m)\, \frac{p(y \mid x_m)}{\sum_{m'} p(y \mid x_{m'})}$$
$$= \mathbb{E}_{\mathcal{C}} \sum_y p(y \mid x_1)\, \frac{p(y \mid x_1)}{\sum_m p(y \mid x_m)}$$
$$\ge \sum_y \mathbb{E}_{x_1} \frac{p(y \mid x_1)^2}{\mathbb{E}_{\mathcal{C} \setminus x_1} \sum_m p(y \mid x_m)} \qquad \text{(Jensen)}$$
$$= \sum_y \mathbb{E}_{x_1} \frac{p(y \mid x_1)^2}{p(y \mid x_1) + (M-1)\, p(y)}$$
$$= \mathbb{E}_{xy}\, \frac{p(y \mid x)}{p(y \mid x) + (M-1)\, p(y)} = \mathbb{E}_{xy} \left[ 1 + (M-1)\, \frac{p(y)}{p(y \mid x)} \right]^{-1}.$$
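To make the bound concrete, here is a minimal numerical sketch (our own illustration, not from the talk): random coding with a stochastic likelihood decoder over a binary symmetric channel, compared against the final lower bound $\mathbb{E}_{xy}[1 + (M-1)\,p(y)/p(y \mid x)]^{-1}$. All names and parameters (`p_y_x`, the BSC setup) are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (our choice): BSC with crossover 0.1, uniform inputs,
# M codewords of length n; then p(y) = 2^{-n} for every y.
n, M, flip = 20, 8, 0.1

def p_y_x(y, x):
    d = np.sum(y != x)                         # Hamming distance
    return flip ** d * (1 - flip) ** (n - d)

def trial():
    C = rng.integers(0, 2, size=(M, n))        # random i.i.d. codebook
    x = C[0]                                   # send message m = 1
    y = np.where(rng.random(n) < flip, 1 - x, x)
    likes = np.array([p_y_x(y, c) for c in C])
    m_hat = rng.choice(M, p=likes / likes.sum())   # sample from p(m | y)
    return m_hat == 0

def bound_term():
    x = rng.integers(0, 2, size=n)
    y = np.where(rng.random(n) < flip, 1 - x, x)
    return 1.0 / (1.0 + (M - 1) * 0.5 ** n / p_y_x(y, x))

succ = np.mean([trial() for _ in range(4000)])
bound = np.mean([bound_term() for _ in range(4000)])
print(f"E_C Pr[succ] ~ {succ:.3f} >= bound ~ {bound:.3f}")
```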

11 Plan for the talk. Better understanding and quantum generalization of: the stochastic likelihood decoder; convexity and Jensen's inequality; asymptotic analysis of the result.

12–14 Pretty good measurement

The pretty good measurement for signal states $\{\rho_1, \dots, \rho_M\}$ is
$$E_m = \Big( \sum_i \rho_i \Big)^{-1/2} \rho_m \Big( \sum_i \rho_i \Big)^{-1/2}.$$
If the signal states are classical, $\rho_m \equiv p(y \mid x_m)$, then $E_m \equiv \frac{p(y \mid x_m)}{\sum_i p(y \mid x_i)}$: the pretty good measurement reduces to the stochastic likelihood decoder.

The probability of successful decoding is
$$\Pr[\mathrm{succ}] = \frac{1}{M} \sum_m \mathrm{tr}\Big[ \rho_m \Big( \sum_i \rho_i \Big)^{-1/2} \rho_m \Big( \sum_i \rho_i \Big)^{-1/2} \Big].$$
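A small sketch of the pretty good measurement (our illustration, with randomly chosen full-rank signal states and equiprobable messages as on the slide): build the POVM from the formula above, check completeness, and evaluate $\Pr[\mathrm{succ}]$.

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_state(d):
    # Random full-rank density matrix (Wishart construction).
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def inv_sqrt(A):
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** -0.5) @ V.conj().T

d, M = 4, 3
rhos = [rand_state(d) for _ in range(M)]
S = inv_sqrt(sum(rhos))                       # (sum_i rho_i)^{-1/2}
E = [S @ r @ S for r in rhos]                 # PGM elements E_m
assert np.allclose(sum(E), np.eye(d))         # POVM completeness
succ = sum(np.trace(r @ e).real for r, e in zip(rhos, E)) / M
print(f"PGM success probability: {succ:.4f}")
```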

15–17 Collision relative entropy

$$D_2(\rho \| \sigma) = \log \mathrm{tr}\big[ \rho\, \sigma^{-1/2} \rho\, \sigma^{-1/2} \big].$$
This is the $\alpha = 2$ point of the sandwiched Rényi relative entropy
$$D_\alpha(\rho \| \sigma) = \frac{1}{\alpha - 1} \log \mathrm{tr}\Big[ \big( \sigma^{\frac{1-\alpha}{2\alpha}}\, \rho\, \sigma^{\frac{1-\alpha}{2\alpha}} \big)^{\alpha} \Big].$$
Müller-Lennert et al, arXiv:; Wilde, Winter, Yang, arXiv:; Frank, Lieb, arXiv:; SB, arXiv:

(Positivity) $D_2(\rho \| \sigma) \ge 0$.
(Data processing) $D_2(\rho \| \sigma) \ge D_2(\mathcal{N}(\rho) \| \mathcal{N}(\sigma))$.
(Joint convexity) $\exp D_2(\cdot \| \cdot)$ is jointly convex.
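The following sketch (our code; logs taken base 2, matching the $2^R$ convention used later in the talk) evaluates $D_\alpha$ and $D_2$ for random states and spot-checks positivity and data processing under a completely dephasing channel.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(2)

def rand_state(d):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def D_alpha(rho, sigma, alpha):
    # Sandwiched Renyi relative entropy; alpha = 2 is the collision case.
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log2(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)

def D2(rho, sigma):
    s = mpow(sigma, -0.5)
    return np.log2(np.trace(rho @ s @ rho @ s).real)

rho, sigma = rand_state(4), rand_state(4)
dephase = lambda X: np.diag(np.diag(X))       # a simple CPTP (pinching) map

assert np.isclose(D2(rho, sigma), D_alpha(rho, sigma, 2))
assert D2(rho, sigma) >= 0                                        # positivity
assert D2(dephase(rho), dephase(sigma)) <= D2(rho, sigma) + 1e-9  # data processing
print(f"D_2(rho||sigma) = {D2(rho, sigma):.4f} bits")
```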

18–20 Classical-Quantum channels

A c-q channel $N_{X \to Y}$ maps a classical input $x$ to a quantum state $\rho_x$. Coding scheme: $m \to E \to x_m \to N_{X \to Y} \to \rho_{x_m} \to D \to \hat m$, with random encoding and the pretty good measurement at the decoder.

Fix an input distribution $p(x)$ and pick a random codebook $x_1, \dots, x_M$; the decoder applies the pretty good measurement. Define
$$\sigma_{UXY} = \frac{1}{M} \sum_{m=1}^{M} |m\rangle\langle m| \otimes |x_m\rangle\langle x_m| \otimes \rho_{x_m}.$$

21–23 c-q channel coding

With $\sigma_{UXY} = \frac{1}{M} \sum_m |m\rangle\langle m| \otimes |x_m\rangle\langle x_m| \otimes \rho_{x_m}$ as above,
$$\Pr[\mathrm{succ}] = \frac{1}{M} \sum_{m=1}^{M} \mathrm{tr}\Big[ \rho_{x_m} \Big( \sum_i \rho_{x_i} \Big)^{-1/2} \rho_{x_m} \Big( \sum_i \rho_{x_i} \Big)^{-1/2} \Big]$$
$$= \frac{1}{M} \exp D_2(\sigma_{UXY} \| \sigma_{UX} \otimes \sigma_Y)$$
$$\ge \frac{1}{M} \exp D_2(\sigma_{XY} \| \sigma_X \otimes \sigma_Y) \qquad \text{(data processing, tracing out } U\text{)}.$$
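The first equality above can be verified directly on a toy instance. In this sketch (our code; pseudo-inverses are taken on supports, since $\sigma_{UX}$ is rank-deficient, and we compare the exponentials directly to avoid any choice of log base) the PGM success probability matches $\frac{1}{M}\exp D_2(\sigma_{UXY} \| \sigma_{UX} \otimes \sigma_Y)$.

```python
import numpy as np

rng = np.random.default_rng(5)

def rand_state(d):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def pinv_sqrt(A, tol=1e-10):
    # Inverse square root on the support of A.
    w, V = np.linalg.eigh(A)
    w = np.where(w > tol, w, np.inf)
    return V @ np.diag(w ** -0.5) @ V.conj().T

def ket(i, d):
    v = np.zeros((d, 1)); v[i] = 1.0
    return v

# Toy c-q channel x -> rho_x with binary input, qubit output; random codebook.
d, M, nX = 2, 3, 2
rho = [rand_state(d) for _ in range(nX)]
code = rng.integers(0, nX, size=M)            # codewords x_1, ..., x_M

proj = lambda v: v @ v.conj().T
sig_UXY = sum(np.kron(np.kron(proj(ket(m, M)), proj(ket(code[m], nX))),
                      rho[code[m]]) for m in range(M)) / M
sig_UX = sum(np.kron(proj(ket(m, M)), proj(ket(code[m], nX))) for m in range(M)) / M
sig_Y = sum(rho[code[m]] for m in range(M)) / M

# PGM success probability, directly from the formula above.
S = pinv_sqrt(sum(rho[x] for x in code))
pgm = sum(np.trace(rho[x] @ S @ rho[x] @ S).real for x in code) / M

# (1/M) exp D_2(sigma_UXY || sigma_UX (x) sigma_Y).
T = pinv_sqrt(np.kron(sig_UX, sig_Y))
d2 = np.trace(sig_UXY @ T @ sig_UXY @ T).real
assert np.isclose(pgm, d2 / M)
print(f"Pr[succ] = {pgm:.4f} = (1/M) exp D_2 = {d2 / M:.4f}")
```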

24–25 Error analysis

Use the joint convexity of $\exp D_2(\cdot \| \cdot)$:
$$\mathbb{E}_{\mathcal{C}} \Pr[\mathrm{succ}] \ge \frac{1}{M}\, \mathbb{E}_{\mathcal{C}} \exp D_2(\sigma_{XY} \| \sigma_X \otimes \sigma_Y) \ge \frac{1}{M} \exp D_2\big( \mathbb{E}_{\mathcal{C}}\, \sigma_{XY} \,\big\|\, \mathbb{E}_{\mathcal{C}}\, [\sigma_X \otimes \sigma_Y] \big)$$
$$= \frac{1}{M} \exp D_2\Big( \rho_{XY} \,\Big\|\, \frac{1}{M}\, \rho_{XY} + \Big( 1 - \frac{1}{M} \Big) \rho_X \otimes \rho_Y \Big), \qquad \rho_{XY} = \sum_x p(x)\, |x\rangle\langle x| \otimes \rho_x.$$

26–29 Information spectrum relative entropy

Let $\Pi_{U \ge 0}$ denote the projection onto the eigenspaces of $U$ with non-negative eigenvalues, and set $\Pi_{U \ge V} := \Pi_{U - V \ge 0}$. Then
$$D_s^{\epsilon}(\rho \| \sigma) := \sup\big\{ R : \mathrm{tr}\big( \rho\, \Pi_{\rho \le 2^R \sigma} \big) \le \epsilon \big\}.$$
For classical distributions this reduces to
$$D_s^{\epsilon}(p \| q) = \sup\Big\{ R : \sum_{x :\, p(x) \le 2^R q(x)} p(x) \le \epsilon \Big\}.$$

Theorem. For every $0 < \epsilon, \lambda < 1$ and density matrices $\rho, \sigma$ we have
$$\exp D_2\big( \rho \,\big\|\, \lambda \rho + (1 - \lambda) \sigma \big) \ge (1 - \epsilon)\, \big[ \lambda + (1 - \lambda) \exp\big( -D_s^{\epsilon}(\rho \| \sigma) \big) \big]^{-1}.$$
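In the classical case $D_s^{\epsilon}$ can be computed exactly by sorting the log-likelihood ratios, since the sum in the definition jumps only at $R = \log_2 \frac{p(x)}{q(x)}$. A minimal sketch (our code; base-2 logs, assuming $q(x) > 0$ everywhere):

```python
import numpy as np

def D_s(p, q, eps):
    # sup{ R : sum_{x : p(x) <= 2^R q(x)} p(x) <= eps }, base-2 logs.
    llr = np.log2(p / q)                 # log-likelihood ratios
    order = np.argsort(llr)              # x with small p/q enter the sum first
    cum = np.cumsum(p[order])
    # Largest R before the cumulative mass first exceeds eps.
    k = np.searchsorted(cum, eps, side="right")
    return np.inf if k == len(p) else llr[order][k]

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(D_s(p, q, eps=0.25))               # prints 0.0 for this example
```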

30 A one-shot achievable bound

$$\mathbb{E}_{\mathcal{C}} \Pr[\mathrm{succ}] \ge \frac{1}{M} \exp D_2\Big( \rho_{XY} \,\Big\|\, \frac{1}{M}\, \rho_{XY} + \Big( 1 - \frac{1}{M} \Big) \rho_X \otimes \rho_Y \Big) \ge (1 - \epsilon)\, \big[ 1 + (M - 1) \exp\big( -D_s^{\epsilon}(\rho_{XY} \| \rho_X \otimes \rho_Y) \big) \big]^{-1}.$$

Theorem. Let $M^*(1, \epsilon)$ be the maximum number of messages with error at most $\epsilon$. Then
$$M^*(1, \epsilon) \ge \max_{p(x),\, 0 < \delta < \epsilon} \frac{\epsilon - \delta}{1 - \epsilon}\, \exp\big( D_s^{\delta}(\rho_{XY} \| \rho_X \otimes \rho_Y) \big) + 1.$$

31 Asymptotic analysis

Relative entropy: $D(\rho \| \sigma) := \mathrm{tr}\big( \rho (\log \rho - \log \sigma) \big)$. Information variance (relative entropy variance): $V(\rho \| \sigma) := \mathrm{tr}\big( \rho (\log \rho - \log \sigma - D(\rho \| \sigma))^2 \big)$.

Theorem [Tomamichel & Hayashi 12]. For fixed $0 < \epsilon < 1$ and $\delta$ proportional to $1/\sqrt{n}$ we have
$$D_s^{\epsilon \pm \delta}(\rho^{\otimes n} \| \sigma^{\otimes n}) = n D(\rho \| \sigma) + \sqrt{n V(\rho \| \sigma)}\, \Phi^{-1}(\epsilon) + O(\log n),$$
where $\Phi(u) := \int_{-\infty}^{u} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt$ and $\Phi^{-1}(\epsilon) = \sup\{ u : \Phi(u) \le \epsilon \}$.

32 Asymptotic analysis

Theorem. For $0 < \epsilon < 1/2$ we have
$$\log M^*(n, \epsilon) \ge n C + \sqrt{n V}\, \Phi^{-1}(\epsilon) + O(\log n),$$
where $C := \max_{p(x)} I(X; Y)$ is the channel capacity and
$$V := \min_{p(x)\, \in\, \mathrm{argmax}\, I(X;Y)} V(\rho_{XB} \| \rho_X \otimes \rho_B)$$
is the channel dispersion. [Tan & Tomamichel 13]: this achievable bound is tight.
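Numerically, the second-order (normal) approximation is easy to evaluate. Here is a sketch for a classical BSC (our illustrative example; the talk's statement covers general c-q channels, and the BSC dispersion formula $V = p(1-p)\log_2^2\frac{1-p}{p}$ is the standard classical one):

```python
import numpy as np
from scipy.stats import norm

# Normal approximation: log2 M*(n, eps) ~ n*C + sqrt(n*V) * Phi^{-1}(eps).
p, n, eps = 0.11, 1000, 1e-3
h2 = lambda x: -x * np.log2(x) - (1 - x) * np.log2(1 - x)
C = 1 - h2(p)                                   # capacity, bits/use
V = p * (1 - p) * np.log2((1 - p) / p) ** 2     # dispersion, bits^2/use
rate = C + np.sqrt(V / n) * norm.ppf(eps)       # ppf = Phi^{-1}; negative for eps < 1/2
print(f"approx. best rate at n={n}, eps={eps}: {rate:.4f} bits/use (C = {C:.4f})")
```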

33–36 Summary

$m \to E \to N^{\otimes n} \to D \to \hat m$ (random encoding, pretty good measurement).

- Write the probability of successful decoding in terms of the collision relative entropy.
- Use joint convexity of the collision relative entropy.
- Lower bound the collision relative entropy by the information spectrum relative entropy.
- Compute the asymptotics of the information spectrum relative entropy:
$$\log M^*(n, \epsilon) \ge n C + \sqrt{n V}\, \Phi^{-1}(\epsilon) + O(\log n).$$

37–40 Hypothesis testing

Given either $\rho$ or $\sigma$, we want to decide which one, by applying a POVM measurement $\{F_\rho, F_\sigma\}$:
$$\Pr[\text{type I error}] = \Pr[\text{error} \mid \rho] = \mathrm{tr}(\rho F_\sigma), \qquad \Pr[\text{type II error}] = \Pr[\text{error} \mid \sigma] = \mathrm{tr}(\sigma F_\rho),$$
$$\beta_\epsilon(\rho \| \sigma) = \min\big\{ \Pr[\text{type II error}] : \Pr[\text{type I error}] \le \epsilon \big\}.$$
Pick some $M$ and define
$$F_\rho = (\rho + M\sigma)^{-1/2}\, \rho\, (\rho + M\sigma)^{-1/2}, \qquad F_\sigma = (M^{-1}\rho + \sigma)^{-1/2}\, \sigma\, (M^{-1}\rho + \sigma)^{-1/2}.$$
Note that $F_\rho + F_\sigma = I$, so this is a valid two-outcome POVM.

41–44 A one-shot hypothesis testing bound

$$1 - \Pr[\text{type I error}] = \exp D_2(\rho \| \rho + M\sigma) \ge (1 - \epsilon)\, \big[ 1 + M \exp\big( -D_s^{\epsilon}(\rho \| \sigma) \big) \big]^{-1},$$
$$1 - \Pr[\text{type II error}] = \exp D_2(\sigma \| M^{-1}\rho + \sigma) \ge (1 + M^{-1})^{-1}.$$

Theorem. $\log \beta_{2\epsilon}(\rho \| \sigma) \le -D_s^{\epsilon}(\rho \| \sigma) + \log(1/\epsilon)$.

Corollary. $\log \beta_{\epsilon}(\rho^{\otimes n} \| \sigma^{\otimes n}) \le -n D(\rho \| \sigma) - \sqrt{n V(\rho \| \sigma)}\, \Phi^{-1}(\epsilon) + o(\sqrt{n})$.

This upper bound is tight [Tomamichel & Hayashi 12, Li 12].
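The identities above can be checked numerically. The sketch below (our code, random full-rank states; exponentials of $D_2$ are compared directly, so no log base is needed) builds the two test operators, verifies $F_\rho + F_\sigma = I$, and confirms $1 - \Pr[\text{type I}] = \exp D_2(\rho \| \rho + M\sigma)$ together with the $(1 + M^{-1})^{-1}$ bound.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(3)

def rand_state(d):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def D2_exp(rho, sigma):
    # exp of the collision relative entropy: tr[rho sigma^{-1/2} rho sigma^{-1/2}].
    s = mpow(sigma, -0.5)
    return np.trace(rho @ s @ rho @ s).real

d, M = 4, 16
rho, sigma = rand_state(d), rand_state(d)
F_rho = mpow(rho + M * sigma, -0.5) @ rho @ mpow(rho + M * sigma, -0.5)
F_sig = mpow(rho / M + sigma, -0.5) @ sigma @ mpow(rho / M + sigma, -0.5)
assert np.allclose(F_rho + F_sig, np.eye(d))          # valid two-outcome POVM

type1 = np.trace(rho @ F_sig).real
type2 = np.trace(sigma @ F_rho).real
assert np.isclose(1 - type1, D2_exp(rho, rho + M * sigma))
assert np.isclose(1 - type2, D2_exp(sigma, rho / M + sigma))
assert 1 - type2 >= 1 / (1 + 1 / M) - 1e-9            # the (1 + M^{-1})^{-1} bound
print(f"type I: {type1:.4f}   type II: {type2:.4f}")
```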

45–50 Source compression with side information

Alice observes $x$; Bob holds quantum side information about $x$, described by
$$\rho_{XB} = \sum_x p(x)\, |x\rangle\langle x| \otimes \rho_x.$$
Alice sends the message $m = h(x)$, and Bob applies a measurement $\{F^m_{x'} : x' \in \mathcal{X}\}$ to recover $x$:
$$\Pr[\mathrm{succ}] = \sum_{x,m} p_X(x)\, \mathbf{1}_{\{m = h(x)\}}\, \mathrm{tr}(\rho_x F^m_x).$$
Pick a random hash function $h : \mathcal{X} \to \{1, \dots, M\}$, and let $\{F^m_x : x \in \mathcal{X}\}$ be the pretty good measurement for $\{p(x)\rho_x : x \in \mathcal{X},\ h(x) = m\}$, as in the sketch below.
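A sketch of this binning protocol (our toy instance: uniform source over 8 symbols, 4 bins, random 4-dimensional side-information states), computing $\Pr[\mathrm{succ}]$ for one sampled hash $h$:

```python
import numpy as np

rng = np.random.default_rng(4)

def rand_state(d):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def inv_sqrt(A, tol=1e-12):
    # Inverse square root on the support (bins may have rank-deficient sums).
    w, V = np.linalg.eigh(A)
    w = np.where(w > tol, w, np.inf)
    return V @ np.diag(w ** -0.5) @ V.conj().T

nX, d, M = 8, 4, 4
p = np.full(nX, 1 / nX)
rhos = [rand_state(d) for _ in range(nX)]

h = rng.integers(0, M, size=nX)              # random binning h : X -> {1,...,M}
succ = 0.0
for m in range(M):
    bin_x = np.where(h == m)[0]
    if len(bin_x) == 0:
        continue
    S = inv_sqrt(sum(p[x] * rhos[x] for x in bin_x))
    for x in bin_x:                          # PGM for {p(x) rho_x : h(x) = m}
        F = S @ (p[x] * rhos[x]) @ S
        succ += p[x] * np.trace(rhos[x] @ F).real
print(f"Pr[succ] for this h: {succ:.4f}")
```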

51–53 Source compression with side information

For the first bin, define
$$\sigma_{XB} = \sum_x p(x)\, \mathbf{1}_{\{h(x)=1\}}\, |x\rangle\langle x| \otimes \rho_x, \qquad \tau_{XB} = \Big( \sum_x \mathbf{1}_{\{h(x)=1\}}\, |x\rangle\langle x| \Big) \otimes \Big( \sum_{x'} p(x')\, \mathbf{1}_{\{h(x')=1\}}\, \rho_{x'} \Big).$$
Then $\mathbb{E}_h \Pr[\mathrm{succ}] = M\, \mathbb{E}_h \exp D_2(\sigma_{XB} \| \tau_{XB})$.

Theorem. There exists some $h : \mathcal{X} \to \{1, \dots, M\}$ such that
$$\Pr[\mathrm{succ}] \ge \exp D_2\Big( \rho_{XB} \,\Big\|\, \Big( 1 - \frac{1}{M} \Big) \rho_{XB} + \frac{1}{M}\, I_X \otimes \rho_B \Big) \ge (1 - \epsilon)\, \big[ 1 + M^{-1} \exp\big( -D_s^{\epsilon}(\rho_{XB} \| I_X \otimes \rho_B) \big) \big]^{-1}.$$

This also gives second-order asymptotics, which again are tight [Tomamichel & Hayashi 12].

54–55 Our contribution

Reformulation of the technique of Yassaee et al (2013):
- stochastic likelihood decoder = pretty good measurement;
- collision relative entropy $D_2(\cdot \| \cdot)$;
- information spectrum relative entropy $D_s^{\epsilon}(\cdot \| \cdot)$.

Applications: c-q channel coding, quantum hypothesis testing, and source compression with quantum side information. We avoided the Hayashi-Nagaoka operator inequality.

56–59 What else?

- Quantum capacity: achievability of coherent information?
- Entanglement-assisted capacity [Datta, Tomamichel, Wilde 14].
- Yassaee et al apply this method to: Gelfand-Pinsker (coding over a state-dependent channel), the broadcast channel (Marton bound), and the multiple access channel (MAC).
- We need more convexity properties in the quantum case; in the classical case Yassaee et al use the joint convexity of $(x, y) \mapsto \frac{1}{xy}$.

60 Thanks for your attention!
