FRAMES IN QUANTUM AND CLASSICAL INFORMATION THEORY

FRAMES IN QUANTUM AND CLASSICAL INFORMATION THEORY
Emina Soljanin, Mathematical Sciences Research Center, Bell Labs
April 16, 2003

A FRAME
A sequence {x_i} of vectors in a Hilbert space with the property that there are constants A, B > 0 such that
A ‖x‖² ≤ Σ_i |⟨x, x_i⟩|² ≤ B ‖x‖²
for all x in the Hilbert space. Examples?
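The frame condition can be checked numerically: the tightest constants A and B are the extreme eigenvalues of the frame operator Σ_i x_i x_iᵀ. A minimal sketch, assuming numpy and using the three "Mercedes-Benz" vectors that appear later in the talk:

```python
import numpy as np

# Three unit vectors at 120-degree angles in R^2 (the MB example).
vectors = [np.array([np.cos(t), np.sin(t)])
           for t in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]

# Frame operator sum_i x_i x_i^T; its smallest and largest eigenvalues
# are the best possible frame bounds A and B.
S = sum(np.outer(x, x) for x in vectors)
eigs = np.linalg.eigvalsh(S)
A, B = eigs.min(), eigs.max()
print(A, B)  # both 1.5: a tight frame (A = B = 3/2)
```

Since A = B, this is a tight frame; three generic vectors in the plane would give A < B but still satisfy the definition.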

A SOURCE OF INFORMATION: Classical
Discrete: produces sequences of letters. Letters belong to a finite alphabet X.
Memoryless: each letter is produced independently. The probability of letter a is P_a.
Example: coin tossing with X = {H, T}.

A SOURCE OF INFORMATION: Quantum
Letters are transmitted as d-dimensional unit-length vectors.
|e_0⟩ and |e_1⟩ are the basis vectors of the 2D space H_2:
|e_0⟩ = [1, 0]ᵀ,  |e_1⟩ = [0, 1]ᵀ
A qubit is a vector in H_2: |ψ⟩ = α|e_0⟩ + β|e_1⟩
Example: X = {0, 1, 2, 3},
|ψ_0⟩ = α_0|e_0⟩ + β_0|e_1⟩
|ψ_1⟩ = α_1|e_0⟩ + β_1|e_1⟩
|ψ_2⟩ = α_2|e_0⟩ + β_2|e_1⟩
|ψ_3⟩ = α_3|e_0⟩ + β_3|e_1⟩.

QUANTUM DISCRETE MEMORYLESS SOURCE: The Density Matrix and von Neumann Entropy
Source density matrix: ρ = Σ_{a∈X} P_a |ψ_a⟩⟨ψ_a|.
Von Neumann entropy of the source: S(ρ) = −Tr ρ log ρ = −Σ_i λ_i log λ_i, where λ_i are the eigenvalues of ρ.

QUANTUM DISCRETE MEMORYLESS SOURCE: MB (Mercedes-Benz) Example, d = 2
X = {1, 2, 3},  P_1 = P_2 = P_3 = 1/3
|ψ_1⟩ = [1, 0]ᵀ,  |ψ_2⟩ = [−1/2, √3/2]ᵀ,  |ψ_3⟩ = [−1/2, −√3/2]ᵀ
ρ = (1/3)|ψ_1⟩⟨ψ_1| + (1/3)|ψ_2⟩⟨ψ_2| + (1/3)|ψ_3⟩⟨ψ_3| = (1/2) I
S(ρ) = 1
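The MB-example claims are easy to verify numerically. A sketch assuming numpy, with one sign convention for the trine vectors (any rotation of them works the same way):

```python
import numpy as np

# Equal-probability mixture of the three MB (trine) states.
psi = [np.array([1.0, 0.0]),
       np.array([-0.5, np.sqrt(3) / 2]),
       np.array([-0.5, -np.sqrt(3) / 2])]
rho = sum(np.outer(v, v) / 3 for v in psi)

# Von Neumann entropy from the eigenvalues, in bits (log base 2).
eigs = np.linalg.eigvalsh(rho)
S = -sum(lam * np.log2(lam) for lam in eigs if lam > 0)
print(np.round(rho, 12))  # 0.5 * identity
print(S)                  # 1.0
```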

QUANTUM DISCRETE MEMORYLESS SOURCE: Vector Sequences
Sequences of length n are dⁿ-dimensional vectors. Source vector-sequence (state): |Ψ_x⟩ = |ψ_{x_1}⟩ ⊗ ··· ⊗ |ψ_{x_n}⟩,  x_i ∈ X.
Among all states that come from the source, we can reliably distinguish 2^{n(S(ρ) − ε_n)} of them.

SENDING PACKETS OVER LOSSY NETWORKS: MB Example, d = 2
Let |φ⟩ = [φ_0, φ_1]ᵀ and let |ψ_1⟩, |ψ_2⟩, |ψ_3⟩ be the MB vectors.
Send |φ⟩ by sending φ_0 and φ_1; or send |φ⟩ by sending ⟨ψ_1|φ⟩, ⟨ψ_2|φ⟩, ⟨ψ_3|φ⟩.
|φ⟩ = (2/3)(⟨ψ_1|φ⟩|ψ_1⟩ + ⟨ψ_2|φ⟩|ψ_2⟩ + ⟨ψ_3|φ⟩|ψ_3⟩)
    = (2/3)(|ψ_1⟩⟨ψ_1| + |ψ_2⟩⟨ψ_2| + |ψ_3⟩⟨ψ_3|)|φ⟩
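The point of the redundant encoding is robustness to packet loss: the three frame coefficients determine |φ⟩, and any two of them still do. A sketch assuming numpy (the least-squares recovery step is one illustrative way to invert the surviving coefficients, not the only one):

```python
import numpy as np

# Rows are the MB frame vectors; phi is an arbitrary payload vector.
psi = np.array([[1.0, 0.0],
                [-0.5, np.sqrt(3) / 2],
                [-0.5, -np.sqrt(3) / 2]])
phi = np.array([0.3, -0.7])

coeffs = psi @ phi                  # the three transmitted "packets"
rec_all = (2 / 3) * psi.T @ coeffs  # reconstruction from all three
print(np.allclose(rec_all, phi))    # True

# If one packet is lost, the two surviving vectors still span R^2,
# so phi is recoverable (at worse noise sensitivity) by least squares.
rec_two = np.linalg.lstsq(psi[:2], coeffs[:2], rcond=None)[0]
print(np.allclose(rec_two, phi))    # True
```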

SYNCHRONOUS CDMA SYSTEMS: K Users and Processing Gain N
Each user has a signature, an N × 1 (length-N) complex vector. Let s_i be the signature and p_i the power of user i, for 1 ≤ i ≤ K.
The received vector is
r = Σ_{i=1}^{K} √p_i b_i s_i + n
where b_i is the information symbol of user i, E[b_i] = 0, E[b_i²] = 1; n is the (Gaussian) noise vector, E[n] = 0, E[n n†] = σ² I_N.

SYNCHRONOUS CDMA SYSTEMS: The Sum Capacity
Let user signatures and powers be given: S = [s_1, ..., s_K] and P = diag{p_1, ..., p_K}.
The sum capacity: C_sum = (1/2) log det(I_N + σ⁻² S P S†)
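The sum-capacity formula is a one-line determinant computation once S and P are assembled. A sketch assuming numpy, with made-up real-valued signatures, powers, and noise variance (K = 3 users, processing gain N = 2):

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, sigma2 = 3, 2, 0.1

S = rng.normal(size=(N, K))
S /= np.linalg.norm(S, axis=0)   # unit-norm signature columns
P = np.diag([1.0, 0.5, 0.25])    # user powers (illustrative values)

# C_sum = (1/2) log det(I_N + sigma^{-2} S P S^T), in bits here.
C_sum = 0.5 * np.log2(np.linalg.det(np.eye(N) + S @ P @ S.T / sigma2))
print(C_sum)
```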

A COMMON MODEL: The Object of Interest
An ensemble:
K d-dimensional unit-length vectors |ψ_i⟩,
K real numbers p_i ≥ 0 such that p_1 + ··· + p_K = 1.
Two matrices: F is the d × K matrix whose columns are √p_i |ψ_i⟩. Thus
F F† = Σ_{i=1}^{K} p_i |ψ_i⟩⟨ψ_i|.
F F† is the density matrix (frame operator); F† F is the Gram matrix.

A FRAME
An ensemble {√p_i |ψ_i⟩} of vectors in a Hilbert space with the property that there are constants A, B > 0 such that
A ⟨ϕ|ϕ⟩ ≤ Σ_i p_i |⟨ϕ|ψ_i⟩|² ≤ B ⟨ϕ|ϕ⟩
for all |ϕ⟩ in the Hilbert space. Equivalently, A I_d ≤ F F† ≤ B I_d.

A COMMON MODEL: Information Measures
The von Neumann entropy: S = −Tr F F† log F F†
The sum capacity: C_sum = (1/2) log det(I_d + d σ⁻² F F†)
Both are maximized by F F† = (1/d) I_d.
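The claim that both measures peak at F F† = (1/d) I_d can be illustrated by comparing a tight ensemble against a skewed one. A sketch assuming numpy, with the MB trine and an arbitrary unbalanced weighting chosen for contrast:

```python
import numpy as np

def measures(vectors, probs, sigma2=0.1):
    """Von Neumann entropy (bits) and sum capacity of an ensemble."""
    d = len(vectors[0])
    ff = sum(p * np.outer(v, v) for p, v in zip(probs, vectors))
    eigs = np.linalg.eigvalsh(ff)
    S = -sum(l * np.log2(l) for l in eigs if l > 1e-12)
    C = 0.5 * np.log2(np.linalg.det(np.eye(d) + d * ff / sigma2))
    return S, C

mb = [np.array([1.0, 0.0]),
      np.array([-0.5, np.sqrt(3) / 2]),
      np.array([-0.5, -np.sqrt(3) / 2])]

S_tight, C_tight = measures(mb, [1/3, 1/3, 1/3])  # F F^dag = I/2
S_skew, C_skew = measures(mb, [0.8, 0.1, 0.1])    # unbalanced weights
print(S_tight >= S_skew, C_tight >= C_skew)       # True True
```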

HOW TO MIX A DENSITY MATRIX: F F† = Σ_{i=1}^{K} p_i |ψ_i⟩⟨ψ_i|
Two problems:
classification of ensembles having a given density matrix,
characterization of probability distributions (PDs) consistent with a given density matrix.
Let {p_i} be a PD with p_1 ≥ p_2 ≥ ··· ≥ p_K, and let ρ be a density matrix with eigenvalues λ_1 ≥ λ_2 ≥ ··· ≥ λ_d. There exist vectors |ψ_i⟩ such that ρ = Σ_{i=1}^{K} p_i |ψ_i⟩⟨ψ_i| iff
Σ_{i=1}^{n} p_i ≤ Σ_{i=1}^{n} λ_i  for all n < d.
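The majorization condition is a short loop over partial sums. A sketch assuming numpy; the helper name `can_mix` is ours, not from the talk:

```python
import numpy as np

def can_mix(probs, eigenvalues):
    """True iff a PD {p_i} can realize a density matrix with the given
    eigenvalues: sorted partial sums of p must not exceed those of lambda."""
    p = np.sort(probs)[::-1]
    lam = np.sort(eigenvalues)[::-1]
    d = len(lam)
    return all(p[:n + 1].sum() <= lam[:n + 1].sum() + 1e-12
               for n in range(min(len(p), d) - 1))

# For rho = I/2 (eigenvalues 1/2, 1/2) only p_1 <= 1/2 matters:
print(can_mix([1/3, 1/3, 1/3], [0.5, 0.5]))  # True
print(can_mix([0.8, 0.1, 0.1], [0.5, 0.5]))  # False (p_1 > 1/2)
```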

HOW TO MIX A DENSITY MATRIX: F F† = (1/d) I_d
For ρ = (1/d) I_d, the condition Σ_{i=1}^{n} p_i ≤ Σ_{i=1}^{n} λ_i for all n < d becomes p_1 ≤ 1/d.
In CDMA, user i is said to be oversized if
p_i > (1/(d − i)) Σ_{j=i+1}^{K} p_j.

INTERFERENCE MEASURE IN CDMA: Total Squared Correlation (TSC)
Welch's lower bound on the TSC (the frame potential):
Σ_{i=1}^{K} Σ_{j=1}^{K} p_i p_j |⟨ψ_i|ψ_j⟩|² ≥ 1/d,
with equality iff Σ_{i=1}^{K} p_i |ψ_i⟩⟨ψ_i| = (1/d) I_d.
WBE (Welch-bound-equality) sequences minimize the TSC, and maximize the sum capacity and the von Neumann entropy.
What does reducing the TSC mean?
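For the MB ensemble the Welch bound is met with equality, so the trine is a WBE sequence set. A sketch assuming numpy:

```python
import numpy as np

# MB trine with equal weights p_i = 1/3.
psi = [np.array([1.0, 0.0]),
       np.array([-0.5, np.sqrt(3) / 2]),
       np.array([-0.5, -np.sqrt(3) / 2])]
p = [1/3, 1/3, 1/3]

# TSC / frame potential: sum_{i,j} p_i p_j |<psi_i|psi_j>|^2.
tsc = sum(p[i] * p[j] * np.dot(psi[i], psi[j]) ** 2
          for i in range(3) for j in range(3))
print(tsc)  # 0.5, which equals the Welch bound 1/d for d = 2
```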

SOME COMMUNICATION CHANNELS
Binary Symmetric Channel: inputs 0 and 1; each input is received correctly with probability 1 − w and flipped with probability w.
Noisy Typewriter: each input letter (B, H, N, ...) is received either as itself or as a neighboring letter.

A CQ COMMUNICATION CHANNEL: A Probabilistic Device
Inputs: vectors |ψ_i⟩, i ∈ X. Outputs: vectors |ϕ_j⟩ determined by the chosen measurement. The transition probabilities P(j|i) are also determined by the chosen measurement.

QUANTUM MEASUREMENT: Von Neumann's Measurement
A set of pairwise orthogonal projection operators {Π_i}. They form a complete resolution of the identity: Σ_i Π_i = I.
For input |ψ_j⟩, output Π_i |ψ_j⟩ occurs with probability ⟨ψ_j|Π_i|ψ_j⟩.
Example: Π_0 = |ψ_1⟩⟨ψ_1|, Π_1 = |ψ_2⟩⟨ψ_2|; input |ψ⟩ yields output |ψ_1⟩ with probability |⟨ψ|ψ_1⟩|².
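A projective measurement is a few lines of linear algebra. A sketch assuming numpy, measuring in the {|e_0⟩, |e_1⟩} basis:

```python
import numpy as np

# Projectors onto the computational basis; they resolve the identity.
e0, e1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
Pi = [np.outer(e0, e0), np.outer(e1, e1)]
assert np.allclose(Pi[0] + Pi[1], np.eye(2))  # complete resolution of I

# Outcome probabilities <psi| Pi_i |psi> for a sample qubit.
psi = np.array([np.sqrt(0.2), np.sqrt(0.8)])  # alpha e_0 + beta e_1
probs = [psi @ P @ psi for P in Pi]
print(probs)  # [0.2, 0.8]
```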

QUANTUM MEASUREMENT: Positive Operator-Valued Measure
Any set of positive-semidefinite operators {E_i} that form a complete resolution of the identity: Σ_i E_i = I.
For input |ψ_j⟩, the output associated with E_i occurs with probability ⟨ψ_j|E_i|ψ_j⟩.
Example: for input |ψ_1⟩, a measurement with elements built from vectors |ϕ_1⟩, |ϕ_2⟩, |ϕ_3⟩ outputs |ϕ_i⟩ with probability proportional to |⟨ψ_1|ϕ_i⟩|².
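A POVM can have more outcomes than the dimension, which no projective measurement allows. A sketch assuming numpy, building a three-outcome POVM from the MB trine; it resolves the identity precisely because the trine is a tight frame:

```python
import numpy as np

# POVM elements E_i = (2/3) |psi_i><psi_i| from the MB trine.
psi = [np.array([1.0, 0.0]),
       np.array([-0.5, np.sqrt(3) / 2]),
       np.array([-0.5, -np.sqrt(3) / 2])]
E = [(2 / 3) * np.outer(v, v) for v in psi]
assert np.allclose(sum(E), np.eye(2))  # complete resolution of I

# Outcome probabilities <phi| E_i |phi> for a sample input state.
phi = np.array([0.0, 1.0])
probs = [phi @ Ei @ phi for Ei in E]
print(probs, sum(probs))  # three outcomes, probabilities sum to 1
```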

OPTIMAL QUANTUM MEASUREMENTS: Some Open Problems
POVMs minimizing the detection error-probability.
POVMs attaining the accessible information (number of elements).
Example: the MB vectors
|ψ_1⟩ = [1, 0]ᵀ,  |ψ_2⟩ = [−1/2, √3/2]ᵀ,  |ψ_3⟩ = [−1/2, −√3/2]ᵀ,
and an α-parametrized deformation of the same three vectors.

A SOURCE OF INFORMATION: Classical
Discrete: produces sequences of letters. Letters belong to a finite alphabet X. Memoryless: each letter is produced independently. The probability of letter a is P_a. Example: coin tossing with X = {H, T}.

CLASSICAL DISCRETE MEMORYLESS SOURCE: Sequences and Large Sets
Source sequence x = x_1, x_2, ..., x_n is in Xⁿ. N(a|x) denotes the number of occurrences of a in x.
Consider all sequences x for which |(1/n) N(a|x) − P_a| ≤ δ for every a ∈ X. They form the set of typical sequences T^n_{P,δ}.
The set T^n_{P,δ} is probabilistically large: Pⁿ(T^n_{P,δ}) ≥ 1 − ɛ_n.

DISCRETE MEMORYLESS SOURCE: Shannon Entropy
H(P) = −Σ_{a∈X} P_a log P_a.
The set T^n_{P,δ} contains approximately 2^{nH(P)} sequences:
2^{n[H(P) − ɛ_n]} ≤ |T^n_{P,δ}| ≤ 2^{n[H(P) + ɛ_n]}
The probability of a typical sequence x is approximately 2^{−nH(P)}:
2^{−n[H(P) + ɛ_n]} ≤ P_x ≤ 2^{−n[H(P) − ɛ_n]}
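The typical-set bounds can be seen directly by brute force at small n. A sketch using only the standard library, with a made-up biased coin P_H = 0.8 and δ = 0.1 (at this small n the typical set already carries over half the probability; the asymptotics only bite as n grows):

```python
from itertools import product
from math import log2, prod

# Biased coin and its entropy, about 0.722 bits.
P = {'H': 0.8, 'T': 0.2}
H = -sum(p * log2(p) for p in P.values())
n, delta = 12, 0.1

# Enumerate all 2^12 sequences; keep those with near-true frequencies.
typical = [x for x in product('HT', repeat=n)
           if all(abs(x.count(a) / n - P[a]) <= delta for a in P)]
prob = sum(prod(P[a] for a in x) for x in typical)
print(len(typical), 2 ** (n * H), prob)
```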

QUANTUM DISCRETE MEMORYLESS SOURCE: Vector Sequences
Sequences of length n are dⁿ-dimensional vectors: tensor products of basis vectors, from |e_0⟩ ⊗ |e_0⟩ ⊗ ··· ⊗ |e_0⟩ through |e_1⟩ ⊗ |e_1⟩ ⊗ ··· ⊗ |e_1⟩ for d = 2.
Source vector-sequence (state) |Ψ_x⟩ = |ψ_{x_1}⟩ ⊗ |ψ_{x_2}⟩ ⊗ ··· ⊗ |ψ_{x_n}⟩, x_i ∈ X, appears with probability P_x = P_{x_1} P_{x_2} ··· P_{x_n}.
Typical states |Ψ_x⟩ ∈ H_2^⊗n correspond to typical sequences x. There are approximately 2^{nH(P)} typical states.

QUANTUM DISCRETE MEMORYLESS SOURCE: Typical Subspace
Typical states |Ψ_x⟩ ∈ H_2^⊗n live in the typical subspace Λ_n of H_2^⊗n; each state decomposes as
|Ψ_x⟩ = |Ψ_x^{Λ_n}⟩ + |Ψ_x^{Λ_n^⊥}⟩.
The dimension of Λ_n is approximately 2^{nS(ρ)}.

CODE C = {x_1, ..., x_M} ⊂ Xⁿ
F is the dⁿ × M matrix whose columns are |Ψ_{x_i}⟩/√M. Thus
F F† = (1/M) Σ_{i=1}^{M} |Ψ_{x_i}⟩⟨Ψ_{x_i}|.
|Ψ_{x_1}⟩, ..., |Ψ_{x_M}⟩ span U, an r-dimensional subspace of H_{dⁿ}.
Perform the SVD of F and define a scaled projection onto U:
F = Σ_{k=1}^{r} √λ_k |u_k⟩⟨v_k|,  F F† = Σ_{k=1}^{r} λ_k |u_k⟩⟨u_k|,  P_U = Σ_{k=1}^{r} (1/r) |u_k⟩⟨u_k|
How far is F F† from P_U?

DISTANCES BETWEEN DENSITY MATRICES σ AND ω
Trace distance: D(σ, ω) = (1/2) Tr|σ − ω|, where |A| denotes the positive square root of A†A.
Uhlmann fidelity: F(σ, ω) = {Tr[(√σ ω √σ)^{1/2}]}².
1 − √F(σ, ω) ≤ D(σ, ω) ≤ √(1 − F(σ, ω))
Frobenius (Hilbert-Schmidt)?
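Both distances, and the inequality relating them, can be checked on a concrete qubit pair. A sketch assuming numpy, with two made-up density matrices; the matrix square roots are taken via the eigendecomposition since everything here is Hermitian PSD:

```python
import numpy as np

def psd_sqrt(M):
    """Positive square root of a Hermitian PSD matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

sigma = np.diag([0.8, 0.2])
omega = np.array([[0.5, 0.2], [0.2, 0.5]])

# Trace distance: half the sum of |eigenvalues| of sigma - omega.
D = 0.5 * np.abs(np.linalg.eigvalsh(sigma - omega)).sum()

# Uhlmann fidelity: {Tr[(sqrt(sigma) omega sqrt(sigma))^{1/2}]}^2.
rs = psd_sqrt(sigma)
F = psd_sqrt(rs @ omega @ rs).trace() ** 2

print(D, F)
print(1 - np.sqrt(F) <= D <= np.sqrt(1 - F))  # True
```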

DISTANCES BETWEEN PDs: An Example
A_N = {a_1, ..., a_N},  N = 2^K and n = 2^k, with k/K = c < 1.
Distributions P and Q:
P(a_i) = 1/N  and  Q(a_i) = 1/n for 1 ≤ i ≤ n,  Q(a_i) = 0 for n + 1 ≤ i ≤ N.
P({a_{n+1}, ..., a_N}) → 1 as k, K → ∞.
(1/2) Σ_i |P(a_i) − Q(a_i)| → 1  and  Σ_i |P(a_i) − Q(a_i)|² → 0.
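The two limits pull in opposite directions, which is the point of the example: the choice of metric matters. A sketch assuming numpy, with c = 1/2:

```python
import numpy as np

# P uniform on N = 2^K points, Q uniform on the first n = 2^k of them.
for K in (8, 12, 16):
    k = K // 2                      # c = k/K = 1/2
    N, n = 2 ** K, 2 ** k
    P = np.full(N, 1 / N)
    Q = np.concatenate([np.full(n, 1 / n), np.zeros(N - n)])
    l1 = 0.5 * np.abs(P - Q).sum()  # variational distance -> 1
    l2 = np.sqrt(((P - Q) ** 2).sum())  # Euclidean distance -> 0
    print(K, round(l1, 4), round(l2, 6))
```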

A BASIS FOR Λ_n
How far is F F† from P_U?
Σ_{k=1}^{r} [√λ_k − 1/√r]² ≤ r Σ_{k=1}^{r} (λ_k − 1/r)² = r Σ_{k=1}^{r} λ_k² − 1
= (r/M²) Σ_{i=1}^{M} Σ_{j=1}^{M} |⟨Ψ_{x_i}|Ψ_{x_j}⟩|² − 1
≤ (1/M) Σ_{i=1}^{M} Σ_{j≠i} |⟨Ψ_{x_i}|Ψ_{x_j}⟩|²  (using r ≤ M)
Use the random coding argument!

A BASIS FOR Λ_n: The Random Coding Argument
Averaging over all codes:
E{ Σ_{i=1}^{M} Σ_{j≠i} |⟨Ψ_{x_i}|Ψ_{x_j}⟩|² } = M(M − 1) Tr(ρ^⊗n ρ^⊗n)
There exists a code C with M codewords such that
Σ_{k=1}^{r} [√λ_k − 1/√r]² ≤ M Tr(ρ^⊗n ρ^⊗n)
There exists a code C with |C| = 2^{nR} such that, on Λ_n,
Σ_{k=1}^{r} [√λ_k − 1/√r]² ≤ 2^{−n(S(ρ) − ε_n − R)}
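The per-pair average underlying the identity, E|⟨Ψ_x|Ψ_y⟩|² = Tr((ρ^⊗n)²) = (Tr ρ²)ⁿ, can be checked exactly for the MB source at small n. A sketch assuming numpy; exact enumeration over all codeword pairs replaces the expectation, which is valid here because the letters are drawn uniformly (P_1 = P_2 = P_3 = 1/3):

```python
import numpy as np
from itertools import product

psi = [np.array([1.0, 0.0]),
       np.array([-0.5, np.sqrt(3) / 2]),
       np.array([-0.5, -np.sqrt(3) / 2])]
n = 3

def overlap_sq(x, y):
    # |<Psi_x|Psi_y>|^2 factorizes over positions for product states.
    return np.prod([np.dot(psi[a], psi[b]) ** 2 for a, b in zip(x, y)])

# Average over all 3^n x 3^n equiprobable codeword pairs.
avg = np.mean([overlap_sq(x, y)
               for x in product(range(3), repeat=n)
               for y in product(range(3), repeat=n)])
print(avg, 0.5 ** n)  # both 0.125, since Tr rho^2 = 1/2 for the MB source
```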

COMBINATORICS AND GEOMETRY: The MB Example
X = {1, 2, 3},  P_1 = P_2 = P_3 = 1/3,  d = 2
|ψ_1⟩ = [1, 0]ᵀ,  |ψ_2⟩ = [−1/2, √3/2]ᵀ,  |ψ_3⟩ = [−1/2, −√3/2]ᵀ
ρ = (1/3)(|ψ_1⟩⟨ψ_1| + |ψ_2⟩⟨ψ_2| + |ψ_3⟩⟨ψ_3|) = (1/2) I,  S(ρ) = 1
There are 3ⁿ typical vectors forming a frame in H_2^⊗n. About 2ⁿ of those vectors form a basis of H_2^⊗n. Which ones?


More information

Lecture 4: Postulates of quantum mechanics

Lecture 4: Postulates of quantum mechanics Lecture 4: Postulates of quantum mechanics Rajat Mittal IIT Kanpur The postulates of quantum mechanics provide us the mathematical formalism over which the physical theory is developed. For people studying

More information

A Holevo-type bound for a Hilbert Schmidt distance measure

A Holevo-type bound for a Hilbert Schmidt distance measure Journal of Quantum Information Science, 205, *,** Published Online **** 204 in SciRes. http://www.scirp.org/journal/**** http://dx.doi.org/0.4236/****.204.***** A Holevo-type bound for a Hilbert Schmidt

More information

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet. EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on

More information

Lecture Notes 1: Vector spaces

Lecture Notes 1: Vector spaces Optimization-based data analysis Fall 2017 Lecture Notes 1: Vector spaces In this chapter we review certain basic concepts of linear algebra, highlighting their application to signal processing. 1 Vector

More information

5. Communication resources

5. Communication resources 5. Communication resources Classical channel Quantum channel Entanglement How does the state evolve under LOCC? Properties of maximally entangled states Bell basis Quantum dense coding Quantum teleportation

More information

Useful Concepts from Information Theory

Useful Concepts from Information Theory Chapter 2 Useful Concepts from Information Theory 2.1 Quantifying Information 2.1.1 The entropy It turns out that there is a way to quantify the intuitive notion that some messages contain more information

More information

Quantum Hadamard channels (I)

Quantum Hadamard channels (I) .... Quantum Hadamard channels (I) Vlad Gheorghiu Department of Physics Carnegie Mellon University Pittsburgh, PA 15213, U.S.A. July 8, 2010 Vlad Gheorghiu (CMU) Quantum Hadamard channels (I) July 8, 2010

More information

Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2

Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2 Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2 arxiv:quant-ph/0410063v1 8 Oct 2004 Nilanjana Datta Statistical Laboratory Centre for Mathematical Sciences University of Cambridge

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Quantum Fisher Information. Shunlong Luo Beijing, Aug , 2006

Quantum Fisher Information. Shunlong Luo Beijing, Aug , 2006 Quantum Fisher Information Shunlong Luo luosl@amt.ac.cn Beijing, Aug. 10-12, 2006 Outline 1. Classical Fisher Information 2. Quantum Fisher Information, Logarithm 3. Quantum Fisher Information, Square

More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information

Introduction to quantum information processing

Introduction to quantum information processing Introduction to quantum information processing Measurements and quantum probability Brad Lackey 25 October 2016 MEASUREMENTS AND QUANTUM PROBABILITY 1 of 22 OUTLINE 1 Probability 2 Density Operators 3

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

Ping Pong Protocol & Auto-compensation

Ping Pong Protocol & Auto-compensation Ping Pong Protocol & Auto-compensation Adam de la Zerda For QIP seminar Spring 2004 02.06.04 Outline Introduction to QKD protocols + motivation Ping-Pong protocol Security Analysis for Ping-Pong Protocol

More information

Lecture 7. Union bound for reducing M-ary to binary hypothesis testing

Lecture 7. Union bound for reducing M-ary to binary hypothesis testing Lecture 7 Agenda for the lecture M-ary hypothesis testing and the MAP rule Union bound for reducing M-ary to binary hypothesis testing Introduction of the channel coding problem 7.1 M-ary hypothesis testing

More information

Lecture 18: Shanon s Channel Coding Theorem. Lecture 18: Shanon s Channel Coding Theorem

Lecture 18: Shanon s Channel Coding Theorem. Lecture 18: Shanon s Channel Coding Theorem Channel Definition (Channel) A channel is defined by Λ = (X, Y, Π), where X is the set of input alphabets, Y is the set of output alphabets and Π is the transition probability of obtaining a symbol y Y

More information

Entanglement-Assisted Capacity of a Quantum Channel and the Reverse Shannon Theorem

Entanglement-Assisted Capacity of a Quantum Channel and the Reverse Shannon Theorem IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 10, OCTOBER 2002 2637 Entanglement-Assisted Capacity of a Quantum Channel the Reverse Shannon Theorem Charles H. Bennett, Peter W. Shor, Member, IEEE,

More information

Information measures, entanglement and quantum evolution

Information measures, entanglement and quantum evolution Information measures, entanglement and quantum evolution Claudia Zander Faculty of Natural & Agricultural Sciences University of Pretoria Pretoria Submitted in partial fulfilment of the requirements for

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

Chapter 5. Density matrix formalism

Chapter 5. Density matrix formalism Chapter 5 Density matrix formalism In chap we formulated quantum mechanics for isolated systems. In practice systems interect with their environnement and we need a description that takes this feature

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

Quantum Stochastic Maps and Frobenius Perron Theorem

Quantum Stochastic Maps and Frobenius Perron Theorem Quantum Stochastic Maps and Frobenius Perron Theorem Karol Życzkowski in collaboration with W. Bruzda, V. Cappellini, H.-J. Sommers, M. Smaczyński Institute of Physics, Jagiellonian University, Cracow,

More information

Noisy-Channel Coding

Noisy-Channel Coding Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/05264298 Part II Noisy-Channel Coding Copyright Cambridge University Press 2003.

More information

Quantum Entanglement and Measurement

Quantum Entanglement and Measurement Quantum Entanglement and Measurement Haye Hinrichsen in collaboration with Theresa Christ University of Würzburg, Germany 2nd Workhop on Quantum Information and Thermodynamics Korea Institute for Advanced

More information

Lecture 6: Quantum error correction and quantum capacity

Lecture 6: Quantum error correction and quantum capacity Lecture 6: Quantum error correction and quantum capacity Mark M. Wilde The quantum capacity theorem is one of the most important theorems in quantum hannon theory. It is a fundamentally quantum theorem

More information

Optimal Sequences and Sum Capacity of Synchronous CDMA Systems

Optimal Sequences and Sum Capacity of Synchronous CDMA Systems Optimal Sequences and Sum Capacity of Synchronous CDMA Systems Pramod Viswanath and Venkat Anantharam {pvi, ananth}@eecs.berkeley.edu EECS Department, U C Berkeley CA 9470 Abstract The sum capacity of

More information

arxiv:quant-ph/ v2 14 May 2002

arxiv:quant-ph/ v2 14 May 2002 arxiv:quant-ph/0106052v2 14 May 2002 Entanglement-Assisted Capacity of a Quantum Channel and the Reverse Shannon Theorem Charles H. Bennett, Peter W. Shor, John A. Smolin, and Ashish V. Thapliyal Abstract

More information

Introduction To Information Theory

Introduction To Information Theory Introduction To Information Theory Edward Witten PiTP 2018 We will start with a very short introduction to classical information theory (Shannon theory). Suppose that you receive a message that consists

More information

Remarks on the Additivity Conjectures for Quantum Channels

Remarks on the Additivity Conjectures for Quantum Channels Contemporary Mathematics Remarks on the Additivity Conjectures for Quantum Channels Christopher King Abstract. In this article we present the statements of the additivity conjectures for quantum channels,

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

Quantum rate distortion, reverse Shannon theorems, and source-channel separation

Quantum rate distortion, reverse Shannon theorems, and source-channel separation Quantum rate distortion, reverse Shannon theorems, and source-channel separation ilanjana Datta, Min-Hsiu Hsieh, and Mark M. Wilde arxiv:08.4940v3 [quant-ph] 9 Aug 202 Abstract We derive quantum counterparts

More information

CS/Ph120 Homework 4 Solutions

CS/Ph120 Homework 4 Solutions CS/Ph10 Homework 4 Solutions November 3, 016 Problem 1: Robustness of GHZ and W states, part Solution: Due to Bolton Bailey a For the GHZ state, we have T r N GHZ N GHZ N = 1 0 N 1 0 N 1 + 1 N 1 1 N 1

More information

A One-to-One Code and Its Anti-Redundancy

A One-to-One Code and Its Anti-Redundancy A One-to-One Code and Its Anti-Redundancy W. Szpankowski Department of Computer Science, Purdue University July 4, 2005 This research is supported by NSF, NSA and NIH. Outline of the Talk. Prefix Codes

More information

1. Basic rules of quantum mechanics

1. Basic rules of quantum mechanics 1. Basic rules of quantum mechanics How to describe the states of an ideally controlled system? How to describe changes in an ideally controlled system? How to describe measurements on an ideally controlled

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information

Gilles Brassard. Université de Montréal

Gilles Brassard. Université de Montréal Gilles Brassard Université de Montréal Gilles Brassard Université de Montréal VOLUME 76, NUMBER 5 P H Y S I C A L R E V I E W L E T T E R S 29 JANUARY 1996 Purification of Noisy Entanglement and Faithful

More information

Exemple of Hilbert spaces with dimension equal to two. *Spin1/2particle

Exemple of Hilbert spaces with dimension equal to two. *Spin1/2particle University Paris-Saclay - IQUPS Optical Quantum Engineering: From fundamentals to applications Philippe Grangier, Institut d Optique, CNRS, Ecole Polytechnique. Eemple of Hilbert spaces with dimension

More information

Kapitza Fall 2017 NOISY QUANTUM MEASUREMENT

Kapitza Fall 2017 NOISY QUANTUM MEASUREMENT Kapitza Fall 2017 NOIY QUANTUM MEAUREMENT Ben altzman Q.I.T. TERM PAPER Professor arada Raeev December 10 th 2017 1 Introduction Not only does God play dice but... where they can t be seen. he sometimes

More information

Physics 239/139 Spring 2018 Assignment 3 Solutions

Physics 239/139 Spring 2018 Assignment 3 Solutions University of California at San Diego Department of Physics Prof. John McGreevy Physics 239/139 Spring 2018 Assignment 3 Solutions Due 12:30pm Monday, April 23, 2018 1. Mutual information bounds correlations.

More information

Algebraic Theory of Entanglement

Algebraic Theory of Entanglement Algebraic Theory of (arxiv: 1205.2882) 1 (in collaboration with T.R. Govindarajan, A. Queiroz and A.F. Reyes-Lega) 1 Physics Department, Syracuse University, Syracuse, N.Y. and The Institute of Mathematical

More information

Lecture 2. Capacity of the Gaussian channel

Lecture 2. Capacity of the Gaussian channel Spring, 207 5237S, Wireless Communications II 2. Lecture 2 Capacity of the Gaussian channel Review on basic concepts in inf. theory ( Cover&Thomas: Elements of Inf. Theory, Tse&Viswanath: Appendix B) AWGN

More information

C.M. Liu Perceptual Signal Processing Lab College of Computer Science National Chiao-Tung University

C.M. Liu Perceptual Signal Processing Lab College of Computer Science National Chiao-Tung University Quantization C.M. Liu Perceptual Signal Processing Lab College of Computer Science National Chiao-Tung University http://www.csie.nctu.edu.tw/~cmliu/courses/compression/ Office: EC538 (03)5731877 cmliu@cs.nctu.edu.tw

More information

Multi User Detection I

Multi User Detection I January 12, 2005 Outline Overview Multiple Access Communication Motivation: What is MU Detection? Overview of DS/CDMA systems Concept and Codes used in CDMA CDMA Channels Models Synchronous and Asynchronous

More information

ELEC546 MIMO Channel Capacity

ELEC546 MIMO Channel Capacity ELEC546 MIMO Channel Capacity Vincent Lau Simplified Version.0 //2004 MIMO System Model Transmitter with t antennas & receiver with r antennas. X Transmitted Symbol, received symbol Channel Matrix (Flat

More information

arxiv:quant-ph/ v2 3 Sep 2002

arxiv:quant-ph/ v2 3 Sep 2002 Quantum Theory and Classical Information arxiv:quant-ph/009v 3 Sep 00 Göran Einarsson Royal Institute of Technology, Telecommunication Theory, Dept. of Signals, Sensors and Systems, 0044 Stockholm, Sweden.

More information

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7 Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling

More information

Channel capacity. Outline : 1. Source entropy 2. Discrete memoryless channel 3. Mutual information 4. Channel capacity 5.

Channel capacity. Outline : 1. Source entropy 2. Discrete memoryless channel 3. Mutual information 4. Channel capacity 5. Channel capacity Outline : 1. Source entropy 2. Discrete memoryless channel 3. Mutual information 4. Channel capacity 5. Exercices Exercise session 11 : Channel capacity 1 1. Source entropy Given X a memoryless

More information

Appendix B Information theory from first principles

Appendix B Information theory from first principles Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes

More information

Introduction to Quantum Information Processing QIC 710 / CS 768 / PH 767 / CO 681 / AM 871

Introduction to Quantum Information Processing QIC 710 / CS 768 / PH 767 / CO 681 / AM 871 Introduction to Quantum Information Processing QIC 710 / CS 768 / PH 767 / CO 681 / AM 871 Lecture 9 (2017) Jon Yard QNC 3126 jyard@uwaterloo.ca http://math.uwaterloo.ca/~jyard/qic710 1 More state distinguishing

More information