Secret Key Agreement: General Capacity and Second-Order Asymptotics


Masahito Hayashi, Himanshu Tyagi, and Shun Watanabe

Abstract: We revisit the problem of secret key agreement using interactive public communication for two parties and propose a new secret key agreement protocol. The protocol attains the secret key capacity for general observations and attains the second-order asymptotic term in the maximum length of a secret key for independent and identically distributed observations. Our converse proofs rely on a recently established upper bound for secret key lengths.

I. INTRODUCTION

Two parties observing random variables (RVs) X and Y seek to agree on a secret key. They can communicate interactively over an error-free, authenticated, albeit insecure, communication channel of unlimited capacity. The secret key must be concealed from an eavesdropper with access to the communication and additional side information Z. What is the maximum length S(X, Y|Z) of a secret key that the parties can agree upon? A study of this question was initiated by Maurer [14] and Ahlswede and Csiszár [1] for the case where the observations of the parties and the eavesdropper consist of n independent and identically distributed (IID) repetitions (X^n, Y^n, Z^n) of RVs (X, Y, Z). For the case when X -- Y -- Z form a Markov chain, it was shown in [14], [1] that

S(X^n, Y^n|Z^n) = n I(X ∧ Y|Z) + o(n).

Masahito Hayashi: Graduate School of Mathematics, Nagoya University, Japan, and the Centre for Quantum Technologies, National University of Singapore, Singapore. masahito@math.nagoya-u.ac.jp
Himanshu Tyagi: Information Theory and Applications (ITA) Centre, University of California, San Diego, La Jolla, CA 92093, USA. htyagi@eng.ucsd.edu
Shun Watanabe: Department of Information Science and Intelligent Systems, University of Tokushima, Tokushima, Japan, and the Institute for Systems Research, University of Maryland, College Park, MD 20742, USA. shun-wata@is.tokushima-u.ac.jp
An initial version of this paper was presented at the IEEE International Symposium on Information Theory, Hawaii, USA, 2014.

However, in several applications (see, for instance, [5]) the observed data is not IID, or even if the observations are IID, the observation length n is limited and a more precise asymptotic analysis is needed. In this paper, we address the secret key agreement problem for these two important practical situations. First, when the observations consist of general sources [7] (X_n, Y_n, Z_n) such that X_n -- Y_n -- Z_n is a Markov chain, we show that

S(X_n, Y_n|Z_n) = n I(X ∧ Y|Z) + o(n),

where I(X ∧ Y|Z) is the inf-conditional information rate of X and Y given Z. Next, for the IID case with X -- Y -- Z, we identify the second-order asymptotic term¹ in S(X^n, Y^n|Z^n). Specifically, denoting by S_{ε,δ}(X, Y|Z) the maximum length of a secret key on which the parties agree with probability greater than 1 - ε and with secrecy parameter less than δ, we show that

S_{ε,δ}(X^n, Y^n|Z^n) = n I(X ∧ Y|Z) - √(nV) Q^{-1}(ε + δ) + o(√n),

where Q is the tail probability of the standard Gaussian distribution and

V := Var[ log P_{XY|Z}(X, Y|Z) / ( P_{X|Z}(X|Z) P_{Y|Z}(Y|Z) ) ].

In particular, our bounds allow us to evaluate the gap to secret key capacity at a finite blocklength n. In Figure 1 we illustrate this gap between the maximum rate of a secret key at a fixed n and the secret key capacity for the case where Z is a uniform random bit, Y is obtained by flipping Z with probability 0.25, and X is obtained by flipping Y with probability 0.125; see Example 1 for details. Motivating this work, and underlying our converse proof, is a recently established single-shot upper bound on secret key lengths for the multiparty secret key agreement problem [25] (see, also, [24]). In spirit, this result can be regarded as a multiterminal variant of the meta-converse of Polyanskiy, Poor, and Verdú [17], [16] (see, also, [10], [27]). For the achievability proofs, we present new single-shot secret key agreement protocols. Note that the single-shot protocol suggested in [19] does not yield the claimed asymptotic bounds on secret key lengths. The difficulty lies in the spread of the information spectrum of X: different directions of bounds on P_X are needed for information reconciliation and privacy amplification. To remedy this, in our scheme we slice the information spectrum appropriately before agreeing on a secret key. In fact, we present two different protocols: the first is based on slicing

¹Following the pioneering work of Strassen [21], the study of these second-order terms in coding theorems has been revived recently by Hayashi [8], [9] and Polyanskiy, Poor, and Verdú [17].

the spectrum of P_X and is noninteractive, while the second is based on slicing the spectrum of P_{X|Y} and requires interactive communication.

Fig. 1. Gap to secret key capacity at finite n for ε + δ = 0.01, 0.05, 0.1.

The basic concepts of secret key agreement are given in the next section. In Section III, we review the single-shot upper bound of [25] for the two-party case. Our new secret key agreement protocols are presented in Section IV. The single-shot results are applied to general sources in Section V and to IID sources in Section VI. The final section contains a discussion of the role of interaction in our secret key agreement protocols.

II. SECRET KEYS

We consider the problem of secret key agreement using interactive public communication by two (trusted) parties observing RVs X and Y taking values in countable sets X and Y, respectively. Upon making these observations, the parties communicate interactively over a public communication channel that is accessible by an eavesdropper. We assume that the communication channel is error-free and authenticated. Specifically, the communication is sent over r rounds of interaction. In the jth round of communication, 1 ≤ j ≤ r, each party sends a communication which is a function of its observation, its locally generated randomness² (denoted by U_x and U_y), and the previously observed communication.

²The RVs U_x and U_y are mutually independent and jointly independent of (X, Y).

The overall interactive communication is denoted by F. In addition to F, the eavesdropper observes a RV Z taking values in a countable set Z. The joint distribution P_{XYZ} is known to all parties. Using the interactive communication F and their local observations, the parties agree on a secret key. A RV K constitutes a secret key if the two parties form estimates that agree with K with probability close to 1 and K is concealed, in effect, from an eavesdropper with access to (F, Z). Formally, a RV K with range K constitutes an (ε, δ)-secret key ((ε, δ)-SK) if there exist functions K_x and K_y of (U_x, X, F) and (U_y, Y, F), respectively, such that the following two conditions are satisfied:

P(K_x = K_y = K) ≥ 1 - ε,   (1)

‖P_{KFZ} - P_unif × P_{FZ}‖ ≤ δ,   (2)

where P_unif is the uniform distribution on K and

‖P - Q‖ = (1/2) Σ_u |P(u) - Q(u)|.

The first condition above represents reliable recovery of the secret key and the second condition guarantees secrecy.

Definition 1. Given 0 ≤ ε, δ ≤ 1, the supremum over the lengths log |K| of an (ε, δ)-SK is denoted by S_{ε,δ}(X, Y|Z).

Remark 1. Note that K_1 = constant constitutes a (0, 1)-SK, and K_2 ~ P_unif(K), generated independently of all other RVs for an arbitrary K, constitutes a (1, 0)-SK. For ε + δ ≥ 1, the RV K which equals K_1 with probability (1 - ε) and K_2 with probability ε constitutes an (ε, 1 - ε)-SK, and therefore, also an (ε, δ)-SK. Since the (1, 0)-SK K_2 can be of arbitrarily long length, S_{ε,δ}(X, Y|Z) = ∞. Thus, the only case of interest is ε + δ < 1.

Remark 2. A more restrictive secrecy requirement enforces one of the estimates K_x or K_y itself to be secure, i.e.,

‖P_{K_xFZ} - P_unif × P_{FZ}‖ ≤ δ or ‖P_{K_yFZ} - P_unif × P_{FZ}‖ ≤ δ.

While the two definitions are closely related, there is a subtle difference. We shall elaborate more on this difference in Section IV. In the following two sections we shall present upper and lower bounds on S_{ε,δ}(X, Y|Z).
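The secrecy criterion (2) is a variational distance between the true joint distribution of the key and the eavesdropper's view, and an ideal one in which the key is uniform and independent of that view. For concreteness, a minimal sketch that evaluates (2) for a toy joint pmf; the distribution below is hypothetical, chosen only to exercise the formula:

```python
from itertools import product

# Hypothetical joint pmf P(k, f, z) over a 2-bit key k, 1-bit transcript f,
# and 1-bit eavesdropper observation z: a key that leaks slightly through (f, z).
P = {}
for k, f, z in product(range(4), range(2), range(2)):
    P[(k, f, z)] = (1.0 + 0.1 * ((k + f + z) % 2)) / (4 * 2 * 2 * 1.05)
total = sum(P.values())
P = {kfz: p / total for kfz, p in P.items()}  # normalize defensively

# Marginal P_FZ of the public transcript and side information.
P_fz = {}
for (k, f, z), p in P.items():
    P_fz[(f, z)] = P_fz.get((f, z), 0.0) + p

K_size = 4
# Secrecy parameter: (1/2) * sum |P(k,f,z) - P_unif(k) * P_fz(f,z)|, as in (2).
delta = 0.5 * sum(abs(p - P_fz[(f, z)] / K_size) for (k, f, z), p in P.items())
print(round(delta, 4))
```

For this toy distribution the leakage is small but nonzero, so the key is a (0, δ)-SK for the printed δ only.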
Subsequently, for the case when X -- Y -- Z form a Markov chain, these bounds will be used to establish the secret key capacity for general sources and the second-order asymptotics of secret key rates when the underlying

observations (X, Y, Z) = (X^n, Y^n, Z^n) are IID.

III. UPPER BOUND ON S_{ε,δ}(X, Y|Z)

We recall the conditional independence testing upper bound on S_{ε,δ}(X, Y|Z), which was established recently in [25], [26]. In fact, the general upper bound in [25], [26] is a single-shot upper bound on the secret key length for a multiparty secret key agreement problem. Here, we recall a specialization of the general result to the case at hand. In order to state our result, we need the following concept from binary hypothesis testing. Consider a binary hypothesis testing problem with null hypothesis P and alternative hypothesis Q, where P and Q are distributions on the same alphabet V. Upon observing a value v ∈ V, the observer needs to decide if the value was generated by the distribution P or the distribution Q. To this end, the observer applies a stochastic test T, which is a conditional distribution on {0, 1} given an observation v ∈ V. When v ∈ V is observed, the test T chooses the null hypothesis with probability T(0|v) and the alternative hypothesis with probability T(1|v) = 1 - T(0|v). For 0 ≤ ε < 1, denote by β_ε(P, Q) the infimum of the probability of error of type II given that the probability of error of type I is less than ε, i.e.,

β_ε(P, Q) := inf_{T : P[T] ≥ 1-ε} Q[T],   (3)

where

P[T] = Σ_v P(v) T(0|v),  Q[T] = Σ_v Q(v) T(0|v).

Our upper bound applies to an alternative definition of secret key, which conveniently combines the recovery and secrecy conditions; the next result relates our original definition to the alternative definition.

Lemma 1. Given 0 ≤ ε, δ ≤ 1 and an (ε, δ)-SK K, the local estimates K_x and K_y satisfy

‖P_{K_xK_yFZ} - P_unif^{(2)} × P_{FZ}‖ ≤ ε + δ,   (4)

where, for a pmf P on X, P^{(m)} denotes its extension to X^m given by

P^{(m)}(x_1, ..., x_m) = P(x_1) 1(x_1 = ... = x_m),  (x_1, ..., x_m) ∈ X^m.

Conversely, if K_x and K_y satisfy (4) with ε in place of ε + δ, either K_x or K_y constitutes an (ε, ε)-SK.
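For finite alphabets, the quantity β_ε(P, Q) in (3) is attained, by the Neyman-Pearson lemma, by a likelihood-ratio test that randomizes on the boundary outcome. A minimal sketch of its direct computation (illustrative code, not from the paper):

```python
def beta(eps, P, Q):
    """beta_eps(P, Q) of (3): the least type-II error Q[T] over stochastic
    tests T whose null-acceptance probability under P is at least 1 - eps.
    P and Q are pmfs given as dicts over a common finite alphabet."""
    # Neyman-Pearson: accept outcomes in order of decreasing likelihood
    # ratio P(v)/Q(v), randomizing on the boundary outcome so that the
    # accepted P-mass is exactly 1 - eps.
    def ratio(v):
        return P[v] / Q[v] if Q[v] > 0 else float("inf")
    budget = 1.0 - eps  # P-mass still to be accepted
    type2 = 0.0         # accumulated Q-mass of the acceptance region
    for v in sorted(P, key=ratio, reverse=True):
        if budget <= 1e-12:
            break
        if P[v] == 0:
            continue  # accepting a P-null outcome only hurts the type-II error
        t = min(1.0, budget / P[v])  # acceptance probability T(0|v)
        type2 += t * Q[v]
        budget -= t * P[v]
    return type2

# Sanity checks: identical hypotheses are indistinguishable, so beta = 1 - eps;
# well-separated hypotheses give a small type-II error.
P = {"a": 0.9, "b": 0.1}
Q = {"a": 0.1, "b": 0.9}
assert abs(beta(0.25, P, P) - 0.75) < 1e-9
assert abs(beta(0.1, P, Q) - 0.1) < 1e-9
```

Plugging such a routine into Theorem 2 below gives a numerical single-shot upper bound on the key length for small alphabets.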

Proof. For an (ε, δ)-SK K,

‖P_{K_xK_yFZ} - P_unif^{(2)} × P_{FZ}‖
≤ ‖P_{KK_xK_yFZ} - P_unif^{(3)} × P_{FZ}‖
≤ ‖P_{KK_xK_yFZ} - P_{K|FZ}^{(3)} × P_{FZ}‖ + ‖P_{K|FZ}^{(3)} × P_{FZ} - P_unif^{(3)} × P_{FZ}‖.

Since

‖P - Q‖ ≥ P({x : P(x) > Q(x)}) - Q({x : P(x) > Q(x)}),

the first term on the right-side above satisfies

‖P_{KK_xK_yFZ} - P_{K|FZ}^{(3)} × P_{FZ}‖ = 1 - P(K = K_x = K_y) ≤ ε,   (5)

and the second term equals

‖P_{K|FZ} × P_{FZ} - P_unif × P_{FZ}‖ ≤ δ,

which gives the claimed bound. For the second claim, ε-secrecy is immediate and ε-recoverability follows in a similar manner as (5).

The following upper bound was established in [25], [26] under the alternative definition of secrecy (4); below we note its consequence for our case.

Theorem 2 (Conditional independence testing bound). Given 0 ≤ ε + δ < 1 and 0 < η < 1 - ε - δ, the following bound holds:

S_{ε,δ}(X, Y|Z) ≤ -log β_{ε+δ+η}( P_{XYZ}, Q_{X|Z} × Q_{Y|Z} × Q_Z ) + 2 log(1/η),

for all joint distributions Q on X × Y × Z that render X and Y conditionally independent given Z.

IV. PROTOCOLS FOR SECRET KEY AGREEMENT

In this section we present our secret key agreement protocols, which will be used in all the achievability results of this paper. A typical secret key agreement protocol for two parties entails sharing the observations of one of the parties, referred to as information reconciliation, and then extracting a secret key out of the shared observations, referred to as privacy amplification (cf. [14], [1], [18]). In another interpretation, the parties first communicate to establish common randomness [2] and then extract a

secret key from the common randomness³. Our protocols below, too, have these two components, but the rate of the communication for information reconciliation and the rate of the randomness extracted by privacy amplification have to be chosen carefully.

In Remark 1, for the trivial case ε + δ ≥ 1 we exhibited a high reliability (0, 1)-SK and a high secrecy (1, 0)-SK. The two constructions together sufficed to characterize S_{ε,δ}(X, Y|Z). Following a similar approach for the regime ε + δ < 1, we can construct a high reliability (ε + δ, 0)-SK and a high secrecy (0, ε + δ)-SK and randomize over the two keys with probabilities ε/(ε + δ) and δ/(ε + δ) to obtain an (ε, δ)-SK. The result below shows that this is not needed and a high reliability construction will suffice.

Proposition 3. Given an (ε, δ)-SK, there exists an (ε + δ, 0)-SK.

Remark 3. Note that the proposition does not hold for the restrictive secrecy definition of Remark 2. Indeed, when X = Y = Z, the RV K = X constitutes a (0, 1)-SK under the secrecy definition of Remark 2, but the only possible (1, 0)-SK is K = constant.

Proof. Let K be an (ε, δ)-SK using interactive communication F, with local estimates K_x and K_y. For each fixed realization of (F, Z), let P_{KK'|F,Z} be the maximal coupling of P_{K|F,Z} and P_unif. Then P_{K'|FZ} = P_unif, and since K is an (ε, δ)-SK, we get by the maximal coupling property (cf. [22]) that

P(K ≠ K') = ‖P_{KFZ} - P_unif × P_{FZ}‖ ≤ δ,

which further gives

P(K_x = K_y = K') ≥ 1 - ε - δ.

Thus, K' constitutes an (ε + δ, 0)-SK.

Therefore, it suffices to construct a (0, ε + δ)-SK, since then by Proposition 3 an (ε + δ, 0)-SK exists, and we obtain an (ε, δ)-SK by randomizing over these two secret keys. Nevertheless, for the special case of constant Z, we present both a high reliability secret key agreement protocol with almost perfect recovery and a high secrecy protocol with almost perfect secrecy. By randomizing over these two protocols, an (ε, δ)-SK can be generated even under the restrictive secrecy of Remark 2; for details, see Remark 5. Further, we extend the high reliability protocol to an arbitrary Z. We apply these constructions in subsequent sections to derive asymptotically tight bounds on secret key length. A common feature of both protocols is their reliance on spectrum slicing, a technique introduced in [7] to mitigate rate

³For an interpretation of secret key agreement in terms of common randomness decomposition, see [3], [4], [23].

losses due to the spread of the information spectrum. However, the exact choice of the spectrum to slice is different for the two protocols. Before describing our protocols, we recall two well-known results that will be used in our analysis. The first result is a version of the distributed source coding theorem for two sources [15], [7, Lemma 7.2.1] (see, also, [13]).

Lemma 4 (Slepian-Wolf coding). Given two distributions P_XY and Q_XY on X × Y, for every γ > 0 there exists a code (e, d) of size M, with encoder e : X → {1, ..., M} and decoder d : {1, ..., M} × Y → X, such that

P_XY({(x, y) : x ≠ d(e(x), y)}) ≤ P_XY({(x, y) : -log Q_{X|Y}(x|y) ≥ log M - γ}) + 2^{-γ}.

For privacy amplification, we will rely on the leftover hash lemma [12], [19]. Let F be a 2-universal family of mappings f : X → K, i.e., for each x ≠ x', the family F satisfies

(1/|F|) Σ_{f∈F} 1(f(x) = f(x')) ≤ 1/|K|.

Lemma 5 (Leftover hash). Consider RVs X, Z, and V taking values in X, Z, and V, respectively, where X and Z are countable and V is finite. Let S be a random seed such that f_S is uniformly distributed over a 2-universal family F as above. Then, for K = f_S(X) and for any Q_Z satisfying supp(P_Z) ⊆ supp(Q_Z), we have

‖P_{KVZS} - P_unif × P_{VZS}‖ ≤ (1/2) √( |K| |V| 2^{-H_min(P_XZ | Q_Z)} ),

where P_unif is the uniform distribution on K and

H_min(P_XZ | Q_Z) = -log sup_{x,z : Q_Z(z)>0} P_XZ(x, z) / Q_Z(z).

A. Protocols for Z = constant

We start with protocols for the special case when Z = constant, which was addressed partly in [11]. The dilemma in choosing which spectrum to slice is apparent from Lemmas 4 and 5: while the former calls for the slicing of the spectrum of P_X, the latter requires the slicing of the spectrum of P_{X|Y}. Below we present two different protocols, each based on spectrum slicing applied to the spectrum of either P_X or of P_{X|Y}; the former yields almost perfect secrecy while compromising recovery, while the latter yields almost perfect recovery while compromising secrecy.
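A standard instance of a 2-universal family (a textbook construction, not specific to this paper) is the Carter-Wegman affine family f_{a,b}(x) = ((a·x + b) mod p) mod |K| over a prime p larger than the alphabet of X, with the seed S = (a, b) drawn uniformly. A minimal privacy amplification sketch with this family; all sizes are illustrative:

```python
import random

def sample_hash(p, key_size):
    """Draw f_{a,b}(x) = ((a*x + b) % p) % key_size from the Carter-Wegman
    2-universal family, p prime and larger than the alphabet of X
    (collision probability <= 1/key_size for x != x')."""
    a = random.randrange(1, p)
    b = random.randrange(0, p)
    return lambda x: ((a * x + b) % p) % key_size

# Privacy amplification sketch; parameter values are illustrative.
p = 1_000_003          # a prime exceeding the alphabet size of X
key_size = 256         # |K|: extract an 8-bit key
f_S = sample_hash(p, key_size)  # the public seed S is the pair (a, b)
x = 123_456            # reconciled observation held by both parties
key = f_S(x)
assert 0 <= key < key_size
```

Lemma 5 then bounds the distance of the extracted key from uniform in terms of the min-entropy of the source given the eavesdropper's observation.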

1) The high secrecy protocol: Our first protocol is based on slicing the spectrum of P_X and is described in the proof of the next result, which characterizes the performance of the protocol. Denote by i_XY(x, y) the information density

i_XY(x, y) := log P_XY(x, y) / ( P_X(x) P_Y(y) ).   (6)

Theorem 6. For λ_min, λ_max, Δ > 0 with λ_max > λ_min, let

L = (λ_max - λ_min)/Δ.

Then, for every γ > 0 and 0 ≤ λ ≤ λ_min, there exists an (ε, δ)-SK K taking values in K with

ε ≤ P(i_XY(X, Y) ≤ λ + γ + Δ) + P_X(X_0) + 2^{-γ} + 1/L,
δ ≤ (1/2) √( |K| 2^{-(λ - 2 log L)} ),

where

X_0 := {x : -log P_X(x) ≥ λ_max or -log P_X(x) < λ_min}.

Proof. We first slice the spectrum of P_X into L + 1 parts. Specifically, for 1 ≤ i ≤ L, let λ_i = λ_min + (i - 1)Δ and define

X_i := {x : λ_i ≤ -log P_X(x) < λ_i + Δ},

with X_0 as above, and let I be the index such that X ∈ X_I. We divide the indices 0 ≤ i ≤ L into good indices I_g and bad indices I_b = I_g^c, where

I_g = {i : i > 0 and P_I(i) ≥ 1/L²}.

Denote by P_i the conditional distribution of X, Y given I = i, i.e.,

P_i(x, y) = P_XY(x, y) 1(x ∈ X_i) / P_X(X_i),  y ∈ Y, 1 ≤ i ≤ L.

Let (e_i, d_i) be the Slepian-Wolf code of Lemma 4 for P_XY = P_i and Q_XY = P_XY. Our secret key agreement protocol is as follows:

Protocol A1. The party observing X finds the index I ∈ {0, 1, ..., L} such that X ∈ X_I and sends it to the other party. If I ∈ I_g, the first party further sends e_I(X) and the second party decodes X as X̂ = d_I(e_I(X), Y). The first party generates the secret key K as

K = K_S(X) if I ∈ I_g, and K ~ unif(K) otherwise,

where K_S is a randomly chosen member of a 2-universal family of mappings on X of range size |K|. The second party does the same with X̂ in place of X. In effect, if I ∈ I_b, the protocol declares an error.

To bound the error in information reconciliation, note that for all i ∈ I_g, by Lemma 4 with P_XY = P_i and Q_XY = P_XY,

P_i({(x, y) : x ≠ d_i(e_i(x), y)})
≤ 2^{-γ} + P_i({(x, y) : -log P_{X|Y}(x|y) ≥ log M_i - γ})
= 2^{-γ} + P_i({(x, y) : -log P_X(x) - i_XY(x, y) ≥ log M_i - γ})
≤ 2^{-γ} + P_i({(x, y) : λ_i + Δ - i_XY(x, y) ≥ log M_i - γ}),

where the previous inequality uses the definition of P_i. On choosing log M_i = λ_i - λ, we get

P_i({(x, y) : x ≠ d_i(e_i(x), y)}) ≤ P_i({(x, y) : i_XY(x, y) ≤ λ + γ + Δ}) + 2^{-γ}.

An error in information reconciliation occurs if either I ∉ I_g, or if I ∈ I_g and X ≠ d_I(e_I(X), Y). From the bound above,

ε ≤ P_I(I_b) + 2^{-γ} + Σ_{i∈I_g} P_I(i) P_i({(x, y) : i_XY(x, y) ≤ λ + γ + Δ})
≤ P_I(I_b) + 2^{-γ} + P_XY({(x, y) : i_XY(x, y) ≤ λ + γ + Δ}),

which, using

P_I(I_b) = Σ_{i∈I_b} P_I(i) ≤ P_I(0) + 1/L,

gives

ε ≤ P_X(X_0) + 2^{-γ} + 1/L + P_XY({(x, y) : i_XY(x, y) ≤ λ + γ + Δ}),

proving the reliability bound of the theorem.

Next, denoting by P_{i,X} the marginal on X induced by P_i, note that for each i ∈ I_g

-log P_{i,X}(x) = -log [ P_X(x) / P_X(X_i) ] ≥ λ_i - 2 log L,

where the last inequality uses the definitions of X_i and I_g. It follows that

H_min(P_{i,X}) ≥ λ_i - 2 log L.

Therefore, for each i ∈ I_g an application of Lemma 5 implies that for a RV X_i with distribution P_{i,X},

‖P_{K e_i(X_i)} - P_unif × P_{e_i(X_i)}‖ ≤ (1/2) √( |K| M_i 2^{-H_min(P_{i,X})} ) ≤ (1/2) √( |K| 2^{-(λ - 2 log L)} ),  i ∈ I_g,

and so, denoting the overall communication (I, e_I(X)) by F, we get

‖P_{KF} - P_unif × P_F‖ = Σ_{i∉I_g} P_I(i) · 0 + Σ_{i∈I_g} P_I(i) ‖P_{K e_i(X_i)} - P_unif × P_{e_i(X_i)}‖
≤ (1/2) √( |K| 2^{-(λ - 2 log L)} ),

which proves the secrecy bound claimed in the theorem.

While any high reliability protocol can be converted into a high secrecy protocol using Proposition 3, it is unclear if a high secrecy protocol can be converted to a high reliability protocol in general. However, the high secrecy Protocol A1 can be converted into a high reliability protocol as follows: the second party, upon decoding X, computes the indicator of the error event

E_I = { -log P_{X|Y}(X|Y) ≥ log M_I - γ }

and sends it back to the first party. If E_I does not occur, the secret key K is as in the protocol above; otherwise, K is chosen to be a constant. For this modified secret key, the event E_I is accounted for in the secrecy parameter δ and not in ε as earlier. Thus, the modified secret key has high reliability. In the next section we present an alternative high reliability protocol which, while requiring more feedback, extends to a

general Z.

2) The high reliability protocol: The bottleneck in the performance of the protocol of Theorem 6 is its reliability parameter ε, which is limited by the probability P(i_XY(X, Y) ≤ λ + γ + Δ), i.e., the rate λ of the secret key must be (approximately) a lower bound on i_XY(X, Y) with probability greater than 1 - ε. Our next protocol, which slices the spectrum of P_{X|Y}, shifts this bottleneck to the secrecy parameter δ. Interestingly, unlike the previous protocol, the next one requires communication from both parties and is interactive. In particular, the party observing X starts by transmitting at a low rate and then gradually increases the rate until the other party reports a correct decoding.

Theorem 7. For λ_min, λ_max, Δ > 0 with λ_max > λ_min, let

L = (λ_max - λ_min)/Δ.

Then, for every γ > 0 and λ ≥ 0, there exists an (ε, δ)-SK K taking values in K with

ε ≤ P_XY(T_0) + L 2^{-γ},
δ ≤ P(i_XY(X, Y) ≤ λ + Δ) + (1/2) √( |K| 2^{-(λ - γ - 3 log L)} ) + 1/L + P_XY(T_0) + L 2^{-γ},

where

T_0 := {(x, y) : -log P_{X|Y}(x|y) ≥ λ_max or -log P_{X|Y}(x|y) < λ_min}.

Proof. For 1 ≤ i ≤ L, the ith slice of the spectrum of P_{X|Y} is defined as follows:

T_i = {(x, y) : λ_i ≤ -log P_{X|Y}(x|y) < λ_i + Δ},

where λ_i = λ_min + (i - 1)Δ. For each 1 ≤ i ≤ L, we have |{x : (x, y) ∈ T_i}| ≤ 2^{λ_i + Δ} for every y. For information reconciliation, randomly bin the elements of X successively into bins of sizes M_1, ..., M_L, where

log M_i = λ_1 + Δ + γ for i = 1, and log M_i = Δ for 1 < i ≤ L.
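The bin sizes are chosen so that after i rounds the cumulative communication rate log(M_1 ... M_i) equals λ_i + Δ + γ, i.e., just enough to resolve the ith conditional-probability level. A small sketch checking this bookkeeping; the parameter values are illustrative:

```python
# Illustrative parameters (in bits): L slices of width Delta above lam_min.
lam_min, Delta, gamma, L = 10.0, 0.5, 2.0, 8

lam = [lam_min + (i - 1) * Delta for i in range(1, L + 1)]  # lam_i, 1 <= i <= L
log_M = [lam[0] + Delta + gamma] + [Delta] * (L - 1)        # per-round bin rates

# After i rounds the cumulative rate is lam_i + Delta + gamma, as used in the
# error analysis below.
for i in range(1, L + 1):
    cumulative = sum(log_M[:i])
    assert abs(cumulative - (lam[i - 1] + Delta + gamma)) < 1e-9
print("cumulative rate after round i equals lam_i + Delta + gamma")
```

The first round thus carries almost all of the communication, and each later round adds only the slice width Δ.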

We now describe the information reconciliation stage of our protocol.

Protocol A2: Information reconciliation stage. The protocol can have up to L rounds of communication. At the ith round, the first party sends B_i = F_{1i}(X), the random bin of X among M_i bins. If there exists a unique x̂ such that (x̂, Y) ∈ T_i and F_{1j}(x̂) = B_j for 1 ≤ j ≤ i, the second party decodes X as X̂ = x̂ and sends back an ACK F_{2i} = 1 to the first party; otherwise it sends back a NACK F_{2i} = 0. The protocol stops if either F_{2i} = 1 or i = L. Otherwise, the (i + 1)th round begins.

Let ρ(X, Y) denote the number of rounds after which the protocol stops when the observations are (X, Y). An error occurs if (X, Y) ∈ T_0, or if there exists an x̂ ≠ X such that (x̂, Y) ∈ T_i and F_{1j}(X) = F_{1j}(x̂) for 1 ≤ j ≤ i, for some i ≤ ρ(X, Y). Therefore, the probability of error is bounded above as

P_e ≤ P_XY(T_0) + Σ_{x,y} P_XY(x, y) Σ_{i=1}^{ρ(x,y)} Σ_{x̂≠x} 1((x̂, y) ∈ T_i) P(F_{1j}(x) = F_{1j}(x̂), 1 ≤ j ≤ i)
= P_XY(T_0) + Σ_{x,y} P_XY(x, y) Σ_{i=1}^{ρ(x,y)} Σ_{x̂≠x} 1((x̂, y) ∈ T_i) (1 / (M_1 ... M_i))
≤ P_XY(T_0) + Σ_{x,y} P_XY(x, y) Σ_{i=1}^{ρ(x,y)} 2^{-λ_i - Δ - γ} |{x̂ : (x̂, y) ∈ T_i}|
≤ P_XY(T_0) + L 2^{-γ},

where we have used the fact that

log(M_1 ... M_i) = λ_1 + iΔ + γ = λ_i + Δ + γ,

which proves the reliability of the scheme.

The next step is the generation of the secret key K, which depends on ρ(X, Y), the number of rounds taken for information reconciliation. Note that if there is no error in the information reconciliation step above, then (X, Y) ∈ T_{ρ(X,Y)}. Denote by E_1 the set of (x, y) for which an error occurs in information reconciliation, and by E_2 the set {(x, y) : i_XY(x, y) ≤ λ + Δ}. Define the RV I taking values in the set {0, 1, ..., L} as follows:

I = l(X, Y) = 0 if (X, Y) ∈ T_0 ∪ E_1 ∪ E_2,

and for 1 ≤ i ≤ L,

I = l(X, Y) = i if (X, Y) ∈ T_i ∩ E_1^c ∩ E_2^c.

Denote by P_i the probability distribution of X, Y given I = i, i.e.,

P_i(x, y) = P_XY(x, y) 1(l(x, y) = i) / P(l(X, Y) = i),  x ∈ X, y ∈ Y, 0 ≤ i ≤ L.

As before, we divide the indices 0 ≤ i ≤ L into good indices I_g and bad indices I_b = I_g^c by defining

I_g = {i : i > 0 and P_I(i) ≥ 1/L²}.

Clearly, the size of the overall communication sent in the information reconciliation stage up to the ith round, i.e., F^i := {(F_{1j}, F_{2j}), 1 ≤ j ≤ i}, satisfies

log |F^i| ≤ λ_i + Δ + γ + log i,

where |F^i| denotes the number of values taken by the RV F^i. We now describe the privacy amplification stage of the protocol.

Protocol A2: Privacy amplification stage. Let K_S be a randomly chosen member of a 2-universal family of mappings on X of range size |K|. The secret key K is generated by applying K_S to X at the first party and to X̂ at the second party.

We proceed to the secrecy analysis of this protocol. With P_{i,X} denoting the marginal on X induced by P_i, for all i ∈ I_g we have

-log P_{i,X}(x) = -log Σ_y P_XY(x, y) 1(l(x, y) = i) / P(l(X, Y) = i)
≥ -log Σ_y 2^{-λ_i} P_Y(y) 1(l(x, y) = i) / P(l(X, Y) = i)
≥ -log Σ_{y : l(x,y)=i} 2^{-λ_i - λ - Δ} P_{Y|X}(y|x) / P(l(X, Y) = i)
≥ λ_i + λ + Δ - 2 log L,

where the first inequality holds since l(x, y) = i implies (x, y) ∈ T_i, and the second holds since l(x, y) = i implies (x, y) ∈ E_2^c. Equivalently,

H_min(P_{i,X}) ≥ λ_i + λ + Δ - 2 log L.

Lemma 5 implies that for a RV X_i with distribution P_{i,X} and P_{F^i} denoting the distribution of the

communication F^i given I = i, we get

‖P_{K(X_i) F^i} - P_unif × P_{F^i}‖ ≤ (1/2) √( |K| |F^i| 2^{-H_min(P_{i,X})} ) ≤ (1/2) √( |K| 2^{-(λ - γ - 3 log L)} ),  i ∈ I_g.

Therefore, for K = K(X) and the overall communication F = F^{ρ(X,Y)}, we have

‖P_{KF} - P_unif × P_F‖ ≤ ‖P_{KFI} - P_unif × P_{FI}‖
≤ P(I ∈ I_b) + Σ_{i∈I_g} P(I = i) ‖P_{K(X_i) F^i} - P_unif × P_{F^i}‖
≤ P_XY(T_0 ∪ E_1) + P_XY(E_2) + 1/L + Σ_{i∈I_g} P(I = i) ‖P_{K(X_i) F^i} - P_unif × P_{F^i}‖
≤ P_XY(T_0 ∪ E_1) + P_XY(E_2) + 1/L + (1/2) √( |K| 2^{-(λ - γ - 3 log L)} ),

which gives the claimed secrecy bound by using the definition of E_2 and bounding P_XY(T_0 ∪ E_1) as in the error analysis for information reconciliation.

B. High reliability protocol for general Z

We next exhibit a high reliability protocol for a general Z. In fact, we use Protocol A2 of Theorem 7, with a modified analysis of secrecy.

Theorem 8. For λ_min, λ_max, Δ > 0 with λ_max > λ_min, let

L = (λ_max - λ_min)/Δ.

Then, for every γ > 0 and λ ≥ 0, there exists an (ε, δ)-SK K taking values in K with

ε ≤ P_XY(T_0) + L 2^{-γ},   (7)
δ ≤ P(i_XY(X, Y) - i_XZ(X, Z) ≤ λ + Δ) + (1/2) √( |K| 2^{-(λ - γ - 3 log L)} ) + 1/L + P_XY(T_0) + L 2^{-γ},   (8)

where

T_0 := {(x, y) : -log P_{X|Y}(x|y) ≥ λ_max or -log P_{X|Y}(x|y) < λ_min}.

Proof. The proof of Theorem 7 guarantees the reliability of Protocol A2. For secrecy, define E_1 in the same manner as before, i.e., the set of (x, y) for which an error occurs in information reconciliation.

Define the set E_2 as

E_2 := {(x, y, z) : i_XY(x, y) - i_XZ(x, z) ≤ λ + Δ},

which is the same as

{(x, y, z) : log (1 / P_{X|Z}(x|z)) - log (1 / P_{X|Y}(x|y)) ≤ λ + Δ}.

Let the RV I taking values in the set {0, 1, ..., L} be defined as follows:

I = l(X, Y, Z) = 0 if (X, Y) ∈ T_0 ∪ E_1 or (X, Y, Z) ∈ E_2,

and for 1 ≤ i ≤ L,

I = l(X, Y, Z) = i if (X, Y) ∈ T_i ∩ E_1^c and (X, Y, Z) ∈ E_2^c.

Let P_i be the probability distribution of X, Y, Z given I = i, i.e.,

P_i(x, y, z) = P_XYZ(x, y, z) 1(l(x, y, z) = i) / P(l(X, Y, Z) = i),  x ∈ X, y ∈ Y, z ∈ Z, 0 ≤ i ≤ L.

Let I_g be the set of good indices as in the proof of Theorem 7, and let I_b = I_g^c. With P_{i,XZ} denoting the marginal on X × Z induced by P_i, for all i ∈ I_g we have

-log [ P_{i,XZ}(x, z) / P_Z(z) ] = -log Σ_y P_XYZ(x, y, z) 1(l(x, y, z) = i) / ( P_Z(z) P(l(X, Y, Z) = i) )
≥ -log Σ_y 2^{-λ-Δ} P_{X|Y}(x|y) P_{Y|XZ}(y|x, z) 1(l(x, y, z) = i) / P(l(X, Y, Z) = i)
≥ -log Σ_y 2^{-λ_i - λ - Δ} P_{Y|XZ}(y|x, z) 1(l(x, y, z) = i) / P(l(X, Y, Z) = i)
≥ λ_i + λ + Δ - 2 log L,

where the first inequality holds since l(x, y, z) > 0 implies (x, y, z) ∈ E_2^c, and the second inequality holds since l(x, y, z) = i implies (x, y) ∈ T_i. This is equivalent to

H_min(P_{i,XZ} | P_Z) ≥ λ_i + λ + Δ - 2 log L.

Thus, Lemma 5 and manipulations similar to those in the proof of Theorem 7 lead to the desired bound on secrecy.
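The conditional min-entropy appearing in these bounds is, by the definition in Lemma 5, a single maximum over the support, so it is trivial to evaluate for finite pmfs. A minimal sketch with hypothetical toy distributions:

```python
from math import log2

def h_min(P_xz, Q_z):
    """H_min(P_XZ | Q_Z) = -log2 max_{x,z : Q_z(z)>0} P_XZ(x,z)/Q_z(z),
    as defined in Lemma 5. P_xz maps (x, z) pairs to probabilities."""
    return -log2(max(p / Q_z[z] for (x, z), p in P_xz.items() if Q_z[z] > 0))

# Toy example (hypothetical numbers): X uniform on {0,1}, Z agrees with X
# with probability 0.9, and Q_Z taken to be the true marginal of Z.
P_xz = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
Q_z = {0: 0.5, 1: 0.5}
assert abs(h_min(P_xz, Q_z) + log2(0.9)) < 1e-12  # equals -log2(0.45/0.5)
```

The smaller this quantity, the shorter the key that Lemma 5 permits extracting; here the eavesdropper's strong correlation with X leaves very little extractable randomness.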

Remark 4. In order to get the best possible bound for S_{ε,δ}(X, Y|Z), we optimize the bound in Theorem 8 over the parameter a = L 2^{-γ}. Specifically, time-sharing between the protocol in Theorem 8 and the corresponding high secrecy protocol obtained using Proposition 3 yields an (ε, δ)-SK K such that

ε + δ ≤ 2 P_XY(T_0) + 2 L 2^{-γ} + 1/L + (1/2) √( |K| 2^{-(λ - γ - 3 log L)} ) + P(i_XY(X, Y) - i_XZ(X, Z) ≤ λ + Δ).

Note that the function f(a) = 2a + (1/2) a^{-1/2} √( L^4 |K| 2^{-λ} ) has minimum value (3/2) ( L^4 |K| 2^{-λ} )^{1/3}. Therefore, there exists a secret key K such that

ε + δ ≤ 2 P_XY(T_0) + 1/L + (3/2) ( L^4 |K| 2^{-λ} )^{1/3} + P(i_XY(X, Y) - i_XZ(X, Z) ≤ λ + Δ).   (9)

V. SECRET KEY CAPACITY FOR GENERAL SOURCES

In this section we will establish the secret key capacity for a sequence of general sources (X_n, Y_n, Z_n) with joint distributions⁴ P_{X_nY_nZ_n}. The secret key capacity for general sources is defined as follows [14], [1], [3].

Definition 2. The secret key capacity C is defined as

C := sup_{ε_n,δ_n} liminf_{n→∞} (1/n) S_{ε_n,δ_n}(X_n, Y_n|Z_n),

where the sup is over all ε_n, δ_n ≥ 0 such that

lim_{n→∞} (ε_n + δ_n) = 0.

To state our result, we need the following concepts from the information spectrum method; see [7] for a detailed account. For RVs (X_n, Y_n, Z_n), n ≥ 1, the inf-conditional entropy rate H(X|Y) and the sup-conditional entropy rate H̄(X|Y) are defined as follows:

H(X|Y) = sup { α : lim_n P( -(1/n) log P_{X_n|Y_n}(X_n|Y_n) < α ) = 0 },
H̄(X|Y) = inf { α : lim_n P( -(1/n) log P_{X_n|Y_n}(X_n|Y_n) > α ) = 0 }.

⁴The distributions P_{X_nY_nZ_n} need not satisfy the consistency conditions.
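The closed-form minimum in Remark 4 is elementary calculus (set f'(a) = 0 and solve for a); it can be sanity-checked numerically. In the sketch below, C stands for √(L^4 |K| 2^{-λ}) and its value is arbitrary, chosen only for the check:

```python
# Verify that min_a f(a), with f(a) = 2a + (1/2) a^{-1/2} C and C > 0,
# equals the closed form (3/2) C^{2/3} claimed in Remark 4.
C = 0.37  # stands in for sqrt(L^4 |K| 2^{-lambda}); arbitrary positive value

def f(a):
    return 2 * a + 0.5 * C / a ** 0.5

grid_min = min(f(k / 100000.0) for k in range(1, 200000))  # brute-force search
closed_form = 1.5 * C ** (2.0 / 3.0)
assert abs(grid_min - closed_form) < 1e-4
```

The optimizing choice is a* = (C/8)^{2/3}, i.e., γ should be tuned so that the binning-error term L 2^{-γ} and the leftover-hash term are balanced.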

Similarly, the inf-conditional information rate I(X ∧ Y|Z) is defined as

I(X ∧ Y|Z) = sup { α : lim_n P( (1/n) i(X_n, Y_n|Z_n) < α ) = 0 },

where, with a slight abuse of notation, i(X_n, Y_n|Z_n) denotes the conditional information density

i(X_n, Y_n|Z_n) = log P_{X_nY_n|Z_n}(X_n, Y_n|Z_n) / ( P_{X_n|Z_n}(X_n|Z_n) P_{Y_n|Z_n}(Y_n|Z_n) ).

We also need the following result credited to Verdú.

Lemma 9 ([7, Theorem 4.1.1]). For every ε_n such that

lim_{n→∞} ε_n = 0,

it holds that

liminf_{n→∞} -(1/n) log β_{ε_n}( P_{X_nY_nZ_n}, P_{X_n|Z_n} × P_{Y_n|Z_n} × P_{Z_n} ) ≤ I(X ∧ Y|Z),

where β_{ε_n} is defined in (3).

Our result below characterizes the secret key capacity C for general sources for the special case when X_n -- Y_n -- Z_n is a Markov chain.

Theorem 10. For a sequence of sources {(X_n, Y_n, Z_n)}, n ≥ 1, such that X_n -- Y_n -- Z_n form a Markov chain for all n, the secret key capacity C is given by

C = I(X ∧ Y|Z).

Proof. Applying Theorem 2 with η = η_n = n^{-1}, along with Lemma 9, gives C ≤ I(X ∧ Y|Z). For the other direction, we construct a sequence of (ε_n, δ_n)-SKs K = K_n with ε_n, δ_n → 0 and rate

approximately I(X ∧ Y|Z). Indeed, in Theorem 8 choose

λ_max = n ( H̄(X|Y) + Δ ),
λ_min = n ( H(X|Y) - Δ ),
γ = γ_n = nΔ/2,
λ = λ_n = n ( I(X ∧ Y|Z) - Δ );

thus,

L = L_n = n ( H̄(X|Y) - H(X|Y) + 2Δ ) / Δ.

Since i_XY(X, Y) - i_XZ(X, Z) = i(X, Y|Z) if X -- Y -- Z form a Markov chain, there exists an (ε_n, δ_n)-SK K_n of rate

(1/n) log |K| = (1/n) (λ_n - 3 log L_n) = I(X ∧ Y|Z) - Δ - o(1),

such that ε_n, δ_n → 0 as n → ∞. Rates arbitrarily close to I(X ∧ Y|Z) are achieved by this scheme since Δ > 0 is arbitrary.

VI. SECOND-ORDER ASYMPTOTICS OF SECRET KEY RATES

The results of the previous section show that, with ε_n, δ_n → 0, the largest length S_{ε_n,δ_n}(X_n, Y_n|Z_n) of an (ε_n, δ_n)-SK K satisfies

sup_{ε_n,δ_n} S_{ε_n,δ_n}(X_n, Y_n|Z_n) = n I(X ∧ Y|Z) + o(n),   (10)

if X_n -- Y_n -- Z_n form a Markov chain. For the case when (X_n, Y_n, Z_n) = (X^n, Y^n, Z^n) is the n-fold IID repetition of (X, Y, Z) with X -- Y -- Z, the inf-conditional information rate coincides with the conditional mutual information I(X ∧ Y|Z). Furthermore, (10) holds even without requiring ε_n, δ_n → 0. In fact, a finer asymptotic analysis is possible and the second-order asymptotic term in the maximum length of an (ε, δ)-SK can be established; this is the subject matter of the current section.

Let

V := Var[ i(X, Y|Z) ],

and let

Q(a) := ∫_a^∞ (1/√(2π)) exp(-t²/2) dt

be the tail probability of the standard Gaussian distribution. Under the assumptions

V_{X|Y} := Var[ -log P_{X|Y}(X|Y) ] < ∞,   (11)
T := E[ |i(X, Y|Z) - I(X ∧ Y|Z)|³ ] < ∞,   (12)

the result below establishes the second-order asymptotic term in S_{ε,δ}(X^n, Y^n|Z^n).

Theorem 11. For every ε, δ > 0 such that ε + δ < 1 and IID RVs (X^n, Y^n, Z^n) such that X -- Y -- Z is a Markov chain, we have

S_{ε,δ}(X^n, Y^n|Z^n) = n I(X ∧ Y|Z) - √(nV) Q^{-1}(ε + δ) + o(√n).

Proof. For the converse part, we proceed along the lines of [17, Lemma 58]. Recall the following simple bound for β_ε(P, Q):

log β_ε(P, Q) ≥ -λ + log ( P({x : log P(x)/Q(x) ≤ λ}) - ε ).

Thus, applying Theorem 2 with P_XYZ = P_{X^nY^nZ^n}, Q_XYZ = P_{X^n|Z^n} × P_{Y^n|Z^n} × P_{Z^n}, and η = η_n = n^{-1/2}, and choosing

λ = n I(X ∧ Y|Z) - √(nV) Q^{-1}(ε + δ + θ_n),

where

θ_n = 1/√n + T / (2 V^{3/2} √n),

we get by the Berry-Esséen theorem (cf. [6], [20]) that

S_{ε,δ}(X^n, Y^n|Z^n) ≤ λ - log ( P( Σ_{t=1}^n i(X_t, Y_t|Z_t) ≤ λ ) - ε - δ - n^{-1/2} ) + log n
≤ n I(X ∧ Y|Z) - √(nV) Q^{-1}(ε + δ + θ_n) + log ( n / (2(ε + δ)) ),   (13)

which gives the converse by using a Taylor approximation of Q(·) to remove θ_n.
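The leading terms of Theorem 11 are easy to evaluate numerically. A minimal sketch of the Gaussian approximation n I - √(nV) Q^{-1}(ε + δ) for the source of Example 1 (Z a uniform bit, Y = Z ⊕ Ber(α_0), X = Y ⊕ Ber(α_1)), using the standard library for Q^{-1}:

```python
from math import log2, sqrt
from statistics import NormalDist

def h(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def second_order(n, eps_plus_delta, a0, a1):
    """Gaussian approximation n*I - sqrt(n*V)*Q^{-1}(eps+delta) for Example 1."""
    conv = a0 * (1 - a1) + (1 - a0) * a1           # binary convolution a0 * a1
    I = h(conv) - h(a1)                            # I(X ^ Y | Z)
    # i(X,Y|Z) = log P(X|Y)/P(X|Z) takes four values, by (Y=Z?, X=Y?):
    cases = [((1 - a0) * (1 - a1), log2((1 - a1) / (1 - conv))),
             ((1 - a0) * a1,       log2(a1 / conv)),
             (a0 * (1 - a1),       log2((1 - a1) / conv)),
             (a0 * a1,             log2(a1 / (1 - conv)))]
    V = sum(p * (i - I) ** 2 for p, i in cases)    # V = Var[i(X,Y|Z)]
    Qinv = NormalDist().inv_cdf(1 - eps_plus_delta)  # Q^{-1}(x) = Phi^{-1}(1-x)
    return n * I - sqrt(n * V) * Qinv, I

# Gap to capacity at n = 10^4, eps + delta = 0.01, Figure 1 parameters.
length, I = second_order(10_000, 0.01, 0.25, 0.125)
rate = length / 10_000
assert 0 < rate < I  # the finite-n rate is strictly below capacity
```

This reproduces the qualitative behavior of Figure 1: the achievable rate approaches the capacity I(X ∧ Y|Z) from below at speed Θ(1/√n).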

For the direct part, we use the bound (9) in Remark 4 with

λ_max = n(H(X|Y) + Δ/2), λ_min = n(H(X|Y) − Δ/2), λ = n I(X ∧ Y | Z) − √(nV) Q^{−1}(ε + δ − θ'_n),

where 0 < Δ < 2H(X|Y), and

θ'_n = 8V_{X|Y}/(nΔ²) + T/(2V^{3/2}√n) + 1/√n.

Note that L = √n. Therefore, upon bounding the term P_XY(T_0) on the right-side of (9) by 4V_{X|Y}/(nΔ²) using Chebyshev's inequality and the last term by the Berry-Esséen theorem, it follows that there exists an (ε, δ)-SK K such that

log |K| = n I(X ∧ Y | Z) − √(nV) Q^{−1}(ε + δ − θ'_n) − (11/2) log n.   (14)

The direct part follows by using the Taylor approximation of Q^{−1}(·) to remove θ'_n.

Remark 5. Note that the approach above does not yield the second-order asymptotic term in S_{ε,δ}(X^n, Y^n | Z^n) under the restricted secrecy definition of Remark 2. However, when Z is a constant, the precise second-order term is obtained by a convex combination of the high-secrecy and the high-reliability protocols of Section IV-A.

Remark 6. Note that a standard noninteractive secret key agreement protocol based on information reconciliation and privacy amplification (cf. [19]) only gives the following suboptimal achievability bound on the second-order asymptotic term:

S_{ε,δ}(X^n, Y^n | Z^n) ≥ n I(X ∧ Y | Z) − √(n V_{X|Y}) Q^{−1}(ε) − √(n V_{X|Z}) Q^{−1}(δ) + o(√n),

where V_{X|Y} and V_{X|Z} are the variances of the conditional log-likelihoods of X given Y and Z, respectively (cf. (11)).

We close this section with a numerical example that illustrates the utility of our bounds in characterizing the gap to secret key capacity at a fixed n.

Example 1 (Gap to secret key capacity). For 0 < α_0, α_1 < 1/2, let B_0 and B_1 be independent random bits taking value 1 with probability α_0 and α_1, respectively. Consider binary X, Y, Z where Z is a

uniform random bit independent of (B_0, B_1), Y = Z ⊕ B_0, and X = Y ⊕ B_1. We consider the rate S_{ε,δ}(X^n, Y^n | Z^n)/n of an (ε, δ)-SK that can be generated using n IID copies of X and Y when the eavesdropper observes Z^n. The following quantities, needed to evaluate (13) and (14), can be easily evaluated:

I(X ∧ Y | Z) = h(α_0 ∗ α_1) − h(α_1), V = μ_2, T = μ_3,

V_{X|Y} = α_1 (log α_1 + h(α_1))² + (1 − α_1)(log(1 − α_1) + h(α_1))²,

where μ_r is the rth central moment of i(X ∧ Y | Z) and is given by

μ_r = α_0 α_1 ( log (α_1 / (1 − α_0 ∗ α_1)) − I(X ∧ Y | Z) )^r
    + (1 − α_0)(1 − α_1) ( log ((1 − α_1) / (1 − α_0 ∗ α_1)) − I(X ∧ Y | Z) )^r
    + (1 − α_0) α_1 ( log (α_1 / (α_0 ∗ α_1)) − I(X ∧ Y | Z) )^r
    + α_0 (1 − α_1) ( log ((1 − α_1) / (α_0 ∗ α_1)) − I(X ∧ Y | Z) )^r,

h(x) = −x log x − (1 − x) log(1 − x) is the binary entropy function, and α_0 ∗ α_1 = α_0(1 − α_1) + (1 − α_0)α_1 denotes binary convolution. In Figure 1 (given in Section I), we plot the upper bound on S_{ε,δ}(X^n, Y^n | Z^n)/n resulting from (13) and the lower bound resulting from (14) with Δ = 1 for α_0 = 0.25 and a fixed α_1.

VII. DISCUSSION: IS INTERACTION NECESSARY?

In contrast to the protocols in [14], [1], [19], our proposed protocols are interactive. Below we present an example where none of the known secret key agreement protocols achieves the general capacity of Theorem 10, suggesting that perhaps interaction is necessary. For i = 1, 2, let (X_i^n, Y_i^n, Z_i^n) be IID with X = Y = Z = {0, 1} such that

P_{X_i^n Y_i^n Z_i^n}(x^n, y^n, z^n) = (1/2^n) W_i^n(y^n | x^n) V^n(z^n | y^n),

where W_i and V, respectively, are binary symmetric channels with crossover probabilities p_i and q. Let (X_n, Y_n, Z_n) be the mixed source given by

P_{X_n Y_n Z_n}(x^n, y^n, z^n) = (1/2) P_{X_1^n Y_1^n Z_1^n}(x^n, y^n, z^n) + (1/2) P_{X_2^n Y_2^n Z_2^n}(x^n, y^n, z^n)
                              = (1/2^n) [ (1/2) W_1^n(y^n | x^n) + (1/2) W_2^n(y^n | x^n) ] V^n(z^n | y^n).
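The gap computed next, between the general capacity h(p_2 ∗ q) − h(p_2) of this mixed source and the rate h(p_1 ∗ q) − h(p_2) of the standard noninteractive scheme, can be checked numerically. A minimal sketch, with crossover probabilities chosen arbitrarily for illustration (0 < p_1 < p_2 < 1/2) and entropies in bits:

```python
import math

def h(x):
    """Binary entropy function (bits)."""
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def conv(a, b):
    """Binary convolution a ∗ b = a(1 − b) + (1 − a)b."""
    return a * (1 - b) + (1 - a) * b

# Hypothetical crossover probabilities, 0 < p1 < p2 < 1/2:
p1, p2, q = 0.05, 0.10, 0.20

capacity = h(conv(p2, q)) - h(p2)        # h(p2 ∗ q) − h(p2)
noninteractive = h(conv(p1, q)) - h(p2)  # h(p1 ∗ q) − h(p2)

print(capacity, noninteractive)
```

Since p ∗ q is increasing in p for p, q < 1/2, we have h(p_1 ∗ q) < h(p_2 ∗ q), so the noninteractive rate is strictly smaller for any such choice of parameters.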

Note that X_n −◦− Y_n −◦− Z_n forms a Markov chain. Suppose that 0 < p_1 < p_2 < 1/2. Then, we have

I(𝐗 ∧ 𝐘 | 𝐙) = min[ H(X_1 | Z_1) − H(X_1 | Y_1), H(X_2 | Z_2) − H(X_2 | Y_2) ]
             = min[ h(p_1 ∗ q) − h(p_1), h(p_2 ∗ q) − h(p_2) ]
             = h(p_2 ∗ q) − h(p_2),

where h(·) is the binary entropy function and ∗ is binary convolution. Using a standard noninteractive secret key agreement protocol based on information reconciliation and privacy amplification (cf. [19]), we can achieve

H(𝐗 | 𝐙) − H(𝐗 | 𝐘) = min[ H(X_1 | Z_1), H(X_2 | Z_2) ] − max[ H(X_1 | Y_1), H(X_2 | Y_2) ]
                     = H(X_1 | Z_1) − H(X_2 | Y_2)
                     = h(p_1 ∗ q) − h(p_2),

which is less than the general secret key capacity of Theorem 10. Proving a precise limitation result for noninteractive protocols is a direction for future research.

ACKNOWLEDGMENTS

MH is partially supported by a MEXT Grant-in-Aid for Scientific Research (A) No. MH is also partially supported by the National Institute of Information and Communications Technology (NICT), Japan. The Centre for Quantum Technologies is funded by the Singapore Ministry of Education and the National Research Foundation as part of the Research Centres of Excellence program.

REFERENCES

[1] R. Ahlswede and I. Csiszár, Common randomness in information theory and cryptography, Part I: Secret sharing, IEEE Trans. Inf. Theory, vol. 39, no. 4, pp. 1121-1132, July 1993.
[2] ——, Common randomness in information theory and cryptography, Part II: CR capacity, IEEE Trans. Inf. Theory, vol. 44, no. 1, pp. 225-240, January 1998.
[3] I. Csiszár and P. Narayan, Secrecy capacities for multiple terminals, IEEE Trans. Inf. Theory, vol. 50, no. 12, pp. 3047-3061, December 2004.
[4] ——, Secrecy capacities for multiterminal channel models, IEEE Trans. Inf. Theory, vol. 54, no. 6, pp. 2437-2452, June 2008.
[5] Y. Dodis, R. Ostrovsky, L. Reyzin, and A. Smith, Fuzzy extractors: How to generate strong keys from biometrics and other noisy data, SIAM Journal on Computing, vol. 38, no. 1, pp. 97-139, 2008.

[6] W. Feller, An Introduction to Probability Theory and its Applications, Volume II, 2nd edition. John Wiley & Sons, 1971.
[7] T. S. Han, Information-Spectrum Methods in Information Theory [English Translation]. Series: Stochastic Modelling and Applied Probability, Vol. 50, Springer, 2003.
[8] M. Hayashi, Second-order asymptotics in fixed-length source coding and intrinsic randomness, IEEE Trans. Inf. Theory, vol. 54, no. 10, pp. 4619-4637, October 2008.
[9] ——, Information spectrum approach to second-order coding rate in channel coding, IEEE Trans. Inf. Theory, vol. 55, no. 11, pp. 4947-4966, November 2009.
[10] M. Hayashi and H. Nagaoka, General formulas for capacity of classical-quantum channels, IEEE Trans. Inf. Theory, vol. 49, no. 7, pp. 1753-1768, July 2003.
[11] M. Hayashi, H. Tyagi, and S. Watanabe, Secret key agreement: General capacity and second-order asymptotics, in Proc. IEEE International Symposium on Information Theory (ISIT), 2014.
[12] R. Impagliazzo, L. A. Levin, and M. Luby, Pseudo-random generation from one-way functions, in Proc. ACM Symposium on Theory of Computing (STOC), 1989, pp. 12-24.
[13] S. Kuzuoka, On the redundancy of variable-rate Slepian-Wolf coding, in Proc. International Symposium on Information Theory and its Applications (ISITA), 2012.
[14] U. M. Maurer, Secret key agreement by public discussion from common information, IEEE Trans. Inf. Theory, vol. 39, no. 3, pp. 733-742, May 1993.
[15] S. Miyake and F. Kanaya, Coding theorems on correlated general sources, IEICE Trans. Fundamentals, vol. E78-A, no. 9, September 1995.
[16] Y. Polyanskiy, Channel coding: non-asymptotic fundamental limits, Ph.D. Dissertation, Princeton University, 2010.
[17] Y. Polyanskiy, H. V. Poor, and S. Verdú, Channel coding rate in the finite blocklength regime, IEEE Trans. Inf. Theory, vol. 56, no. 5, pp. 2307-2359, May 2010.
[18] R. Renner, Security of quantum key distribution, Ph.D. Dissertation, ETH Zurich, 2005.
[19] R. Renner and S. Wolf, Simple and tight bounds for information reconciliation and privacy amplification, in Proc. ASIACRYPT, 2005, pp. 199-216.
[20] I. Shevtsova, On the absolute constants in the Berry-Esseen type inequalities for identically distributed summands, arXiv preprint, 2011.
[21] V. Strassen, Asymptotische Abschätzungen in Shannons Informationstheorie, in Trans. Third Prague Conf. Inf. Theory, pp. 689-723, 1962.
[22] ——, The existence of probability measures with given marginals, The Annals of Mathematical Statistics, vol. 36, no. 2, pp. 423-439, 1965.
[23] H. Tyagi, Common randomness principles of secrecy, Ph.D. Dissertation, University of Maryland, College Park, 2013.
[24] H. Tyagi and P. Narayan, How many queries will resolve common randomness? IEEE Trans. Inf. Theory, vol. 59, no. 9, September 2013.
[25] H. Tyagi and S. Watanabe, A bound for multiparty secret key agreement and implications for a problem of secure computing, in Proc. EUROCRYPT, 2014.
[26] ——, Converses for secret key agreement and secure computing, CoRR, 2014.
[27] L. Wang and R. Renner, One-shot classical-quantum capacity and hypothesis testing, Phys. Rev. Lett., vol. 108, no. 20, p. 200501, May 2012.


More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 074 pritamm@umd.edu

More information

Interactive Channel Capacity

Interactive Channel Capacity Electronic Colloquium on Computational Complexity, Report No. 1 (2013 Interactive Channel Capacity Gillat Kol Ran Raz Abstract We study the interactive channel capacity of an ɛ-noisy channel. The interactive

More information

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets

Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,

More information

arxiv: v4 [cs.it] 17 Oct 2015

arxiv: v4 [cs.it] 17 Oct 2015 Upper Bounds on the Relative Entropy and Rényi Divergence as a Function of Total Variation Distance for Finite Alphabets Igal Sason Department of Electrical Engineering Technion Israel Institute of Technology

More information

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources Wei Kang Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College

More information

Secure Sketch for Multi-Sets

Secure Sketch for Multi-Sets Secure Sketch for Multi-Sets Ee-Chien Chang Vadym Fedyukovych Qiming Li March 15, 2006 Abstract Given the original set X where X = s, a sketch P is computed from X and made public. From another set Y where

More information

Variable-Rate Universal Slepian-Wolf Coding with Feedback

Variable-Rate Universal Slepian-Wolf Coding with Feedback Variable-Rate Universal Slepian-Wolf Coding with Feedback Shriram Sarvotham, Dror Baron, and Richard G. Baraniuk Dept. of Electrical and Computer Engineering Rice University, Houston, TX 77005 Abstract

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

Towards control over fading channels

Towards control over fading channels Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti

More information

Block 2: Introduction to Information Theory

Block 2: Introduction to Information Theory Block 2: Introduction to Information Theory Francisco J. Escribano April 26, 2015 Francisco J. Escribano Block 2: Introduction to Information Theory April 26, 2015 1 / 51 Table of contents 1 Motivation

More information

On Common Information and the Encoding of Sources that are Not Successively Refinable

On Common Information and the Encoding of Sources that are Not Successively Refinable On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa

More information

Homework Set #2 Data Compression, Huffman code and AEP

Homework Set #2 Data Compression, Huffman code and AEP Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code

More information

A Comparison of Superposition Coding Schemes

A Comparison of Superposition Coding Schemes A Comparison of Superposition Coding Schemes Lele Wang, Eren Şaşoğlu, Bernd Bandemer, and Young-Han Kim Department of Electrical and Computer Engineering University of California, San Diego La Jolla, CA

More information

On Scalable Coding in the Presence of Decoder Side Information

On Scalable Coding in the Presence of Decoder Side Information On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,

More information

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback Vincent Y. F. Tan (NUS) Joint work with Silas L. Fong (Toronto) 2017 Information Theory Workshop, Kaohsiung,

More information

Lecture 11: Quantum Information III - Source Coding

Lecture 11: Quantum Information III - Source Coding CSCI5370 Quantum Computing November 25, 203 Lecture : Quantum Information III - Source Coding Lecturer: Shengyu Zhang Scribe: Hing Yin Tsang. Holevo s bound Suppose Alice has an information source X that

More information

Minentropy and its Variations for Cryptography

Minentropy and its Variations for Cryptography 1 Minentropy and its Variations for Cryptography Leonid Reyzin May 23, 2011 5 th International Conference on Information Theoretic Security 2 guessability and entropy Many ays to measure entropy If I ant

More information

Channel Dispersion and Moderate Deviations Limits for Memoryless Channels

Channel Dispersion and Moderate Deviations Limits for Memoryless Channels Channel Dispersion and Moderate Deviations Limits for Memoryless Channels Yury Polyanskiy and Sergio Verdú Abstract Recently, Altug and Wagner ] posed a question regarding the optimal behavior of the probability

More information

MMSE Dimension. snr. 1 We use the following asymptotic notation: f(x) = O (g(x)) if and only

MMSE Dimension. snr. 1 We use the following asymptotic notation: f(x) = O (g(x)) if and only MMSE Dimension Yihong Wu Department of Electrical Engineering Princeton University Princeton, NJ 08544, USA Email: yihongwu@princeton.edu Sergio Verdú Department of Electrical Engineering Princeton University

More information

Keyless authentication in the presence of a simultaneously transmitting adversary

Keyless authentication in the presence of a simultaneously transmitting adversary Keyless authentication in the presence of a simultaneously transmitting adversary Eric Graves Army Research Lab Adelphi MD 20783 U.S.A. ericsgra@ufl.edu Paul Yu Army Research Lab Adelphi MD 20783 U.S.A.

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality

A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality 0 IEEE Information Theory Workshop A New Achievable Region for Gaussian Multiple Descriptions Based on Subset Typicality Kumar Viswanatha, Emrah Akyol and Kenneth Rose ECE Department, University of California

More information

EE 4TM4: Digital Communications II. Channel Capacity

EE 4TM4: Digital Communications II. Channel Capacity EE 4TM4: Digital Communications II 1 Channel Capacity I. CHANNEL CODING THEOREM Definition 1: A rater is said to be achievable if there exists a sequence of(2 nr,n) codes such thatlim n P (n) e (C) = 0.

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Secret Key Agreement Using Asymmetry in Channel State Knowledge

Secret Key Agreement Using Asymmetry in Channel State Knowledge Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information