Converses For Secret Key Agreement and Secure Computing


Himanshu Tyagi and Shun Watanabe

Abstract. We consider information theoretic secret key agreement and secure function computation by multiple parties observing correlated data, with access to an interactive public communication channel. Our main result is an upper bound on the secret key length, which is derived using a reduction of binary hypothesis testing to multiparty secret key agreement. Building on this basic result, we derive new converses for multiparty secret key agreement. Furthermore, we derive converse results for the oblivious transfer problem and the bit commitment problem by relating them to secret key agreement. Finally, we derive a necessary condition for the feasibility of secure computing by trusted parties that seek to compute a function of their collective data, using interactive public communication that by itself does not give away the value of the function. In many cases, we strengthen and improve upon previously known converse bounds. Our results are single-shot and use only the given joint distribution of the correlated observations. For the case when the correlated observations consist of independent and identically distributed (in time) sequences, we derive strong versions of previously known converses.

I. INTRODUCTION

Information theoretic cryptography relies on the availability of correlated random observations at the parties. Neither multiparty secret key (SK) agreement nor secure computing is feasible if the observations of the parties are mutually independent. In fact, SK agreement is not feasible even when the observations are independent across some partition of the set of parties. (With restricted interpretations of feasibility, these observations appear across the vast literature on SK agreement and secure computing; see, for instance, [34], [1], [4], [45], [32], [62], [36].) As an extension of this principle, we can expect that the efficiency of a cryptographic primitive is related to how far the joint distribution of the observations is from a distribution that renders the observations independent (across some partition of the set of parties). We formalize this heuristic principle and leverage it to bound the efficiency of using correlated sources to implement SK agreement and secure computing. We present single-shot converse results;

in particular, we do not assume that the observations of the parties consist of long sequences generated by an independent and identically distributed (IID) random process. (Throughout this paper, IID observations refer to observations that are IID in time; at each time instant t, the observations of the parties are correlated.)

In multiparty SK agreement, a set of parties observing correlated random variables (RVs) seek to agree on shared random bits that remain concealed from an eavesdropper with access to correlated side information. The parties may communicate with each other over a noiseless public channel, but the transmitted communication will be available to the eavesdropper. The main tool for deriving our converse results is a reduction argument that relates multiparty SK agreement to binary hypothesis testing. (This basic result was reported separately in [56].) For an illustration of our main idea, consider the two party case when the eavesdropper observes only the communication between the legitimate parties and does not observe any additional side information. Clearly, if the observations of the legitimate parties are independent, a SK cannot be generated. We upper bound the length of SKs that can be generated in terms of how far the joint distribution of the observations X_1 and X_2 of the parties is from a distribution that renders their observations independent. Specifically, for this special case, we show that the maximum length S_ε(X_1, X_2) of an ε-SK (for a given security index ε) is bounded above as

  S_ε(X_1, X_2) ≤ −log β_{ε+η}(P_{X_1 X_2}, P_{X_1} × P_{X_2}) + 2 log(1/η),

where β_{ε+η}(P_{X_1 X_2}, P_{X_1} × P_{X_2}) is the optimal probability of error of type II for testing the null hypothesis P_{X_1 X_2} against the alternative P_{X_1} × P_{X_2}, given that the probability of error of type I is smaller than ε + η; this serves as a proxy for the distance between P_{X_1 X_2} and P_{X_1} × P_{X_2}. Similarly, in the general case of an arbitrary number of parties with correlated side information at the eavesdropper, our main result in Theorem 3 bounds the secret key length in terms of the distance between the joint distribution of the observations of the parties and the eavesdropper and a distribution that renders the observations of the parties conditionally independent across some partition, when conditioned on the eavesdropper's side information. This bound is a manifestation of the aforementioned heuristic principle and is termed the conditional independence testing bound.

Our approach brings out a structural connection between SK agreement and binary hypothesis testing. This is in the spirit of [39], where a connection between channel coding and binary hypothesis testing was used to establish an upper bound on the rate of good channel codes (see, also, [58], [24]). Also, our upper bound is reminiscent of the measure of entanglement for a quantum state proposed in [57], namely the minimum distance between the density matrix of the state and that of a disentangled state.

This measure of entanglement was shown to be an upper bound on the entanglement of distillation in [57], where the latter is the largest proportion of maximally entangled states that can be distilled using a purification process [6]. Using our basic result, we obtain new converses for SK agreement and also for secure computing, by reducing SK agreement to oblivious transfer and bit commitment. In many cases, we strengthen and improve upon previously known results. Our main contributions are summarized below.

A. SK agreement

For two parties, the problem of SK agreement from correlated observations is well-studied. The problem was introduced by Maurer [34] and Ahlswede and Csiszár [1], who considered the case where the parties observe IID sequences. However, in certain applications it is of interest to consider observations arising from a single realization of correlated RVs. For instance, in applications such as biometric and hardware authentication (cf. [38], [7]), the correlated observations consist of different versions of the biometric and hardware signatures, respectively, recorded at the registration and the authentication stages. To this end, Renner and Wolf [45] derived bounds on the length of a SK that can be generated by two parties observing a single realization of correlated RVs, using one-sided communication. The problem of SK agreement with multiple parties, for the IID setup, was introduced in [4] (also, see [9] for an early formulation).

In this work, we consider the SK agreement problem for multiple parties observing a single realization of correlated RVs. Our conditional independence testing bound is a single-shot upper bound on the length of SKs that can be generated by multiple parties observing correlated data, using interactive public communication. (A single-shot upper bound for the length of a multiparty SK using Fano's inequality, obtained as a straightforward extension of [4], [5], was reported in [55].) Unlike the single-shot upper bound in [45], which is restricted to two parties with one-way communication, we allow arbitrary interactive communication between multiple parties. Asymptotically, our bound is tight: its application to the IID case recovers some previously known (tight) bounds on the asymptotic SK rates. In fact, we strengthen the previously known asymptotic results, since we do not require the probability of error in SK agreement or the security index to vanish asymptotically. (Such bounds, which do not require the probability of error to vanish to 0, are called strong converse bounds [3].) See Section IV for a detailed discussion.

B. Secure computing

Secure computing by two parties was introduced by Yao in [65]. Two (mutually untrusting) parties seek to compute a function of their collective data, without sharing anything more about their data than what is given away by the function value. Several specific instances of this general problem have been studied. We consider the problems of oblivious transfer (OT) and bit commitment (BC), which constitute two basic primitives for secure computing.

OT between two parties is a mode of message transmission where the sender does not know whether the recipient actually received the information [4]. In this paper, we consider the one-of-two OT problem [8] where the first party observes two strings K_0 and K_1 of length l each, and the second party seeks the value of the Bth string K_B, B ∈ {0, 1}. The goal is to accomplish this task in such a manner that B and K_{B̄} remain concealed, respectively, from the first and the second party. This simply stated problem is at the heart of secure function computation, as it is well-known [3] that any secure function computation task can be accomplished using the basic OT protocol repeatedly (for recent results on the complexity of secure function computation using OT, see [4]). Unfortunately, information theoretically secure OT is not feasible in the absence of additional resources. On the bright side, if the parties share a noisy communication channel or if they observe correlated randomness, OT can be accomplished (cf. [], [2], [36]). In this paper, we consider the latter case where, as an additional resource, the parties observe correlated RVs X_1 and X_2. Based on reduction arguments relating OT to SK agreement, we derive upper bounds on the length l of OT that can be accomplished for given RVs X_1, X_2. The resulting bound is, in general, tighter than that obtained in [6]. Furthermore, an application of our bound to the case of IID observations shows that the upper bound on the rate of OT length derived in [36] and [2] is strong, i.e., the bound holds even without requiring asymptotically perfect recovery.

We now turn to the BC problem, the first instance of which was introduced by Blum in [7] as the problem of flipping a coin over a telephone, when the parties do not trust each other. A bit commitment protocol has two phases. In the first phase, the committing party generates a random bit string K, its coin flip; subsequently, the two parties communicate with each other, which ends the first phase. In the second phase, the committing party reveals K. A bit commitment protocol must forbid the committing party from cheating and changing K in the second phase. As in the case of OT, information theoretically secure BC is not possible without additional resources. We consider a version where two parties observing correlated RVs X_1 and X_2 want to implement information theoretically secure BC using interactive public communication. The goal is to maximize the length of the committed string K. By reducing SK agreement to BC,

we derive an upper bound on BC length, which improves upon the bound in [42]. Furthermore, for the case of IID observations, we derive a strong converse for the BC capacity; the latter is the maximum rate of BC length and was characterized in [62].

C. Secure computing with trusted parties

In a different direction, we relate our result to the following problem of secure function computation with trusted parties, introduced in [53] (for an early version of the problem, see [37]): Multiple parties observing correlated data seek to compute a function of their collective data. To this end, they communicate interactively over a public communication channel, which is assumed to be authenticated and error-free. It is required that the value of the function be concealed from an eavesdropper with access to the communication. When is such a secure computation of a given function feasible? In contrast to the traditional secure computing problem discussed above, this setup is appropriate for applications such as sensor networks, where the legitimate parties are trusted and are free to extract any information about each other's data from the shared communication. Using the conditional independence testing bound, we derive a necessary condition for the existence of a communication protocol that allows the parties to reliably recover the value of a given function, while keeping this value concealed from an eavesdropper with access to (only) the communication. In [53], matching necessary and sufficient conditions for secure computability of a given function were derived for the case of IID observations. In contrast, our necessary condition for secure computability is single-shot and does not rely on the observations being IID.

D. Outline of paper

The next section reviews some basic concepts that will be used throughout this work. The conditional independence testing bound is derived in Section III. In the subsequent three sections, we present the implications of this bound: Section IV addresses strong converses for SK capacity; Section V addresses converse results for the OT and the BC problems; and Section VI contains converse results for the secure computing problem with trusted parties. The final section contains a brief discussion of possible extensions.

E. Notations

For brevity, we use the abbreviations SK, RV, and IID for secret key, random variable, and independent and identically distributed, respectively; a plural form will be indicated by appending an 's' to the abbreviation. RVs are denoted by capital letters and the corresponding range sets are denoted by calligraphic letters.

The distribution of a RV U is given by P_U; when there is no confusion, we drop the subscript U. The set of all parties {1, ..., m} is denoted by M. For a collection of RVs {U_1, ..., U_m} and a subset A of M, U_A denotes the RVs {U_i, i ∈ A}. For a RV U, U^n denotes n IID repetitions of the RV U. Similarly, P^n denotes the distribution corresponding to the n IID repetitions generated from P. All logarithms in this paper are to the base 2.

II. PRELIMINARIES

A. Secret keys

Consider SK agreement using interactive public communication by m (trusted) parties. The ith party observes a discrete RV X_i taking values in a finite set X_i, 1 ≤ i ≤ m. (The conditional independence testing bound given in Theorem 3 remains valid for RVs taking countably many values.) Upon making these observations, the parties communicate interactively over a public communication channel that is accessible by an eavesdropper, who additionally observes a RV Z such that the RVs (X_M, Z) have a distribution P_{X_M Z}. We assume that the communication is error-free and each party receives the communication from every other party. Furthermore, we assume that the public communication is authenticated and the eavesdropper cannot tamper with it. Specifically, the communication is sent over r rounds of interaction. In the jth round of communication, 1 ≤ j ≤ r, the ith party sends F_{ij}, which is a function of its observation X_i, a locally generated randomness U_i (the RVs U_1, ..., U_m are mutually independent and jointly independent of (X_M, Z)), and the previously observed communication F_{11}, ..., F_{m1}, F_{12}, ..., F_{m2}, ..., F_{1j}, ..., F_{(i−1)j}. The overall interactive communication F_{11}, ..., F_{m1}, ..., F_{1r}, ..., F_{mr} is denoted by F.

Using their local observations and the interactive communication F, the parties agree on a SK. A SK is a collection of RVs K_1, ..., K_m, where the ith party gets K_i, that agree with probability close to 1 and are concealed, in effect, from an eavesdropper. Formally, the ith party computes a function K_i of (U_i, X_i, F). Traditionally, the RVs K_1, ..., K_m with a common range K constitute an (ε, δ)-SK if the following two conditions are satisfied (for alternative definitions of secrecy, see [34], [2], [4]):

  P(K_1 = ... = K_m) ≥ 1 − ε,  (1)
  d(P_{K_1 F Z}, P_unif × P_{FZ}) ≤ δ,  (2)

where P_unif is the uniform distribution on K and d(P, Q) is the variational distance between P and Q given by

  d(P, Q) = (1/2) Σ_x |P(x) − Q(x)|.

The first condition above represents the reliable recovery of the SK and the second condition guarantees security. In this work, we use the following alternative definition of a SK, which conveniently combines the recoverability and the security conditions (cf. [43]): the RVs K_1, ..., K_m above constitute an ε-SK with common range K if

  d(P_{K_M F Z}, P^{(M)}_unif × P_{FZ}) ≤ ε,  (3)

where

  P^{(M)}_unif(k_M) = 1(k_1 = ... = k_m) / |K|.

In fact, the two definitions above are closely related. (Note that a SK agreement protocol that satisfies (3) universally composable-emulates an ideal SK agreement protocol (see [8] for a definition); the emulation is with emulation slack ε, for an environment of unbounded computational complexity.)

Proposition 1. Given 0 ≤ ε, δ ≤ 1, if K_M constitute an (ε, δ)-SK under (1) and (2), then they constitute an (ε + δ)-SK under (3). Conversely, if K_M constitute an ε-SK under (3), then they constitute an (ε, ε)-SK under (1) and (2).

Therefore, by the composition theorem in [8], complex cryptographic protocols using such SKs instead of perfect SKs remain secure. (A perfect SK refers to unbiased shared bits that are independent of the eavesdropper's observations.) We are interested in characterizing the maximum length log |K| of an ε-SK.

Definition 1. Given 0 ≤ ε < 1, denote by S_ε(X_M | Z) the maximum length log |K| of an ε-SK K_M with common range K.

Our upper bound is based on relating the SK agreement problem to a binary hypothesis testing problem; below we review some basic concepts in hypothesis testing that will be used.
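The following short sketch illustrates the security index of definition (3) numerically for a toy two-party key with a single public message and no eavesdropper side information. The joint distribution below is an illustrative assumption, not an example from the paper.

```python
import numpy as np

K = [0, 1]                      # common range of the keys
F = [0, 1]                      # a single public message

# A hypothetical joint distribution P(k1, k2, f): the keys agree with
# probability 0.95 and are split evenly over key values and messages.
P = {}
for k in K:
    for f in F:
        P[(k, k, f)] = 0.95 / 4          # agreement mass
        P[(k, 1 - k, f)] = 0.05 / 4      # disagreement mass

def variational_distance(P, Q):
    """d(P, Q) = (1/2) * sum_x |P(x) - Q(x)| over a common finite support."""
    keys = set(P) | set(Q)
    return 0.5 * sum(abs(P.get(x, 0.0) - Q.get(x, 0.0)) for x in keys)

# Ideal distribution: P_unif^(M)(k1, k2) = 1{k1 = k2} / |K|, independent of F.
P_F = {f: sum(P[(k1, k2, f)] for k1 in K for k2 in K) for f in F}
Q = {(k1, k2, f): (1.0 / len(K)) * P_F[f] if k1 == k2 else 0.0
     for k1 in K for k2 in K for f in F}

eps = variational_distance(P, Q)
print(f"security index eps = {eps:.4f}")   # the toy keys form an eps-SK for this eps
```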

B. Hypothesis testing

Consider a binary hypothesis testing problem with null hypothesis P and alternative hypothesis Q, where P and Q are distributions on the same alphabet X. Upon observing a value x ∈ X, the observer needs to decide if the value was generated by the distribution P or the distribution Q. To this end, the observer applies a stochastic test T, which is a conditional distribution on {0, 1} given an observation x ∈ X. When x ∈ X is observed, the test T chooses the null hypothesis with probability T(0|x) and the alternative hypothesis with probability T(1|x) = 1 − T(0|x). For 0 ≤ ε < 1, denote by β_ε(P, Q) the infimum of the probability of error of type II given that the probability of error of type I is less than ε, i.e.,

  β_ε(P, Q) := inf_{T : P[T] ≥ 1−ε} Q[T],  (4)

where

  P[T] = Σ_x P(x) T(0|x),  Q[T] = Σ_x Q(x) T(0|x).

We note two important properties of the quantity β_ε(P, Q).

1) Data processing inequality. Let W be a stochastic mapping from X to Y, i.e., for each x ∈ X, W(·|x) is a distribution on Y. Then,

  β_ε(P, Q) ≤ β_ε(PW, QW),  (5)

where (PW)(y) = Σ_x P(x) W(y|x).

2) Stein's Lemma (cf. [33, Theorem 3.3]). For every 0 < ε < 1, we have

  lim_{n→∞} −(1/n) log β_ε(P^n, Q^n) = D(P‖Q),  (6)

where D(P‖Q) is the Kullback–Leibler divergence given by

  D(P‖Q) = Σ_{x∈X} P(x) log( P(x)/Q(x) ),

with the convention 0 log(0/0) = 0.

We close with a discussion on evaluating β_ε(P, Q). Note that the expression for β_ε(P, Q) in (4) is a linear program, solving which has a polynomial complexity in the size of the observation space.
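The linear program in (4) is straightforward to solve numerically for small alphabets. The sketch below, a minimal illustration under assumed toy distributions, computes β_ε(P, Q) with an off-the-shelf LP solver and evaluates the two-party bound of the Introduction; with a single observation the resulting bound is of course loose.

```python
import numpy as np
from scipy.optimize import linprog

def beta(eps, P, Q):
    """beta_eps(P, Q): minimum type-II error over tests with type-I error <= eps.
    P, Q are 1-D arrays on a common finite alphabet; t[x] = T(0|x) is the
    probability of deciding for the null hypothesis at x."""
    n = len(P)
    # minimize  sum_x Q[x] * t[x]
    # subject to sum_x P[x] * t[x] >= 1 - eps  (acceptance prob. under P, i.e. type-I error <= eps)
    #            0 <= t[x] <= 1
    res = linprog(c=np.asarray(Q), A_ub=[-np.asarray(P)], b_ub=[-(1.0 - eps)],
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.fun

# Toy example: a correlated bit pair P_{X1 X2} versus the product of its marginals.
p = 0.1                                    # crossover probability (assumed)
P_joint = np.array([(1 - p) / 2, p / 2, p / 2, (1 - p) / 2])
Q_prod = np.full(4, 0.25)

eps, eta = 0.05, 0.05
b = beta(eps + eta, P_joint, Q_prod)
print("upper bound on S_eps(X1, X2):", -np.log2(b) + 2 * np.log2(1 / eta))
```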

A simple manipulation yields the following computationally more tractable bound:

  −log β_ε(P, Q) ≤ inf_γ [ γ − log( P( log( P(X)/Q(X) ) ≤ γ ) − ε ) ].  (7)

When P and Q correspond to IID RVs, the tail probability in (7) can be numerically evaluated directly or can be approximated by the Berry–Esséen theorem (cf. [9]). On the other hand, numerical evaluation of the tail probability is rather involved when P and Q correspond to Markov chains. For this case, a computationally tractable and asymptotically tight bound on β_ε(P, Q) was established recently in [60]. Also, by setting γ = D_α(P, Q) + (1/(α−1)) log(1/ε′), where D_α(P, Q) is the Rényi divergence of order α > 1 given by [46]

  D_α(P, Q) = (1/(α−1)) log Σ_{x∈X} P(x)^α Q(x)^{1−α},

the following simple, closed-form bound on β_ε(P, Q) is obtained (for other connections between β_ε and Rényi divergence, see [40]):

  −log β_ε(P, Q) ≤ D_α(P, Q) + (1/(α−1)) log(1/ε′) − log(1 − ε − ε′).

While this bound is not tight in general, as its corollary we obtain Stein's lemma (see (6)). Finally, we remark that when the condition

  log( P(X)/Q(X) ) = D(P‖Q)  (8)

is satisfied with probability 1 under P, the bound in (7) implies

  −log β_ε(P, Q) ≤ D(P‖Q) + log(1/(1 − ε)).  (9)
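As a hedged numerical sketch, the snippet below evaluates the information-spectrum relaxation (7) and the Rényi-divergence relaxation for the same toy pair of distributions used above; the distributions and parameter choices (α, ε′) are illustrative assumptions.

```python
import numpy as np

p = 0.1
P = np.array([(1 - p) / 2, p / 2, p / 2, (1 - p) / 2])   # correlated pair
Q = np.full(4, 0.25)                                      # product of marginals
eps = 0.1

llr = np.log2(P / Q)                      # log-likelihood ratios log P(x)/Q(x)

# Bound (7): infimum over gamma of  gamma - log( P[ llr <= gamma ] - eps ).
best = np.inf
for gamma in np.sort(llr):                # candidate infimizers are the llr values
    tail = P[llr <= gamma].sum()
    if tail > eps:
        best = min(best, gamma - np.log2(tail - eps))
print("bound (7) on -log beta_eps:", best)

# Renyi bound: D_alpha + (1/(alpha-1)) log(1/eps') - log(1 - eps - eps').
alpha, eps_p = 2.0, 0.1
D_alpha = np.log2(np.sum(P**alpha * Q**(1 - alpha))) / (alpha - 1)
print("Renyi bound:", D_alpha + np.log2(1 / eps_p) / (alpha - 1)
      - np.log2(1 - eps - eps_p))
```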

C. Smooth minimum entropy and smooth maximum divergence

Given two RVs X and Y, a central question of information theoretic secrecy is (cf. [27], [28], [5]): How many unbiased, independent bits can be extracted from X that are unavailable to an observer of Y? When the underlying distribution is IID, the optimum rate of extracted bits can be expressed in terms of Shannon entropies and is given by H(X|Y). However, for our single-shot setup, the smooth minimum entropy introduced in [45], [43] is a more relevant measure of randomness. We use the definition of smooth minimum entropy introduced in [43]; for a review of other variations, see [48]. (A review of the notion of smooth minimum entropy without the notation of quantum information theory can also be found in [59].)

We also review the leftover hash lemma [27], [5], which brings out the central role of smooth minimum entropy in the answer to the question above. Also, as a change of measure companion for smooth minimum entropy, we define smooth maximum divergence and note that it satisfies the data processing inequality.

Definition 2 (Minimum entropy). The minimum entropy of P is defined as

  H_min(P) := min_x log( 1/P(x) ).

For distributions P_{XY} and Q_Y, the conditional minimum entropy of P_{XY} given Q_Y is defined as

  H_min(P_{XY} | Q_Y) := min_{x∈X, y∈supp(Q_Y)} log( Q_Y(y) / P_{XY}(x, y) ).

Finally, the conditional minimum entropy of P_{XY} given Y is defined as

  H_min(P_{XY} | Y) := sup_{Q_Y} H_min(P_{XY} | Q_Y),

where the sup is over all Q_Y such that supp(P_Y) ⊆ supp(Q_Y). (There is no consensus on the definition of conditional minimum entropy; the form here is appropriate for our purpose.)

The definitions of minimum entropy and conditional minimum entropy above remain valid for all subnormalized, nonnegative functions P_{XY}, i.e., P_{XY} such that Σ_{x,y} P_{XY}(x, y) ≤ 1. We need this extension, and the concept of smoothing defined next, to derive tight bounds.

Definition 3 (Smooth minimum entropy). Given ε ≥ 0, the ε-smooth conditional minimum entropy of P_{XY} given Y is defined as

  H^ε_min(P_{XY} | Y) := sup_{P̃_{XY} : d(P_{XY}, P̃_{XY}) ≤ ε} H_min(P̃_{XY} | Y),

where the sup is over all subnormalized, nonnegative functions P̃_{XY}. When Y is a constant, the ε-smooth minimum entropy is denoted by H^ε_min(P_X).
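For finite alphabets the (unsmoothed) conditional minimum entropy of Definition 2 can be evaluated exactly: the supremum over Q_Y is attained at Q_Y(y) proportional to max_x P_{XY}(x, y), giving H_min(P_{XY}|Y) = −log Σ_y max_x P_{XY}(x, y). This closed form is a standard consequence of the definition, derived here rather than quoted from the text; the joint distribution in the sketch is an illustrative assumption.

```python
import numpy as np

P_XY = np.array([[0.40, 0.05],      # rows indexed by x, columns by y
                 [0.10, 0.45]])     # an illustrative joint distribution

def h_min_cond(P):
    """H_min(P_XY | Y) = -log sum_y max_x P_XY(x, y)."""
    return -np.log2(P.max(axis=0).sum())

def h_min_cond_given_Q(P, Q_Y):
    """H_min(P_XY | Q_Y) = min_{x, y} log Q_Y(y) / P_XY(x, y)."""
    return np.min(np.log2(Q_Y[None, :] / P))

Q_opt = P_XY.max(axis=0) / P_XY.max(axis=0).sum()
print("H_min(P_XY | Y)        =", h_min_cond(P_XY))
print("with the optimal Q_Y   =", h_min_cond_given_Q(P_XY, Q_opt))
print("with the marginal P_Y  =", h_min_cond_given_Q(P_XY, P_XY.sum(axis=0)))
```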

We now state the leftover hash lemma, which says that we can extract H^ε_min(P_{XY}|Y) unbiased, independent bits from X that are effectively concealed from an observer of Y.

Lemma 2 (Leftover hash) [43]. Given a joint distribution P_{XY}, for every 0 ≤ 2ε < 1 and 0 < η, there exists a mapping K : X → K with log |K| = ⌊H^ε_min(P_{XY}|Y) − 2 log(1/2η)⌋ such that

  d(P_{K(X)Y}, P_unif × P_Y) ≤ 2ε + η.

(A randomly chosen function from a 2-universal hash family suffices.)
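The following sketch illustrates extraction with a 2-universal hash family: a random linear map over GF(2), x ↦ Mx mod 2, which is 2-universal. It is a qualitative illustration only; the biased source, the output length, and the sample size are assumptions of this sketch, not parameters prescribed by Lemma 2.

```python
import numpy as np

rng = np.random.default_rng(0)
n, ell = 16, 4                      # input bits, extracted bits (assumed)

def hash_family_member(rng, ell, n):
    """Draw M uniformly; x -> M x (mod 2) is a 2-universal hash family."""
    return rng.integers(0, 2, size=(ell, n))

def extract(M, x_bits):
    return M.dot(x_bits) % 2

def sample_source(rng):
    """A biased, structured source: only the first 8 bits are random, and biased."""
    head = (rng.random(8) < 0.3).astype(int)
    return np.concatenate([head, np.zeros(8, dtype=int)])

M = hash_family_member(rng, ell, n)
idx = [int("".join(map(str, extract(M, sample_source(rng)))), 2)
       for _ in range(50000)]
emp = np.bincount(idx, minlength=2**ell) / len(idx)
print("max |empirical - uniform| over outputs:", np.abs(emp - 2.0**-ell).max())
```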

Finally, we review smooth maximum divergence, which was first introduced in [6] for a quantum setting. The method of smoothing in the following definition is slightly different from the one in [6] and is tailored to our purpose.

Definition 4 (Smooth maximum divergence). The maximum divergence between two distributions P and Q is defined as

  D_max(P‖Q) := max_x log( P(x)/Q(x) ),

with the convention log(0/0) = 0, and for 0 < ε < 1, the ε-smooth maximum divergence between P and Q is defined as

  D^ε_max(P‖Q) := inf_{P̃} D_max(P̃‖Q),

where the inf is over all subnormalized, nonnegative functions P̃ such that P̃(x) ≤ P(x) for all x ∈ X and P̃(X) ≥ 1 − ε.

The following two properties of smooth maximum divergence will be used:

1) Data processing inequality. For every stochastic mapping W : X → Y,

  D^ε_max(PW ‖ QW) ≤ D^ε_max(P‖Q).  (10)

Indeed, for every P̃ such that P̃(x) ≤ P(x) for all x ∈ X and P̃(X) ≥ 1 − ε, the following hold: (P̃W)(Y) ≥ 1 − ε and (P̃W)(y) ≤ (PW)(y) for all y ∈ Y. The property follows upon noting that, for every y ∈ Y,

  D_max(P̃‖Q) = max_x log( P̃(x)/Q(x) ) ≥ log( (P̃W)(y)/(QW)(y) ),

since max_i (a_i/b_i) ≥ (Σ_i a_i)/(Σ_i b_i).

2) Convergence to Kullback–Leibler divergence. For IID distributions P^n and Q^n,

  lim_{n→∞} (1/n) D^ε_max(P^n ‖ Q^n) = D(P‖Q),  for all 0 < ε < 1.

The inequality ≤ follows upon choosing P̃^n(x) = P^n(x) 1(x ∈ T^n), where T^n is the typical set for P^n. For the other direction, given a P̃^n ≤ P^n with P̃^n(X^n) ≥ 1 − ε, we have

  P̃^n(T^n) ≥ (1 − ε)/2  (11)

for all n sufficiently large. Thus,

  max_x log( P̃^n(x)/Q^n(x) ) ≥ max_{x∈T^n} log( P̃^n(x)/Q^n(x) )
   ≥ Σ_{x∈T^n} ( P̃^n(x)/P̃^n(T^n) ) log( P̃^n(x)/Q^n(x) )
   ≥ log( P̃^n(T^n)/Q^n(T^n) )
   ≥ log( (1 − ε)/2 ) − log Q^n(T^n),

where the third inequality is by the log-sum inequality [3, Lemma 3.1], and the last inequality uses (11). The proof is completed upon noting that log Q^n(T^n) = −nD(P‖Q) + o(n).

III. THE CONDITIONAL INDEPENDENCE TESTING BOUND

The converse results of this paper are based on an upper bound on the maximum length S_ε(X_M | Z) of an ε-SK. We present this basic result here. (The results of this section were presented in [56].)

Consider a (nontrivial) partition π = {π_1, ..., π_l} of the set M. Heuristically, if the underlying distribution of the observations P_{X_M Z} is such that X_M are conditionally independent across the partition given Z, then the length of a SK that can be generated is 0.

Our approach is to bound the length of a generated SK in terms of how far the distribution P_{X_M Z} is from another distribution Q_{X_M Z} that renders X_M conditionally independent across the partition given Z; the closeness of the two distributions is measured by β_ε(P_{X_M Z}, Q_{X_M Z}). Specifically, for a partition π with |π| ≥ 2 parts, let Q(π) be the set of all distributions Q_{X_M Z} that factorize as follows:

  Q_{X_M | Z}(x_1, ..., x_m | z) = Π_{i=1}^{|π|} Q_{X_{π_i} | Z}(x_{π_i} | z).  (12)

Theorem 3 (Conditional independence testing bound). Given 0 ≤ ε < 1, 0 < η < 1 − ε, and a partition π of M, it holds that

  S_ε(X_M | Z) ≤ (1/(|π| − 1)) [ −log β_{ε+η}(P_{X_M Z}, Q_{X_M Z}) + |π| log(1/η) ]  (13)

for all Q_{X_M Z} ∈ Q(π).

Remarks. (i) Renner and Wolf [45] derived a bound on the length of a SK that can be generated by two parties using one-way communication. A comparison of this bound with the general bound in Theorem 3 is unavailable, since the former involves auxiliary RVs and is difficult to evaluate.

(ii) For m = 2 and Z = constant, the upper bound on the length of a SK in Theorem 3 is closely related to the meta-converse of Polyanskiy, Poor, and Verdú [39]. Indeed, a code for reliable transmission of a message M over a point-to-point channel yields a SK for the sender and the receiver; the length of this SK can be bounded by Theorem 3. However, the resulting bound is slightly weaker than the meta-converse and does not yield the correct third order asymptotic term (the coefficient of log n) in the optimal size of transmission codes [49].

(iii) The proof of Theorem 3 below remains valid even when the security condition (3) is replaced by the following more general condition:

  d(P_{K_M F Z}, P^{(M)}_unif × Q_{FZ}) ≤ ε,  for some distribution Q_{FZ}.

In particular, the upper bound (13) holds even under the relaxed security criterion above.
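The right side of (13) can be evaluated numerically for small alphabets. The sketch below, an illustration under assumptions, does this for a randomly chosen three-party source with Z = constant: for each partition it takes Q to be the product of the corresponding marginals of P (one admissible member of Q(π)) and computes β via the linear program (4).

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
P = rng.random((2, 2, 2)); P /= P.sum()          # assumed P_{X1 X2 X3} on {0,1}^3
eps, eta = 0.05, 0.05

def beta(eps, P, Q):
    P, Q = P.ravel(), Q.ravel()
    res = linprog(c=Q, A_ub=[-P], b_ub=[-(1 - eps)],
                  bounds=[(0, 1)] * P.size, method="highs")
    return res.fun

def marginal(P, keep):
    return P.sum(axis=tuple(i for i in range(P.ndim) if i not in keep))

partitions = [[(0,), (1,), (2,)], [(0,), (1, 2)], [(1,), (0, 2)], [(2,), (0, 1)]]
best = np.inf
for pi in partitions:
    # Q^pi: product over parts of the corresponding marginals of P.
    Q = np.ones_like(P)
    for part in pi:
        shape = [2 if i in part else 1 for i in range(3)]
        Q = Q * marginal(P, part).reshape(shape)
    k = len(pi)
    bound = (-np.log2(beta(eps + eta, P, Q)) + k * np.log2(1 / eta)) / (k - 1)
    print(pi, "->", round(bound, 3))
    best = min(best, bound)
print("upper bound on S_eps(X1, X2, X3):", round(best, 3))
```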

To prove Theorem 3, we first relate the SK length to the exponent of the probability of error of type II in a binary hypothesis testing problem where an observer of (K_M, F, Z) seeks to find out if the underlying distribution was P_{X_M Z} or Q_{X_M Z}. This result is stated next.

Lemma 4. For an ε-SK K_M with a common range K generated using an interactive communication F, let W_{K_M F | X_M Z} be the resulting conditional distribution on (K_M, F) given (X_M, Z). Then, for every 0 < η < 1 − ε and every Q_{X_M Z} ∈ Q(π), we have

  log |K| ≤ (1/(|π| − 1)) [ −log β_{ε+η}(P_{K_M F Z}, Q_{K_M F Z}) + |π| log(1/η) ],  (14)

where P_{K_M F Z} is the marginal of (K_M, F, Z) for the joint distribution P_{K_M F X_M Z} = P_{X_M Z} W_{K_M F | X_M Z}, and Q_{K_M F Z} is the corresponding marginal for the joint distribution Q_{K_M F X_M Z} = Q_{X_M Z} W_{K_M F | X_M Z}.

Also, we need the following basic property of interactive communication from [52], which will be used throughout this paper (see, also, [3, Lemma 7.8]).

Lemma 5 (Interactive communication property). Given Q_{X_M Z} ∈ Q(π) and an interactive communication F, the following holds:

  Q_{X_M | F Z}(x_M | f, z) = Π_{i=1}^{|π|} Q_{X_{π_i} | F Z}(x_{π_i} | f, z),

i.e., conditionally independent observations remain so when conditioned additionally on an interactive communication. In particular, if Q_{X_1 X_2 | Z} = Q_{X_1|Z} Q_{X_2|Z}, then Q_{X_1 X_2 | F Z} = Q_{X_1|FZ} Q_{X_2|FZ}.

Proof of Lemma 4. We establish (14) by constructing a test for the hypothesis testing problem with null hypothesis P = P_{K_M F Z} and alternative hypothesis Q = Q_{K_M F Z}. Specifically, we use a deterministic test (in fact, a simple threshold test on the log-likelihood ratio, but with P^{(M)}_unif × P_{FZ} in place of P_{K_M F Z}, since the two distributions are close to each other by the security condition (3)) with the following acceptance region (for the null hypothesis); the values (k_M, f, z) with Q_{K_M | F Z}(k_M | f, z) = 0 are included in A:

  A := { (k_M, f, z) : log [ P^{(M)}_unif(k_M) / Q_{K_M | F Z}(k_M | f, z) ] ≥ λ },

where λ = (|π| − 1) log |K| − |π| log(1/η). For this test, the probability of error of type II is bounded above as

  Q_{K_M F Z}(A) = Σ_{f,z} Q_{FZ}(f, z) Σ_{k_M : (k_M,f,z) ∈ A} Q_{K_M | F Z}(k_M | f, z)
   ≤ 2^{−λ} Σ_{f,z} Q_{FZ}(f, z) Σ_{k_M} P^{(M)}_unif(k_M) = 2^{−λ}.  (15)

On the other hand, the probability of error of type I is bounded above as

  P_{K_M F Z}(A^c) ≤ d(P_{K_M F Z}, P^{(M)}_unif × P_{FZ}) + P^{(M)}_unif × P_{FZ}(A^c)
   ≤ ε + P^{(M)}_unif × P_{FZ}(A^c),  (16)

where the first inequality follows from the definition of variational distance, and the second is a consequence of the security condition (3) satisfied by the ε-SK K_M. The second term above can be expressed as follows:

  P^{(M)}_unif × P_{FZ}(A^c) = Σ_{f,z} P_{FZ}(f, z) (1/|K|) Σ_k 1( (k·1, f, z) ∈ A^c )
   = Σ_{f,z} P_{FZ}(f, z) (1/|K|) Σ_k 1( Q_{K_M | F Z}(k·1 | f, z) |K| 2^λ > 1 ),  (17)

where k·1 = (k, ..., k). The inner sum can be further upper bounded as

  Σ_k 1( Q_{K_M | F Z}(k·1 | f, z) |K| 2^λ > 1 ) ≤ Σ_k ( Q_{K_M | F Z}(k·1 | f, z) |K| 2^λ )^{1/|π|}
   = ( |K| 2^λ )^{1/|π|} Σ_k Π_{i=1}^{|π|} Q_{K_{π_i} | F Z}(k·1_{π_i} | f, z)^{1/|π|},  (18)

where the previous equality uses Lemma 5 and the fact that, given F, K_i is a function of (X_i, U_i).

Next, an application of Hölder's inequality to the sum on the right-side of (18) yields

  Σ_k Π_{i=1}^{|π|} Q_{K_{π_i} | F Z}(k·1_{π_i} | f, z)^{1/|π|}
   ≤ Π_{i=1}^{|π|} ( Σ_k Q_{K_{π_i} | F Z}(k·1_{π_i} | f, z) )^{1/|π|}
   ≤ Π_{i=1}^{|π|} ( Σ_{k_{π_i}} Q_{K_{π_i} | F Z}(k_{π_i} | f, z) )^{1/|π|} = 1.  (19)

Upon combining (17)–(19), and noting that (|K| 2^λ)^{1/|π|} = |K| η, we obtain P^{(M)}_unif × P_{FZ}(A^c) ≤ η, which along with (16) gives

  P_{K_M F Z}(A^c) ≤ ε + η.  (20)

It follows from (20) and (15) that

  β_{ε+η}(P_{K_M F Z}, Q_{K_M F Z}) ≤ 2^{−λ} = η^{−|π|} |K|^{−(|π|−1)},

which completes the proof.

Proof of Theorem 3. Using the data processing inequality (5) with P = P_{X_M Z}, Q = Q_{X_M Z}, and W = W_{K_M F | X_M Z}, we get

  β_{ε+η}(P_{X_M Z}, Q_{X_M Z}) ≤ β_{ε+η}(P_{K_M F Z}, Q_{K_M F Z}),

which along with Lemma 4 gives Theorem 3.

IV. IMPLICATIONS FOR SK CAPACITY

For the SK agreement problem, a special case of interest is when the observations consist of n-length IID sequences, i.e., the ith party observes (X_{i1}, ..., X_{in}) and the eavesdropper observes (Z_1, ..., Z_n), such that the RVs {X_{Mt}, Z_t}_{t=1}^n are IID. For this case, it is well known that a SK of length proportional to n can be generated; the maximum rate (log |K_n| / n) of a SK is called the SK capacity [34], [1], [4]. To present the results of this section at full strength, we need to take recourse to the original definition of an (ε, δ)-SK given in (1) and (2). In the manner of Definition 1, denote by S_{ε,δ}(X_M | Z) the maximum length of an (ε, δ)-SK. It follows from Proposition 1 that S_{ε,δ}(X_M | Z) ≤ S_{ε+δ}(X_M | Z).
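The asymptotic results of this section combine Theorem 3 with Stein's lemma (6). As a hedged numerical aside, the sketch below illustrates the convergence in (6) for binary P and Q (an assumed example): for product distributions the optimal Neyman–Pearson test can be computed exactly by grouping sequences according to their number of ones.

```python
import numpy as np
from math import comb

p, q, eps = 0.30, 0.50, 0.10
D = p * np.log2(p / q) + (1 - p) * np.log2((1 - p) / (1 - q))

def beta_product(n, p, q, eps):
    """Exact beta_eps(P^n, Q^n) for binary P=(p,1-p), Q=(q,1-q)."""
    ks = np.arange(n + 1)
    llr = ks * np.log2(p / q) + (n - ks) * np.log2((1 - p) / (1 - q))
    P_k = np.array([comb(n, int(k)) * p**k * (1 - p)**(n - k) for k in ks])
    Q_k = np.array([comb(n, int(k)) * q**k * (1 - q)**(n - k) for k in ks])
    order = np.argsort(-llr)              # accept the null greedily by decreasing LLR
    beta_val, p_mass = 0.0, 0.0
    for i in order:
        if p_mass + P_k[i] < 1 - eps:
            p_mass += P_k[i]; beta_val += Q_k[i]
        else:                              # randomize on the boundary type
            beta_val += (1 - eps - p_mass) / P_k[i] * Q_k[i]
            break
    return beta_val

for n in [10, 50, 200, 1000]:
    print(n, round(-np.log2(beta_product(n, p, q, eps)) / n, 4), " D =", round(D, 4))
```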

Definition 5 (SK capacity). Given 0 ≤ ε, δ < 1, the (ε, δ)-SK capacity C_{ε,δ}(X_M | Z) is defined by

  C_{ε,δ}(X_M | Z) := liminf_{n→∞} (1/n) S_{ε,δ}(X^n_M | Z^n),

where the RVs {X_{Mt}, Z_t} are IID for 1 ≤ t ≤ n, with a common distribution P_{X_M Z}. The SK capacity C(X_M | Z) is defined as the limit

  C(X_M | Z) := lim_{ε,δ→0} C_{ε,δ}(X_M | Z).

For the case when the eavesdropper does not observe any side information, i.e., Z = constant, the SK capacity for two parties was characterized by Maurer [34] and Ahlswede and Csiszár [1]. Later, the SK capacity for a multiparty model, with Z = constant, was characterized by Csiszár and Narayan [4]. The general problem of characterizing the SK capacity for arbitrary Z remains open. Several upper bounds for SK capacity are known [34], [1], [35], [44], [4], [5], [23], which are tight for special cases. In this section, we derive a single-shot version of the Gohari–Anantharam bound [23] on the SK capacity for two parties, which is the best known bound for this case. Furthermore, for multiple parties, we establish a strong converse for SK capacity, which shows that, surprisingly, we cannot improve the rate of a SK by relaxing the recoverability requirement (1) or the security requirement (2).

A. Converse results for two parties

It was shown in [23] that for two parties,

  C(X_1, X_2 | Z) ≤ min_U I(X_1 ∧ X_2 | U) + I(X_1, X_2 ∧ U | Z).  (21)

The proof in [23] relied critically on the assumption that the RVs {(X_{Mt}, Z_t)}_{t=1}^n are IID and does not apply to the single-shot setup. The result below is a single-shot version of (21) and is proved by relying only on the structure of the SKs, without recourse to the potential function approach of [23]. (In fact, a simple proof of (21) follows upon noting that for an optimum rate SK (K_1, K_2) recoverable from a communication F, the SK capacity C(X_1, X_2 | Z) approximately equals (1/n) H(K_1 | F, Z^n) ≤ (1/n) H(K_1 | F, U^n, Z^n) + (1/n) I(K_1, F ∧ U^n | Z^n), which is further bounded above by C(X_1, X_2 | U) + I(X_1, X_2 ∧ U | Z).)

Theorem 6. For 0 ≤ ε, δ with ε + 2δ < 1,

  S_{ε,δ}(X_1, X_2 | Z) ≤ S_{ε, 2δ+ξ+η}(X_1, X_2 | Z, U) + D^ξ_max(P_{X_1 X_2 Z U} ‖ P_{X_1 X_2 Z} × P_{U|Z}) + 2 log(1/2η) + 1,

for every RV U and every 0 ≤ ξ and 0 < η with ε + 2δ + ξ + η < 1.

As corollaries, we obtain a single-shot version and a strong version of the upper bound in (21), which do not require perfect asymptotic recovery or perfect asymptotic security.

Corollary 7 (Single-shot bound for SK length). For 0 ≤ ε, δ with ε + 2δ < 1,

  S_{ε,δ}(X_1, X_2 | Z) ≤ −log β_{ε+2δ+ξ+2η}(P_{X_1 X_2 Z U}, P_{X_1|ZU} P_{X_2|ZU} P_{ZU}) + D^ξ_max(P_{X_1 X_2 Z U} ‖ P_{X_1 X_2 Z} × P_{U|Z}) + 2 log(1/2η) + 2 log(1/η) + 1,

for every RV U and every 0 ≤ ξ and 0 < η with ε + 2δ + ξ + 2η < 1.

Corollary 8 (Strong bound for SK capacity). For 0 ≤ ε, δ with ε + 2δ < 1,

  C_{ε,δ}(X_1, X_2 | Z) ≤ min_U I(X_1 ∧ X_2 | U) + I(X_1, X_2 ∧ U | Z).

We conclude this section with proofs. The core of Theorem 6 is contained in the following lemma.

Lemma 9. Let (K_1, K_2) be an (ε, δ)-SK taking values in K, recoverable from a communication F. Then,

  H^{δ+ξ/2}_min(P_{K_1 F Z U} | FZU) ≥ log |K| − D^ξ_max(P_{K_1 F Z U} ‖ P_{K_1 F Z} × P_{U|Z}),

for every RV U and every 0 ≤ ξ < 1 − 2δ.

Proof of Theorem 6. Let (K_1, K_2) be an (ε, δ)-SK taking values in K. Then, by Lemma 9 and the data processing property (10) of smooth maximum divergence, we get

  H^{δ+ξ/2}_min(P_{K_1 F Z U} | FZU) ≥ log |K| − D^ξ_max(P_{X_1 X_2 Z U} ‖ P_{X_1 X_2 Z} × P_{U|Z}).

By the leftover hash lemma (see Section II-C), there exists a mapping K′ of K_1 taking at least

  log |K| − D^ξ_max(P_{X_1 X_2 Z U} ‖ P_{X_1 X_2 Z} × P_{U|Z}) − 2 log(1/2η) − 1

values and satisfying

  d(P_{K′(K_1) F Z U}, P_unif × P_{FZU}) ≤ 2δ + ξ + η.

Therefore, (K′(K_1), K′(K_2)) constitutes an (ε, 2δ + ξ + η)-SK for X_1 and X_2 when the eavesdropper observes (Z, U), and so

  log |K| − D^ξ_max(P_{X_1 X_2 Z U} ‖ P_{X_1 X_2 Z} × P_{U|Z}) − 2 log(1/2η) − 1 ≤ S_{ε, 2δ+ξ+η}(X_1, X_2 | Z, U).

Corollary 7 follows by Theorem 3 and Proposition 1.

Proof of Corollary 8. The result follows by Corollary 7 upon using Stein's lemma (see Section II-B), along with the convergence property of smooth maximum divergence (see Section II-C).

Proof of Lemma 9. By the definitions of H^{δ+ξ/2}_min and D^ξ_max, it suffices to show that for every mapping T : (k, f, z, u) ↦ [0, 1] such that

  Σ_{k,f,z,u} P(k, f, z, u) T(k, f, z, u) ≥ 1 − ξ,  (22)

there exist a subnormalized nonnegative function Q̃_{K_1 F Z U} and a distribution Q_{FZU} satisfying the following:

  d(P_{K_1 F Z U}, Q̃_{K_1 F Z U}) ≤ δ + ξ/2,  (23)
  H_min(Q̃_{K_1 F Z U} | Q_{FZU}) = log |K| − D_max(P_{K_1 F Z U} T ‖ P_{K_1 F Z} × P_{U|Z}),  (24)

where (P_{K_1 F Z U} T)(k, f, z, u) := P(k, f, z, u) T(k, f, z, u). To that end, note

  P(k | f, z, u) = P(k | f, z) [ P(k, u | f, z) / ( P(k | f, z) P(u | f, z) ) ],

and let

  Q̃(k, f, z, u) := P_unif(k) [ P(k, u | f, z) / ( P(k | f, z) P(u | f, z) ) ] P(f, z, u) T(k, f, z, u),
  Q_{FZU} := P_{FZ} × P_{U|Z}.

Since T(k, f, z, u) ≤ 1, it follows that

  Σ_{k,f,z,u} Q̃(k, f, z, u) ≤ 1,

and so Q̃ is a valid subnormalized nonnegative function. Also,

  2 d(P, Q̃) = Σ_{f,z,u} P(f, z, u) Σ_k | P(k | f, z, u) − Q̃(k | f, z, u) |
   ≤ Σ_{k,f,z,u} P(k, f, z, u) ( 1 − T(k, f, z, u) ) + Σ_{f,z,u} P(f, z, u) Σ_k | P(k | f, z, u) T(k, f, z, u) − Q̃(k | f, z, u) |
   ≤ ξ + Σ_{f,z,u} P(f, z, u) Σ_k | P(k | f, z, u) T(k, f, z, u) − Q̃(k | f, z, u) |,  (25)

where the previous inequality uses (22). For the second term above, from the definition of Q̃ and the security condition (2), we have

  Σ_{f,z,u} P(f, z, u) Σ_k | P(k | f, z, u) T(k, f, z, u) − Q̃(k | f, z, u) |
   ≤ Σ_{k,f,z,u} P(f, z) P(u | k, f, z) | P(k | f, z) − P_unif(k) |
   = Σ_{k,f,z} P(f, z) | P(k | f, z) − P_unif(k) |
   ≤ 2δ,

which together with (25) yields (23). Next, observe that

  Q̃(k, f, z, u) / Q_{FZU}(f, z, u) = P_unif(k) [ P(u | k, f, z) / P(u | z) ] T(k, f, z, u),

and so,

  H_min(Q̃_{K_1 F Z U} | Q_{FZU}) = log |K| − max_{k,f,z,u} log [ P(k, f, z, u) T(k, f, z, u) / ( P(k, f, z) P(u | z) ) ],

which is the same as (24).

B. Strong converse for multiple parties

Now we move to the m-terminal case where the eavesdropper gets no side information, i.e., Z = constant. With this simplification, the SK capacity C(X_M) for multiple parties was characterized by Csiszár and Narayan [4].

Furthermore, they introduced the remarkable expression on the right-side of (26) below as an upper bound for C(X_M), and showed its tightness for m = 2, 3. Later, the tightness of the upper bound for arbitrary m was shown in [10]; we summarize these developments in the result below.

Theorem 10 [4], [10]. The SK capacity C for the case when the eavesdropper's side information Z = constant is given by

  C(X_M) = min_π (1/(|π| − 1)) D( P_{X_M} ‖ Π_{i=1}^{|π|} P_{X_{π_i}} ),  (26)

where the min is over all partitions π of M.

This generalized the classic result of Maurer [34] and Ahlswede and Csiszár [1], which established that for two parties,

  C(X_1, X_2) = D(P_{X_1 X_2} ‖ P_{X_1} × P_{X_2}) = I(X_1 ∧ X_2).

The converse part of Theorem 10 relied critically on the fact that ε_n + δ_n → 0 as n → ∞. Below we strengthen the converse and show that the upper bound for SK rates implied by Theorem 10 holds even when (ε_n, δ_n) is fixed. Specifically, for 0 ≤ ε, δ with ε + δ < 1 and Z = constant, an application of Theorem 3 to the IID RVs X^n_M, with Q_{X^n_M} = Π_{i=1}^{|π|} P^n_{X_{π_i}}, yields

  S_{ε,δ}(X^n_1, ..., X^n_m) ≤ (1/(|π| − 1)) [ −log β_{ε+δ+η}( P^n_{X_M}, Π_{i=1}^{|π|} P^n_{X_{π_i}} ) + |π| log(1/η) ],

where ε + δ + η < 1. Therefore, using Stein's lemma (see (6)) we get

  C_{ε,δ}(X_M) ≤ liminf_{n→∞} (1/(n(|π| − 1))) [ −log β_{ε+δ+η}( P^n_{X_M}, Π_{i=1}^{|π|} P^n_{X_{π_i}} ) ] = (1/(|π| − 1)) D( P_{X_M} ‖ Π_{i=1}^{|π|} P_{X_{π_i}} ).

Also, note that if ε + δ > 1, the SK rate can be infinity. Indeed, even for ε + δ = 1 the SK rate is infinity, as is seen by time sharing between a (1, 0)-SK of rate infinity and a (0, 1)-SK of rate infinity. Thus, we have established the following strong converse for the SK capacity when Z = constant.
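The right side of (26) is easy to evaluate for small alphabets; the sketch below does so for a randomly chosen three-party source with Z = constant. The source is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.random((2, 2, 2)); P /= P.sum()      # assumed P_{X1 X2 X3} on {0,1}^3

def marginal(P, keep):
    return P.sum(axis=tuple(i for i in range(P.ndim) if i not in keep))

def divergence(P, Q):
    mask = P > 0
    return np.sum(P[mask] * np.log2(P[mask] / Q[mask]))

partitions = [[(0,), (1,), (2,)], [(0,), (1, 2)], [(1,), (0, 2)], [(2,), (0, 1)]]
rates = []
for pi in partitions:
    Q = np.ones_like(P)
    for part in pi:
        shape = [2 if i in part else 1 for i in range(3)]
        Q = Q * marginal(P, part).reshape(shape)
    rates.append(divergence(P, Q) / (len(pi) - 1))   # D(P || prod_i P_{X_pi_i}) / (|pi|-1)
print("SK capacity C(X1, X2, X3) =", round(min(rates), 4))
```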

Corollary 11 (Strong converse for SK capacity). Given 0 ≤ ε, δ < 1, the (ε, δ)-SK capacity C_{ε,δ}(X_M) is given by

  C_{ε,δ}(X_M) = min_π (1/(|π| − 1)) D( P_{X_M} ‖ Π_{i=1}^{|π|} P_{X_{π_i}} ),  if ε + δ < 1,

and

  C_{ε,δ}(X_M) = ∞,  if ε + δ ≥ 1.

V. IMPLICATIONS FOR SECURE COMPUTING

In this section, we consider secure computing by two (mutually untrusting) parties. First introduced by Yao in [65], these problems have propelled research in cryptography over the last three decades. In particular, we will consider the oblivious transfer and the bit commitment problems, the two basic primitives for secure computing. We will look at the information theoretic versions of these problems where, as an additional resource, the parties observe correlated RVs X_1 and X_2. Our converse results are based on reduction arguments which relate these problems to the SK agreement problem, enabling the application of Theorem 3.

To state our results, we need the notions of maximum common function and minimum sufficient statistic; their role in bounding the performance of secure computing protocols was first highlighted in [64]. Specifically, for RVs X_1, X_2, denote by mcf(X_1, X_2) the maximum common function of X_1 and X_2 [2] (see, also, [54]). Also, denote by mss(X_2 | X_1) the minimum sufficient statistic for X_2 given X_1, i.e., the minimal function g(X_1) such that the Markov chain X_1 −◦− g(X_1) −◦− X_2 holds. Specifically, mss(X_2 | X_1) is given by the function resulting from the following equivalence relation on X_1 (cf. [20], [29], [5]):

  x_1 ∼ x_1′  ⇔  P_{X_2|X_1}(x_2 | x_1) = P_{X_2|X_1}(x_2 | x_1′)  for all x_2 ∈ X_2.
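For finite alphabets both statistics can be computed directly, as in the following sketch: mss(X_2 | X_1) groups values of x_1 whose conditional rows P_{X_2|X_1=x_1} coincide, and mcf(X_1, X_2) (the Gács–Körner common part) corresponds to the connected components of the bipartite graph on the support of P_{X_1 X_2}. The joint distribution is an illustrative assumption.

```python
import numpy as np

P = np.array([[0.15, 0.15, 0.00, 0.00],     # P_{X1 X2}; rows x1, columns x2
              [0.15, 0.15, 0.00, 0.00],     # an illustrative assumption
              [0.00, 0.00, 0.20, 0.20]])

def mss(P):
    """Label x1 values whose conditional rows P_{X2|X1=x1} coincide."""
    rows = P / P.sum(axis=1, keepdims=True)
    labels, reps = [], []
    for r in rows:
        for j, rep in enumerate(reps):
            if np.allclose(r, rep):
                labels.append(j); break
        else:
            labels.append(len(reps)); reps.append(r)
    return labels

def mcf(P):
    """Connected components of the bipartite graph on supp(P_{X1 X2})."""
    n1, n2 = P.shape
    comp = {("x1", i): None for i in range(n1) if P[i].sum() > 0}
    comp.update({("x2", j): None for j in range(n2) if P[:, j].sum() > 0})
    c = 0
    for node in list(comp):
        if comp[node] is not None:
            continue
        stack = [node]
        while stack:
            side, idx = v = stack.pop()
            if comp[v] is not None:
                continue
            comp[v] = c
            if side == "x1":
                stack += [("x2", j) for j in range(n2) if P[idx, j] > 0]
            else:
                stack += [("x1", i) for i in range(n1) if P[i, idx] > 0]
        c += 1
    return comp, c

print("mss(X2 | X1) labels per x1:", mss(P))    # [0, 0, 1]: x1 = 0, 1 merge
print("mcf(X1, X2) takes", mcf(P)[1], "values")  # 2 components in this example
```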

A. Oblivious transfer

We present bounds on the efficiency of implementing information theoretically secure one-of-two OT using correlated randomness. Formally, suppose the first party observes K_0 and K_1, distributed uniformly over {0, 1}^l, and the second party observes a random bit B. The RVs K_0, K_1, and B are mutually independent. Furthermore, party i observes the RV X_i, i = 1, 2, where the RVs (X_1, X_2) are jointly independent of (K_0, K_1, B). The second party seeks to compute K_B without giving away B to the first party. At the same time, the first party does not want to give away K_{B̄} to the second party.

Definition 6 (Oblivious transfer). An (ε, δ_1, δ_2)-OT of length l consists of an interactive communication protocol F and an estimate K̂ = K̂(X_2, B, F) such that the following conditions hold (strictly speaking, OT refers to the problem where the strings K_0, K_1 and the bit B are fixed; the randomized version here is sometimes referred to as oblivious key transfer (see [3], [63]) and is equivalent to OT):

  P( K_B ≠ K̂ ) ≤ ε,  (27)
  d( P_{K_{B̄} X_2 B F}, P_{K_{B̄}} × P_{X_2 B F} ) ≤ δ_1,  (28)
  d( P_{B K_0 K_1 X_1 F}, P_B × P_{K_0 K_1 X_1 F} ) ≤ δ_2,  (29)

where B̄ = B ⊕ 1. The first condition above denotes the reliability of OT, while the second and the third conditions ensure security for parties 1 and 2, respectively. Denote by L_{ε,δ_1,δ_2}(X_1, X_2) the largest length l of an (ε, δ_1, δ_2)-OT. When the underlying observations X_1, X_2 consist of n-length IID sequences X^n_1, X^n_2 with common distribution P_{X_1 X_2}, it is known that L_{ε,δ_1,δ_2}(X^n_1, X^n_2) may grow linearly with n (cf. [36], [2]); the largest rate of growth is called the OT capacity.

Definition 7 (OT capacity). For 0 < ε < 1, the ε-OT capacity of (X_1, X_2) is defined as

  C_ε(X_1, X_2) = liminf_{n→∞} sup_{δ_{1n}, δ_{2n}} (1/n) L_{ε,δ_{1n},δ_{2n}}(X^n_1, X^n_2),

where the sup is over all δ_{1n}, δ_{2n} → 0 as n → ∞. The OT capacity is defined as

  C(X_1, X_2) = inf_{0<ε<1} C_ε(X_1, X_2).

(For brevity, we use the same notation for SK capacity and OT capacity; the meaning will be clear from the context. Similarly, the notation L_{ε,δ_1,δ_2}, used here to denote the optimal OT length, is also used to denote the optimal BC length in the next section.)

The main result of this section is an upper bound on L_{ε,δ_1,δ_2}(X_1, X_2). Consequently, we recover the upper bound on C(X_1, X_2) due to Ahlswede and Csiszár derived in [2]. In fact, we show that the upper bound is strong and applies to C_ε(X_1, X_2) for every 0 < ε < 1.

Theorem 12 (Single-shot bound for OT length). For RVs X_1, X_2, V_0 = mcf(X_1, X_2), and V_1 = mss(X_2 | X_1), the following inequalities hold:

  L_{ε,δ_1,δ_2}(X_1, X_2) ≤ −log β_{ε+δ_1+2δ_2+η}( P_{X_1 X_2 V_0}, P_{X_1|V_0} P_{X_2|V_0} P_{V_0} ) + 2 log(1/η),  (30)
  L_{ε,δ_1,δ_2}(X_1, X_2) ≤ −log β_{ε+δ_1+2δ_2+η}( P_{V_1 V_1 X_2}, P_{V_1|X_2} P_{V_1|X_2} P_{X_2} ) + 2 log(1/η),  (31)

for all η > 0 with ε + δ_1 + 2δ_2 + η < 1.

Corollary 13 (Strong bound for OT capacity). For 0 < ε < 1, the ε-OT capacity of (X_1, X_2) satisfies

  C_ε(X_1, X_2) ≤ min{ I(X_1 ∧ X_2 | V_0), H(V_1 | X_2) },

where V_0 = mcf(X_1, X_2) and V_1 = mss(X_2 | X_1).

The proof of Theorem 12 entails reducing two SK agreement problems to OT. (A reduction of SK to OT in a computational security setup appeared in [22].) The bound (30) is obtained by recovering K_B as a SK, while (31) is obtained by recovering K_{B̄} as a SK; we note these two reductions as separate lemmas below.

Lemma 14 (Reduction 1 of SK agreement to OT). Consider SK agreement for two parties observing X_1 and X_2, respectively, with the eavesdropper observing V_0 = mcf(X_1, X_2). Given an (ε, δ_1, δ_2)-OT of length l, there exists a protocol for generating an (ε + δ_1 + 2δ_2)-SK of length l. In particular,

  L_{ε,δ_1,δ_2}(X_1, X_2) ≤ S_{ε+δ_1+2δ_2}(X_1, X_2 | V_0).

Lemma 15 (Reduction 2 of SK agreement to OT). Consider two party SK agreement where the first party observes X_1, the second party observes (V_1, X_2) = (mss(X_2 | X_1), X_2), and the eavesdropper observes X_2. Given an (ε, δ_1, δ_2)-OT of length l, there exists a protocol for generating an (ε + δ_1 + 2δ_2)-SK of length l. In particular,

  L_{ε,δ_1,δ_2}(X_1, X_2) ≤ S_{ε+δ_1+2δ_2}(X_1, (V_1, X_2) | X_2).

Remarks. (i) Underlying the proof of C(X_1, X_2) ≤ I(X_1 ∧ X_2) in [2] was a reduction of SK agreement to OT, which is extended in our proof below to prove (30). In contrast, the proof of the bound C(X_1, X_2) ≤ H(X_1 | X_2) in [2] relied on manipulations of entropy terms; below we give an alternative reduction argument to prove (31).

(ii) In general, our bounds are stronger than those presented in [6]. For instance, the latter is loose when the observations consist of mixtures of IID RVs. Further, while both (31) and [6, Theorem 5] (specialized to OT) suffice to obtain the second bound in Corollary 13, in contrast to (30), [6, Theorem 2] does not yield the first bound in Corollary 13.
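As a hedged numerical sketch, the bound of Corollary 13 is evaluated below for a binary erasure correlation, an illustrative choice: X_1 is a uniform bit and X_2 equals X_1 with probability 1 − p and an erasure symbol otherwise. For this source one checks by hand that V_0 = mcf(X_1, X_2) is constant and V_1 = mss(X_2 | X_1) = X_1, so the bound reduces to min{ I(X_1 ∧ X_2), H(X_1 | X_2) } = min{1 − p, p}.

```python
import numpy as np

p = 0.3
# joint distribution over x1 in {0,1} and x2 in {0, 1, erasure}
P = np.array([[(1 - p) / 2, 0.0, p / 2],
              [0.0, (1 - p) / 2, p / 2]])

def H(probs):
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

P1, P2 = P.sum(axis=1), P.sum(axis=0)
I = H(P1) + H(P2) - H(P.ravel())          # I(X1 ^ X2), since V0 is constant here
H_cond = H(P.ravel()) - H(P2)             # H(X1 | X2) = H(V1 | X2) for this source
print("I(X1 ^ X2) =", round(I, 4), " H(X1 | X2) =", round(H_cond, 4))
print("OT capacity upper bound:", round(min(I, H_cond), 4))   # min{1 - p, p}
```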

(iii) For simplicity of presentation, we did not allow local randomization in the formulation above. However, it can be easily included as a part of X_1 and X_2 by replacing X_i with (X_i, U_i), i = 1, 2, where U_1, U_2, (X_1, X_2) are mutually independent. Since our proofs are based on reduction of SK agreement to OT, by noting that mss((X_2, U_2) | (X_1, U_1)) = mss(X_2 | X_1) and that the availability of local randomness does not change our upper bound on SK length in Theorem 3, the results above remain valid even when local randomness is available.

(iv) An (ε, δ_1, δ_2)-OT capacity can be defined, without requiring δ_{1n}, δ_{2n} to go to 0 as in the definition of C_ε(X_1, X_2). The problem of characterizing the (ε, δ_1, δ_2)-OT capacity for all 0 < ε, δ_1, δ_2 < 1 remains open.

We prove Lemmas 14 and 15 next. The proof of Theorem 12 follows by Theorem 3, along with the Markov relation X_1 −◦− V_1 −◦− X_2 and the data processing inequality (5); the corollary follows by Stein's lemma (see Section II-B).

Proof of Lemma 14. Let K̂ be the estimate of K_B formed by the second party. The following protocol generates an (ε + δ_1 + 2δ_2)-SK of length l:

(i) The first party generates two random strings K_0 and K_1 of length l, and the second party generates a random bit B. The two parties run the OT protocol.
(ii) The second party sends B over the public channel.
(iii) Using B, the first party computes K_B.

The RVs K_B, K̂ constitute an (ε + δ_1 + 2δ_2)-SK. Since both parties agree on K_B with probability greater than 1 − ε, by Proposition 1 and remark (iii) following Theorem 3, it suffices to show that, for some distribution Q_{V_0 F B},

  d( P_{K_B V_0 F B}, P_unif × Q_{V_0 F B} ) ≤ δ_1 + 2δ_2.

Observe that condition (29) is the same as

  d( P_{K_0 K_1 X_1 F | B=0}, P_{K_0 K_1 X_1 F | B=1} ) ≤ 2δ_2.  (32)

Let Q_{V_0 F B}(v, f, b) = P_{V_0 F | B}(v, f | b̄) P_B(b). Then,

  d( P_{K_B V_0 F B}, P_unif × Q_{V_0 F B} )
   = Σ_b (1/2) d( P_{K_b V_0 F | B=b}, P_unif × Q_{V_0 F | B=b} )
   = Σ_b (1/2) d( P_{K_b V_0 F | B=b}, P_unif × P_{V_0 F | B=b̄} )
   ≤ Σ_b (1/2) [ d( P_{K_b V_0 F | B=b̄}, P_unif × P_{V_0 F | B=b̄} ) + d( P_{K_b V_0 F | B=b}, P_{K_b V_0 F | B=b̄} ) ]
   = d( P_{K_{B̄} V_0 F B}, P_unif × P_{V_0 F B} ) + Σ_b (1/2) d( P_{K_b V_0 F | B=b}, P_{K_b V_0 F | B=b̄} )
   ≤ δ_1 + 2δ_2,

where the last inequality uses (28) and (32), together with the fact that V_0 is a function of X_2 as well as of X_1.

Proof of Lemma 15. The following protocol generates an (ε + δ_1 + 2δ_2)-SK of length l:

(i) The first party generates two random strings K_0 and K_1 of length l, and the second party generates a random bit B. The two parties run the OT protocol.
(ii) Upon observing F, the second party samples X̃_2 according to the distribution P_{X_2 | V_1 B F}( · | V_1, B̄, F ).
(iii) The second party sends B̄ over the public channel.
(iv) The first party computes K_{B̄} and the second party computes K̃ = K̂(X̃_2, B̄, F).

The RVs K_{B̄}, K̃ constitute an (ε + δ_1 + 2δ_2)-SK. Heuristically, this protocol entails the second party emulating X_2, pretending that the protocol was executed for B̄ instead of B. Since the communication of the first party is oblivious of the value of B, plugging X̃_2 into K̂ will lead to an estimate of K_{B̄}, provided that the emulated X̃_2 preserves the joint distribution. By Proposition 1 and (28), it suffices to show that

  P( K_{B̄} ≠ K̃ ) ≤ ε + 2δ_2.  (33)

To that end, note

  P( K_{B̄} ≠ K̃ ) = Σ_{k,b,v,f} (1/2) P_{K_{b̄} V_1 F | B}(k, v, f | b) P( K̂(X̃_2, b̄, f) ≠ k | V_1 = v, B = b, F = f )
   ≤ Σ_{k,b,v,f} (1/2) P_{K_{b̄} V_1 F | B}(k, v, f | b̄) P( K̂(X̃_2, b̄, f) ≠ k | V_1 = v, B = b, F = f ) + 2δ_2
   = Σ_{k,b,v,f} (1/2) P_{K_{b̄} V_1 F | B}(k, v, f | b̄) P( K̂(X_2, b̄, f) ≠ k | V_1 = v, B = b̄, F = f ) + 2δ_2
   = P( K_B ≠ K̂ ) + 2δ_2,

where the inequality uses (32), and the second equality holds since X̃_2 is sampled from P_{X_2 | V_1 B F}( · | V_1, B̄, F ) and the Markov relation X_2 −◦− (V_1, B, F) −◦− (K_0, K_1) holds in view of the interactive communication property of Lemma 5; (33) follows by (27).

B. Bit commitment

Two parties observing correlated RVs X_1 and X_2 want to implement information theoretically secure BC using interactive public communication, i.e., the first party seeks to report to the second the results of a series of coin tosses that it conducted at its end in such a manner that, at a later stage, the second party can detect if the first party was lying [7]. Formally, a BC protocol consists of two phases: the commit phase and the reveal phase. In the commit phase, the first party generates a random string K, distributed uniformly over {0, 1}^l and jointly independent of (X_1, X_2); furthermore, the two parties communicate interactively with each other. In the reveal phase, the first party reveals its data, i.e., it sends X_1′ and K′, claiming these were its initial choices of X_1 and K, respectively. Subsequently, the second party applies a (randomized) test function T = T(K′, X_1′, X_2, F), where T = 0 and T = 1, respectively, indicate K′ = K and K′ ≠ K.

Definition 8 (Bit commitment). An (ε, δ_1, δ_2)-BC of length l consists of a secret K ~ unif{0, 1}^l, an interactive communication F (sent during the commit phase), and a {0, 1}-valued randomized test function T = T(K′, X_1′, X_2, F) such that the following hold:

  P( T(K, X_1, X_2, F) ≠ 0 ) ≤ ε,  (34)
  d( P_{K X_2 F}, P_K × P_{X_2 F} ) ≤ δ_1,  (35)
  P( T(K′, X_1′, X_2, F) = 0, K′ ≠ K ) ≤ δ_2,  (36)

where the RVs X_1′, K′ are arbitrary. The first condition above is the soundness condition, which captures the reliability of BC. The next condition is the hiding condition, which ensures that the second party cannot ascertain the secret in the commit phase. Finally, the binding condition in (36) restricts the probability with which the first party can cheat in the reveal phase. Denote by L_{ε,δ_1,δ_2}(X_1, X_2) the largest length l of an (ε, δ_1, δ_2)-BC. For n-length IID sequences X^n_1, X^n_2 generated from P_{X_1 X_2}, the largest rate of L_{ε,δ_1,δ_2}(X^n_1, X^n_2) is called the BC capacity.

Definition 9 (BC capacity). For 0 < ε, δ_1, δ_2 < 1, the (ε, δ_1, δ_2)-BC capacity of (X_1, X_2) is defined as

  C_{ε,δ_1,δ_2}(X_1, X_2) = liminf_{n→∞} (1/n) L_{ε,δ_1,δ_2}(X^n_1, X^n_2).

The BC capacity is defined as

  C(X_1, X_2) = lim_{ε,δ_1,δ_2→0} C_{ε,δ_1,δ_2}(X_1, X_2).

The following result of Winter, Nascimento, and Imai [62] (see, also, [50, Chapter 8]) gives a simple formula for C(X_1, X_2).

Theorem 16 [62]. For RVs X_1, X_2, let V_1 = mss(X_2 | X_1). The BC capacity is given by

  C(X_1, X_2) = H(V_1 | X_2).

In this section, we present an upper bound on L_{ε,δ_1,δ_2}(X_1, X_2), which in turn leads to a strong converse for BC capacity.

Theorem 17 (Single-shot bound for BC length). Given 0 < ε, δ_1, δ_2, η < 1, for RVs X_1, X_2 and V_1 = mss(X_2 | X_1), the following inequality holds:

  L_{ε,δ_1,δ_2}(X_1, X_2) ≤ −log β_{ε+δ_1+δ_2+η}( P_{V_1 V_1 X_2}, P_{V_1|X_2} P_{V_1|X_2} P_{X_2} ) + 2 log(1/η),

for all η with ε + δ_1 + δ_2 + η < 1.

Corollary 18 (Strong converse for BC capacity). For 0 < ε, δ_1, δ_2 < 1, the (ε, δ_1, δ_2)-BC capacity satisfies

  C_{ε,δ_1,δ_2}(X_1, X_2) ≤ H(V_1 | X_2).


Information-theoretic Secrecy A Cryptographic Perspective Information-theoretic Secrecy A Cryptographic Perspective Stefano Tessaro UC Santa Barbara WCS 2017 April 30, 2017 based on joint works with M. Bellare and A. Vardy Cryptography Computational assumptions

More information

Chapter 2: Entropy and Mutual Information. University of Illinois at Chicago ECE 534, Natasha Devroye

Chapter 2: Entropy and Mutual Information. University of Illinois at Chicago ECE 534, Natasha Devroye Chapter 2: Entropy and Mutual Information Chapter 2 outline Definitions Entropy Joint entropy, conditional entropy Relative entropy, mutual information Chain rules Jensen s inequality Log-sum inequality

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

Password Cracking: The Effect of Bias on the Average Guesswork of Hash Functions

Password Cracking: The Effect of Bias on the Average Guesswork of Hash Functions Password Cracking: The Effect of Bias on the Average Guesswork of Hash Functions Yair Yona, and Suhas Diggavi, Fellow, IEEE Abstract arxiv:608.0232v4 [cs.cr] Jan 207 In this work we analyze the average

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

Lecture 2: August 31

Lecture 2: August 31 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy

More information

Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper

Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper Mohsen Bahrami, Ali Bereyhi, Mahtab Mirmohseni and Mohammad Reza Aref Information Systems and Security

More information

Lecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157

Lecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157 Lecture 6: Gaussian Channels Copyright G. Caire (Sample Lectures) 157 Differential entropy (1) Definition 18. The (joint) differential entropy of a continuous random vector X n p X n(x) over R is: Z h(x

More information

Keyless authentication in the presence of a simultaneously transmitting adversary

Keyless authentication in the presence of a simultaneously transmitting adversary Keyless authentication in the presence of a simultaneously transmitting adversary Eric Graves Army Research Lab Adelphi MD 20783 U.S.A. ericsgra@ufl.edu Paul Yu Army Research Lab Adelphi MD 20783 U.S.A.

More information

Information measures in simple coding problems

Information measures in simple coding problems Part I Information measures in simple coding problems in this web service in this web service Source coding and hypothesis testing; information measures A(discrete)source is a sequence {X i } i= of random

More information

Soft Covering with High Probability

Soft Covering with High Probability Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information

More information

Information Theoretic Limits of Randomness Generation

Information Theoretic Limits of Randomness Generation Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication

More information

Arimoto Channel Coding Converse and Rényi Divergence

Arimoto Channel Coding Converse and Rényi Divergence Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code

More information

Communication Efficient Secret Sharing

Communication Efficient Secret Sharing 1 Communication Efficient Secret Sharing Wentao Huang, Michael Langberg, Senior Member, IEEE, Joerg Kliewer, Senior Member, IEEE, and Jehoshua Bruck, Fellow, IEEE Abstract A secret sharing scheme is a

More information

Impossibility Results for Universal Composability in Public-Key Models and with Fixed Inputs

Impossibility Results for Universal Composability in Public-Key Models and with Fixed Inputs Impossibility Results for Universal Composability in Public-Key Models and with Fixed Inputs Dafna Kidron Yehuda Lindell June 6, 2010 Abstract Universal composability and concurrent general composition

More information

Group Secret Key Agreement over State-Dependent Wireless Broadcast Channels

Group Secret Key Agreement over State-Dependent Wireless Broadcast Channels Group Secret Key Agreement over State-Dependent Wireless Broadcast Channels Mahdi Jafari Siavoshani Sharif University of Technology, Iran Shaunak Mishra, Suhas Diggavi, Christina Fragouli Institute of

More information

Lecture 5 Channel Coding over Continuous Channels

Lecture 5 Channel Coding over Continuous Channels Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From

More information

ECE Information theory Final (Fall 2008)

ECE Information theory Final (Fall 2008) ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1

More information

MGMT 69000: Topics in High-dimensional Data Analysis Falll 2016

MGMT 69000: Topics in High-dimensional Data Analysis Falll 2016 MGMT 69000: Topics in High-dimensional Data Analysis Falll 2016 Lecture 14: Information Theoretic Methods Lecturer: Jiaming Xu Scribe: Hilda Ibriga, Adarsh Barik, December 02, 2016 Outline f-divergence

More information

Broadcast and Verifiable Secret Sharing: New Security Models and Round-Optimal Constructions

Broadcast and Verifiable Secret Sharing: New Security Models and Round-Optimal Constructions Broadcast and Verifiable Secret Sharing: New Security Models and Round-Optimal Constructions Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park in

More information

1/p-Secure Multiparty Computation without an Honest Majority and the Best of Both Worlds

1/p-Secure Multiparty Computation without an Honest Majority and the Best of Both Worlds 1/p-Secure Multiparty Computation without an Honest Majority and the Best of Both Worlds Amos Beimel Department of Computer Science Ben Gurion University Be er Sheva, Israel Eran Omri Department of Computer

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

On Oblivious Transfer Capacity

On Oblivious Transfer Capacity On Oblivious Transfer Capacity Rudolph Ahlswede 1 and Imre Csiszár 2, 1 University of Bielefeld, Germany 2 Rényi Institute of Mathematics, Budapest, Hungary Abstract. Upper and lower bounds to the oblivious

More information

From Secure MPC to Efficient Zero-Knowledge

From Secure MPC to Efficient Zero-Knowledge From Secure MPC to Efficient Zero-Knowledge David Wu March, 2017 The Complexity Class NP NP the class of problems that are efficiently verifiable a language L is in NP if there exists a polynomial-time

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

Hypothesis Testing with Communication Constraints

Hypothesis Testing with Communication Constraints Hypothesis Testing with Communication Constraints Dinesh Krithivasan EECS 750 April 17, 2006 Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 1 / 21 Presentation Outline

More information

Secret-Key Agreement over Unauthenticated Public Channels Part I: Definitions and a Completeness Result

Secret-Key Agreement over Unauthenticated Public Channels Part I: Definitions and a Completeness Result Secret-Key Agreement over Unauthenticated Public Channels Part I: Definitions and a Completeness Result Ueli Maurer, Fellow, IEEE Stefan Wolf Abstract This is the first part of a three-part paper on secret-key

More information

Inaccessible Entropy and its Applications. 1 Review: Psedorandom Generators from One-Way Functions

Inaccessible Entropy and its Applications. 1 Review: Psedorandom Generators from One-Way Functions Columbia University - Crypto Reading Group Apr 27, 2011 Inaccessible Entropy and its Applications Igor Carboni Oliveira We summarize the constructions of PRGs from OWFs discussed so far and introduce the

More information

A Framework for Non-Interactive Instance-Dependent Commitment Schemes (NIC)

A Framework for Non-Interactive Instance-Dependent Commitment Schemes (NIC) A Framework for Non-Interactive Instance-Dependent Commitment Schemes (NIC) Bruce Kapron, Lior Malka, Venkatesh Srinivasan Department of Computer Science University of Victoria, BC, Canada V8W 3P6 Email:bmkapron,liorma,venkat@cs.uvic.ca

More information

Correlation Detection and an Operational Interpretation of the Rényi Mutual Information

Correlation Detection and an Operational Interpretation of the Rényi Mutual Information Correlation Detection and an Operational Interpretation of the Rényi Mutual Information Masahito Hayashi 1, Marco Tomamichel 2 1 Graduate School of Mathematics, Nagoya University, and Centre for Quantum

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Minimum Energy Per Bit for Secret Key Acquisition Over Multipath Wireless Channels

Minimum Energy Per Bit for Secret Key Acquisition Over Multipath Wireless Channels Minimum Energy Per Bit for Secret Key Acquisition Over Multipath Wireless Channels Tzu-Han Chou Email: tchou@wisc.edu Akbar M. Sayeed Email: akbar@engr.wisc.edu Stark C. Draper Email: sdraper@ece.wisc.edu

More information

Lecture 11 September 30, 2015

Lecture 11 September 30, 2015 PHYS 7895: Quantum Information Theory Fall 015 Lecture 11 September 30, 015 Prof. Mark M. Wilde Scribe: Mark M. Wilde This document is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike

More information

1 Number Theory Basics

1 Number Theory Basics ECS 289M (Franklin), Winter 2010, Crypto Review 1 Number Theory Basics This section has some basic facts about number theory, mostly taken (or adapted) from Dan Boneh s number theory fact sheets for his

More information

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback

A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback A Tight Upper Bound on the Second-Order Coding Rate of Parallel Gaussian Channels with Feedback Vincent Y. F. Tan (NUS) Joint work with Silas L. Fong (Toronto) 2017 Information Theory Workshop, Kaohsiung,

More information

AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN

AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN AN INFORMATION THEORY APPROACH TO WIRELESS SENSOR NETWORK DESIGN A Thesis Presented to The Academic Faculty by Bryan Larish In Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy

More information

On Expected Constant-Round Protocols for Byzantine Agreement

On Expected Constant-Round Protocols for Byzantine Agreement On Expected Constant-Round Protocols for Byzantine Agreement Jonathan Katz Chiu-Yuen Koo Abstract In a seminal paper, Feldman and Micali show an n-party Byzantine agreement protocol in the plain model

More information

Zero-Knowledge Against Quantum Attacks

Zero-Knowledge Against Quantum Attacks Zero-Knowledge Against Quantum Attacks John Watrous Department of Computer Science University of Calgary January 16, 2006 John Watrous (University of Calgary) Zero-Knowledge Against Quantum Attacks QIP

More information

Efficient Protocols Achieving the Commitment Capacity of Noisy Correlations

Efficient Protocols Achieving the Commitment Capacity of Noisy Correlations Efficient Protocols Achieving the Commitment Capacity of Noisy Correlations Hideki Imai, Kirill Morozov, Anderson C. A. Nascimento, Andreas Winter Department of Electrical, Electronic and Communication

More information

Notes on Zero Knowledge

Notes on Zero Knowledge U.C. Berkeley CS172: Automata, Computability and Complexity Handout 9 Professor Luca Trevisan 4/21/2015 Notes on Zero Knowledge These notes on zero knowledge protocols for quadratic residuosity are based

More information

Computer Science A Cryptography and Data Security. Claude Crépeau

Computer Science A Cryptography and Data Security. Claude Crépeau Computer Science 308-547A Cryptography and Data Security Claude Crépeau These notes are, largely, transcriptions by Anton Stiglic of class notes from the former course Cryptography and Data Security (308-647A)

More information

Convexity/Concavity of Renyi Entropy and α-mutual Information

Convexity/Concavity of Renyi Entropy and α-mutual Information Convexity/Concavity of Renyi Entropy and -Mutual Information Siu-Wai Ho Institute for Telecommunications Research University of South Australia Adelaide, SA 5095, Australia Email: siuwai.ho@unisa.edu.au

More information

Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem

Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem Unequal Error Protection Querying Policies for the Noisy 20 Questions Problem Hye Won Chung, Brian M. Sadler, Lizhong Zheng and Alfred O. Hero arxiv:606.09233v2 [cs.it] 28 Sep 207 Abstract In this paper,

More information

IN SECURE multiparty computation (MPC), mutually distrusting

IN SECURE multiparty computation (MPC), mutually distrusting 2560 IEEE TRANSACTIONS ON INFORMATION THEORY VOL 63 NO 4 APRIL 2017 Wiretapped Oblivious Transfer Manoj Mishra Student Member IEEE Bikash Kumar Dey Member IEEE Vinod M Prabhakaran Member IEEE Suhas N Diggavi

More information

Security Implications of Quantum Technologies

Security Implications of Quantum Technologies Security Implications of Quantum Technologies Jim Alves-Foss Center for Secure and Dependable Software Department of Computer Science University of Idaho Moscow, ID 83844-1010 email: jimaf@cs.uidaho.edu

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

9. Distance measures. 9.1 Classical information measures. Head Tail. How similar/close are two probability distributions? Trace distance.

9. Distance measures. 9.1 Classical information measures. Head Tail. How similar/close are two probability distributions? Trace distance. 9. Distance measures 9.1 Classical information measures How similar/close are two probability distributions? Trace distance Fidelity Example: Flipping two coins, one fair one biased Head Tail Trace distance

More information

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University

More information

Intermittent Communication

Intermittent Communication Intermittent Communication Mostafa Khoshnevisan, Student Member, IEEE, and J. Nicholas Laneman, Senior Member, IEEE arxiv:32.42v2 [cs.it] 7 Mar 207 Abstract We formulate a model for intermittent communication

More information

1 Secure two-party computation

1 Secure two-party computation CSCI 5440: Cryptography Lecture 7 The Chinese University of Hong Kong, Spring 2018 26 and 27 February 2018 In the first half of the course we covered the basic cryptographic primitives that enable secure

More information

Entropy Accumulation in Device-independent Protocols

Entropy Accumulation in Device-independent Protocols Entropy Accumulation in Device-independent Protocols QIP17 Seattle January 19, 2017 arxiv: 1607.01796 & 1607.01797 Rotem Arnon-Friedman, Frédéric Dupuis, Omar Fawzi, Renato Renner, & Thomas Vidick Outline

More information

Frans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko)

Frans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko) Eindhoven University of Technology IEEE EURASIP Spain Seminar on Signal Processing, Communication and Information Theory, Universidad Carlos III de Madrid, December 11, 2014 : Secret-Based Authentication

More information

Secret Key Agreement Using Asymmetry in Channel State Knowledge

Secret Key Agreement Using Asymmetry in Channel State Knowledge Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,

More information

Quantum Error Correcting Codes and Quantum Cryptography. Peter Shor M.I.T. Cambridge, MA 02139

Quantum Error Correcting Codes and Quantum Cryptography. Peter Shor M.I.T. Cambridge, MA 02139 Quantum Error Correcting Codes and Quantum Cryptography Peter Shor M.I.T. Cambridge, MA 02139 1 We start out with two processes which are fundamentally quantum: superdense coding and teleportation. Superdense

More information

IN this paper, we consider the capacity of sticky channels, a

IN this paper, we consider the capacity of sticky channels, a 72 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 1, JANUARY 2008 Capacity Bounds for Sticky Channels Michael Mitzenmacher, Member, IEEE Abstract The capacity of sticky channels, a subclass of insertion

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

CS 630 Basic Probability and Information Theory. Tim Campbell

CS 630 Basic Probability and Information Theory. Tim Campbell CS 630 Basic Probability and Information Theory Tim Campbell 21 January 2003 Probability Theory Probability Theory is the study of how best to predict outcomes of events. An experiment (or trial or event)

More information

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources

A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources Wei Kang Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College

More information

Computational Systems Biology: Biology X

Computational Systems Biology: Biology X Bud Mishra Room 1002, 715 Broadway, Courant Institute, NYU, New York, USA L#8:(November-08-2010) Cancer and Signals Outline 1 Bayesian Interpretation of Probabilities Information Theory Outline Bayesian

More information

ASPECIAL case of the general key agreement scenario defined

ASPECIAL case of the general key agreement scenario defined IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 49, NO 4, APRIL 2003 839 Secret-Key Agreement Over Unauthenticated Public Channels Part III: Privacy Amplification Ueli Maurer, Fellow, IEEE, and Stefan Wolf

More information

Cryptography CS 555. Topic 23: Zero-Knowledge Proof and Cryptographic Commitment. CS555 Topic 23 1

Cryptography CS 555. Topic 23: Zero-Knowledge Proof and Cryptographic Commitment. CS555 Topic 23 1 Cryptography CS 555 Topic 23: Zero-Knowledge Proof and Cryptographic Commitment CS555 Topic 23 1 Outline and Readings Outline Zero-knowledge proof Fiat-Shamir protocol Schnorr protocol Commitment schemes

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Topics. Probability Theory. Perfect Secrecy. Information Theory

Topics. Probability Theory. Perfect Secrecy. Information Theory Topics Probability Theory Perfect Secrecy Information Theory Some Terms (P,C,K,E,D) Computational Security Computational effort required to break cryptosystem Provable Security Relative to another, difficult

More information

Lecture 5 - Information theory

Lecture 5 - Information theory Lecture 5 - Information theory Jan Bouda FI MU May 18, 2012 Jan Bouda (FI MU) Lecture 5 - Information theory May 18, 2012 1 / 42 Part I Uncertainty and entropy Jan Bouda (FI MU) Lecture 5 - Information

More information

Interactive Channel Capacity

Interactive Channel Capacity Electronic Colloquium on Computational Complexity, Report No. 1 (2013 Interactive Channel Capacity Gillat Kol Ran Raz Abstract We study the interactive channel capacity of an ɛ-noisy channel. The interactive

More information

ECE 4400:693 - Information Theory

ECE 4400:693 - Information Theory ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential

More information

Other Topics in Quantum Information

Other Topics in Quantum Information p. 1/23 Other Topics in Quantum Information In a course like this there is only a limited time, and only a limited number of topics can be covered. Some additional topics will be covered in the class projects.

More information

Homework 3 Solutions

Homework 3 Solutions 5233/IOC5063 Theory of Cryptology, Fall 205 Instructor Prof. Wen-Guey Tzeng Homework 3 Solutions 7-Dec-205 Scribe Amir Rezapour. Consider an unfair coin with head probability 0.5. Assume that the coin

More information

An Achievable Error Exponent for the Mismatched Multiple-Access Channel

An Achievable Error Exponent for the Mismatched Multiple-Access Channel An Achievable Error Exponent for the Mismatched Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@camacuk Albert Guillén i Fàbregas ICREA & Universitat Pompeu Fabra University of

More information

Lecture 22. We first consider some constructions of standard commitment schemes. 2.1 Constructions Based on One-Way (Trapdoor) Permutations

Lecture 22. We first consider some constructions of standard commitment schemes. 2.1 Constructions Based on One-Way (Trapdoor) Permutations CMSC 858K Advanced Topics in Cryptography April 20, 2004 Lecturer: Jonathan Katz Lecture 22 Scribe(s): agaraj Anthapadmanabhan, Ji Sun Shin 1 Introduction to These otes In the previous lectures, we saw

More information

arxiv: v4 [cs.it] 17 Oct 2015

arxiv: v4 [cs.it] 17 Oct 2015 Upper Bounds on the Relative Entropy and Rényi Divergence as a Function of Total Variation Distance for Finite Alphabets Igal Sason Department of Electrical Engineering Technion Israel Institute of Technology

More information

THIS paper constitutes a systematic study of the capacity

THIS paper constitutes a systematic study of the capacity 4 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 1, JANUARY 2002 Capacities of TimeVarying MultipleAccess Channels With Side Information Arnab Das Prakash Narayan, Fellow, IEEE Abstract We determine

More information

Linear Codes, Target Function Classes, and Network Computing Capacity

Linear Codes, Target Function Classes, and Network Computing Capacity Linear Codes, Target Function Classes, and Network Computing Capacity Rathinakumar Appuswamy, Massimo Franceschetti, Nikhil Karamchandani, and Kenneth Zeger IEEE Transactions on Information Theory Submitted:

More information