Non-Asymptotic and Asymptotic Analyses on Markov Chains in Several Problems


Masahito Hayashi and Shun Watanabe

Graduate School of Mathematics, Nagoya University, Japan, and Centre for Quantum Technologies, National University of Singapore, Singapore. Department of Information Science and Intelligent Systems, University of Tokushima, Japan, and Institute for Systems Research, University of Maryland, College Park.

Abstract: In this paper, we derive non-asymptotic achievability and converse bounds on the source coding with side-information and the random number generation with side-information. Our bounds are efficiently computable in the sense that the computational complexity does not depend on the block length. We also characterize the asymptotic behaviors of the large deviation regime and the moderate deviation regime by using our bounds, which implies that our bounds are asymptotically tight in those regimes. We also show the second order rates of those problems, and derive single letter forms of the variances characterizing the second order rates.

I. INTRODUCTION

The non-asymptotic analyses of coding problems have been attracting considerable attention recently [1], [2]. In this paper, we further develop the non-asymptotic analyses for the fixed length source coding with (full) side-information at the decoder and the uniform random number generation with side-information. In particular, we are interested in the case where the underlying sources are Markov chains.

A. Motivation

To begin with, we shall explain the motivations of this paper. Although the problems treated in this paper are not the channel coding, we consider the channel coding here to explain our motivations. So far, quite a few types of non-asymptotic achievability bounds have been proposed. For example, Verdú and Han derived a non-asymptotic bound by using the information spectrum approach in order to derive the general formula [5] (see also [6]), which we call the information-spectrum bound. One of the authors and Nagaoka derived a bound (for the classical-quantum channel) by relating the error probability to the binary hypothesis testing [7, Remark 5] (see also [8]), which we call the hypothesis testing bound. Polyanskiy et al. derived the RCU (random coding union) bound and the DT (dependence testing) bound [1]. There is also Gallager's bound [9].

Footnote 1: The uniform random number generation with side-information is also known as the privacy amplification [3], [4].
Footnote 2: A bound slightly looser (the coefficients are worse) than the DT bound can be derived from the hypothesis testing bound of [7].

Although it has not been stated explicitly in the literature, we believe that there are two important criteria for non-asymptotic bounds: 1) computational complexity, and 2) asymptotic optimality.

Let us first consider the first criterion, i.e., the computational complexity. For the BSC, the computational complexity of the RCU bound is $O(n^2)$ and that of the DT bound is $O(n)$ [10]. However, the computational complexities of these bounds are much larger for general DMCs or channels with memory. It is known that the hypothesis testing bound can be described as a linear program (e.g., see [11], [12]), and it can be efficiently computed under certain symmetry. However, the number of variables in the linear program grows exponentially in the block length, and it is difficult to compute in general. The computation of the information-spectrum bound depends on the evaluation of a tail probability.
The information-spectrum bound is less operational than the hypothesis testing bound in the sense of the hierarchy introduced in [11], and the computational complexity of the former is much smaller than that of the latter. However, the computation of a tail probability is still not easy unless the channel is a DMC. For DMCs, the computational complexity of Gallager's bound is $O(1)$ since the Gallager function is an additive quantity for DMCs. However, this is not the case if there is memory. Consequently, no bound proposed so far is efficiently computable for the Markov chain. The situation is the same for the source coding with side-information.

Let us now move to achievability bounds on the random number generation with side-information, i.e., the privacy amplification. Renner derived a bound by using the leftover hash lemma and the smooth min-entropy [14], which we call the smooth min-entropy bound. By combining this bound and the information-spectrum method, Tomamichel and one of the authors derived another bound [11], which we call the inf-spectral entropy bound.

Footnote 3: In the case of the quantum channel, the bound is described as a semi-definite program.
Footnote 4: The Gallager bound for finite state channels was considered in [13, Section 5.9], but a closed form expression for the exponent was not derived.

One of the authors also derived another bound by using the leftover hash lemma, the approximate smoothing of the Rényi entropy of order 2, and the large deviation technique [15], which we call the exponential bound. Further, the authors compared the inf-spectral entropy bound and the exponential bound [16]. It turned out that the former is tighter than the latter when the required security level $\varepsilon$ is rather large, and the latter is tighter than the former when $\varepsilon$ is rather small. A bound that interpolates between the two bounds was also derived in [16], which we call the hybrid bound.

Regarding the computational complexity issue of the privacy amplification, the situation is the same as for the coding problems, i.e., there is no bound that is efficiently computable for the Markov chain. The smooth min-entropy bound can be computed by using linear programming for rather small block lengths, but it is difficult to compute in general. The computation of the inf-spectral entropy bound depends on the evaluation of a tail probability, and it is also difficult to compute in general. The exponential bound is described by the Gallager function, and thus can be easily computed provided that the source is memoryless. As described above, there is no bound that is efficiently computable for the Markov chain, and the first purpose of this paper is to derive non-asymptotic bounds that are efficiently computable.

Next, let us consider the second criterion, i.e., asymptotic optimality. So far, three kinds of asymptotic regimes have been studied in information theory [1], [2], [17], [18], [19], [20], [21]: the large deviation regime, in which the error probability $\varepsilon$ asymptotically behaves like $e^{-nr}$ for some $r > 0$; the moderate deviation regime, in which $\varepsilon$ asymptotically behaves like $e^{-n^{1-2t} r}$ for some $r > 0$ and $t \in (0, 1/2)$; and the second order regime, in which $\varepsilon$ is a constant. We claim that a good non-asymptotic bound should be asymptotically optimal in at least one of the three regimes mentioned above. In fact, the information-spectrum bound, the hypothesis testing bound, and the DT bound are asymptotically optimal in the moderate deviation regime and the second order regime; the Gallager bound is asymptotically optimal in the large deviation regime; and the RCU bound is asymptotically optimal in all the regimes.

B. Main Contribution for Non-Asymptotic Analysis

To derive non-asymptotic achievability bounds on the problems, we basically use exponential type bounds for the single shot setting. For the source coding with side-information and the random number generation with side-information, we consider two assumptions on transition matrices (see Assumption 1 and Assumption 2 of Section II). Although a computable form of the conditional entropy rate is not known in general, Assumption 1, which is less restrictive than Assumption 2, enables us to derive a computable form of the conditional entropy rate. In the problems with side-information, exponential type bounds are described by conditional Rényi entropies. There are several definitions of conditional Rényi entropies (see [22], [23] for an extensive review), and we use the one defined in [24] and the one defined by Arimoto [25]. We shall call the former the lower conditional Rényi entropy (cf. (2)) and the latter the upper conditional Rényi entropy (cf. (7)).

Footnote 5: The Gallager bound and the RCU bound are asymptotically optimal in the large deviation regime only up to the critical rate.
Footnote 6: For the channel coding, it corresponds to the Gallager bound.
To derive non-asymptotic bounds, we need to evaluate these information measures for the Markov chain. For this purpose, under Assumption 1, we introduce the lower conditional Rényi entropy for transition matrices (cf. (21)). Then, we evaluate the lower conditional Rényi entropy for the Markov chain in terms of its transition matrix counterpart. This evaluation gives non-asymptotic bounds for the coding and random number generation problems under Assumption 1. Under the more restrictive assumption, i.e., Assumption 2, we also introduce the upper conditional Rényi entropy for a transition matrix (cf. (26)). Then, we evaluate the upper conditional Rényi entropy for the Markov chain in terms of its transition matrix counterpart. This evaluation gives non-asymptotic bounds that are tighter than those obtained under Assumption 1.

We also derive converse bounds for every problem by using the change of measure argument developed by the authors in the accompanying paper on information geometry [26]. To derive converse bounds for the problems with side-information, we further introduce the two-parameter conditional Rényi entropy and its transition matrix counterpart (cf. (14) and (30)). This novel information measure includes the lower conditional Rényi entropy and the upper conditional Rényi entropy as special cases.

Here, we would like to remark on terminologies. There are a few ways to express exponential type bounds. In statistics and large deviation theory, we usually use the cumulant generating function (CGF) to describe exponents. In information theory, we use the Gallager function or the Rényi entropies. Although these three terminologies are essentially the same and are related by changes of variables, the CGF and the Gallager function are convenient for some calculations since they have good properties such as convexity. However, they are merely mathematical functions. On the other hand, the Rényi entropies are information measures that include Shannon's information measures as special cases. Thus, the Rényi entropies are intuitively familiar in the field of information theory. The Rényi entropies also have the advantage that the two types of bounds can be expressed in a unified manner. For these reasons, we state our main results in terms of the Rényi entropies, while we use the CGF and the Gallager function in the proofs.
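For concreteness, the changes of variables mentioned above can be spelled out as the following pair of identities (our addition, not part of the original text; both follow directly from the definitions in Section II below):

```latex
% CGF of the conditional self-information vs. the lower conditional Renyi entropy:
\Lambda(\theta) := \log \mathbb{E}\bigl[e^{\theta \log (1/P_{X|Y}(X|Y))}\bigr]
                 = \log \sum_{x,y} P_{XY}(x,y)^{1-\theta}\, P_Y(y)^{\theta}
                 = \theta\, H^{\downarrow}_{1-\theta}(X|Y).
% Gallager-type source function vs. the upper (Arimoto) conditional Renyi entropy:
E_s(\rho) := \log \sum_{y}\Bigl[\sum_{x} P_{XY}(x,y)^{\frac{1}{1+\rho}}\Bigr]^{1+\rho}
           = \rho\, H^{\uparrow}_{\frac{1}{1+\rho}}(X|Y).
```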

C. Main Contribution for Asymptotic Analysis

For the asymptotic analyses of the large deviation and the moderate deviation regimes, we derive the characterizations by using our non-asymptotic achievability and converse bounds, which implies that our non-asymptotic bounds are tight in the large deviation regime and the moderate deviation regime. We also derive the second order rate. It is also clarified that the reciprocal coefficient of the moderate deviation regime and the variance of the second order regime coincide. Furthermore, a single letter form of the variance is clarified.

Footnote 7: For the large deviation regime, we only derive the characterizations up to the critical rates.
Footnote 8: An alternative way to derive a single letter characterization of the variance for the Markov chain was shown in [27, Lemma 20]. It should also be noted that a single letter characterization can be derived by using the fundamental matrix [28]. The single letter characterization of the variance in [17, Section VII] and [2, Section III] has an error, which is corrected in this paper.

D. Organization of Paper

In Section II, we introduce the information measures that will be needed in later sections. Then, we consider the source coding with side-information and the uniform random number generation with side-information in Section III and Section IV respectively. Omitted results and proofs can be found in [29].

II. INFORMATION MEASURES

In this section, we introduce the information measures that will be used in later sections.

A. Information Measures for the Single-Shot Setting

In this section, we introduce conditional Rényi entropies for the single-shot setting. For a more detailed review of conditional Rényi entropies, see [23]. For a correlated random variable $(X, Y)$ on $\mathcal{X} \times \mathcal{Y}$ with probability distribution $P_{XY}$, and a marginal distribution $Q_Y$ on $\mathcal{Y}$, we introduce the conditional Rényi entropy of order $1+\theta$ relative to $Q_Y$ as

$H_{1+\theta}(P_{XY} \| Q_Y) := -\frac{1}{\theta} \log \sum_{x,y} P_{XY}(x,y)^{1+\theta} Q_Y(y)^{-\theta}$  (1)

for $\theta \in (-1, 0) \cup (0, \infty)$. The conditional Rényi entropy of order 1 relative to $Q_Y$ is defined by the limit with respect to $\theta \to 0$. One of the important special cases of $H_{1+\theta}(P_{XY} \| Q_Y)$ is the case $Q_Y = P_Y$. We shall call this special case the lower conditional Rényi entropy of order $1+\theta$ and denote

$H_{1+\theta}^{\downarrow}(X|Y) := H_{1+\theta}(P_{XY} \| P_Y)$  (2)
$= -\frac{1}{\theta} \log \sum_{x,y} P_{XY}(x,y)^{1+\theta} P_Y(y)^{-\theta}$.  (3)

The following property holds.

Lemma 1 We have

$\lim_{\theta \to 0} H_{1+\theta}^{\downarrow}(X|Y) = H(X|Y)$  (4)

and

$V(X|Y) := \mathrm{Var}\left[ \log \frac{1}{P_{X|Y}(X|Y)} \right]$  (5)
$= \lim_{\theta \to 0} \frac{2\left[ H(X|Y) - H_{1+\theta}^{\downarrow}(X|Y) \right]}{\theta}$.  (6)

Footnote 9: This notation was first introduced in [30].

The other important special case of $H_{1+\theta}(P_{XY} \| Q_Y)$ is the measure maximized over $Q_Y$. We shall call this special case the upper conditional Rényi entropy of order $1+\theta$ and denote

$H_{1+\theta}^{\uparrow}(X|Y)$  (7)
$:= \max_{Q_Y \in \mathcal{P}(\mathcal{Y})} H_{1+\theta}(P_{XY} \| Q_Y)$  (8)
$= H_{1+\theta}(P_{XY} \| P_Y^{(1+\theta)})$  (9)
$= -\frac{1+\theta}{\theta} \log \sum_y P_Y(y) \left[ \sum_x P_{X|Y}(x|y)^{1+\theta} \right]^{\frac{1}{1+\theta}}$,  (10)

where

$P_Y^{(1+\theta)}(y) := \frac{\left[ \sum_x P_{XY}(x,y)^{1+\theta} \right]^{\frac{1}{1+\theta}}}{\sum_{y'} \left[ \sum_x P_{XY}(x,y')^{1+\theta} \right]^{\frac{1}{1+\theta}}}$.  (11)

For this measure, we also have properties similar to Lemma 1.

Lemma 2 We have

$\lim_{\theta \to 0} H_{1+\theta}^{\uparrow}(X|Y) = H(X|Y)$  (12)

and

$\lim_{\theta \to 0} \frac{2\left[ H(X|Y) - H_{1+\theta}^{\uparrow}(X|Y) \right]}{\theta} = V(X|Y)$.  (13)

When we derive converse bounds on the source coding with side-information or the random number generation with side-information, we need to consider the case where the order of the Rényi entropy and the order of the conditioning distribution defined in (11) are different. For this purpose, we introduce the two-parameter conditional Rényi entropy:

$H_{1+\theta, 1+\theta'}(X|Y)$  (14)
$:= H_{1+\theta}(P_{XY} \| P_Y^{(1+\theta')})$  (15)
$= -\frac{1}{\theta} \log \sum_y P_Y(y) \left[ \sum_x P_{X|Y}(x|y)^{1+\theta'} \right]^{-\frac{\theta}{1+\theta'}} \left[ \sum_x P_{X|Y}(x|y)^{1+\theta} \right]$  (16)
$\quad + \frac{\theta'}{1+\theta'} H_{1+\theta'}^{\uparrow}(X|Y)$.  (17)

For $-1 < \theta < 0$, (9) can be proved by using the Hölder inequality, and, for $0 < \theta$, (9) can be proved by using the reverse Hölder inequality [31, Lemma 8].
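As a numerical companion to these definitions, the following sketch (our illustration, not part of the original development; plain NumPy, and a fully supported joint distribution is assumed) evaluates $H_{1+\theta}^{\downarrow}(X|Y)$ and $H_{1+\theta}^{\uparrow}(X|Y)$ and checks Lemmas 1 and 2 numerically:

```python
import numpy as np

def lower_cond_renyi(P_xy, theta):
    # H^{down}_{1+theta}(X|Y) = -(1/theta) log sum_{x,y} P(x,y)^{1+theta} P_Y(y)^{-theta}
    P_y = P_xy.sum(axis=0)
    return -np.log(np.sum(P_xy ** (1 + theta) * P_y[None, :] ** (-theta))) / theta

def upper_cond_renyi(P_xy, theta):
    # H^{up}_{1+theta}(X|Y) = -((1+theta)/theta) log sum_y [sum_x P(x,y)^{1+theta}]^{1/(1+theta)}
    inner = (P_xy ** (1 + theta)).sum(axis=0) ** (1.0 / (1 + theta))
    return -(1 + theta) / theta * np.log(inner.sum())

P = np.array([[0.5, 0.1],      # P_XY(x, y), full support assumed
              [0.1, 0.3]])

# Lemmas 1 and 2: both quantities tend to H(X|Y) as theta -> 0,
# and the lower entropy never exceeds the upper one.
for th in [1.0, 0.5, 0.01, 1e-4]:
    print(f"theta={th:6.4f}  lower={lower_cond_renyi(P, th):.6f}"
          f"  upper={upper_cond_renyi(P, th):.6f}")
```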

B. Information Measures for Transition Matrices

Let $\{W(x, y | x', y')\}_{((x,y),(x',y')) \in (\mathcal{X} \times \mathcal{Y})^2}$ be an ergodic and irreducible transition matrix. The purpose of this section is to introduce transition matrix counterparts of the measures in Section II-A. For this purpose, we first need to introduce some assumptions on transition matrices.

Assumption 1 (Non-Hidden) We say that a transition matrix $W$ is non-hidden if

$\sum_x W(x, y | x', y') = W(y | y')$  (18)

for every $x' \in \mathcal{X}$ and $y, y' \in \mathcal{Y}$, i.e., the left hand side of (18) does not depend on $x'$.

Assumption 2 (Strongly Non-Hidden) We say that a transition matrix $W$ is strongly non-hidden if, for every $\theta \in (-1, \infty)$ and $y, y' \in \mathcal{Y}$,

$W_\theta(y | y') := \sum_x W(x, y | x', y')^{1+\theta}$  (19)

is well defined, i.e., the right hand side of (19) is independent of $x'$.

Assumption 1 requires (19) to hold only for $\theta = 0$, and thus Assumption 2 implies Assumption 1. However, Assumption 2 is a strictly stronger condition than Assumption 1. For example, let us consider the case where the transition matrix has a product form, i.e., $W(x, y | x', y') = W(x | x') W(y | y')$. In this case, Assumption 1 is obviously satisfied. However, Assumption 2 is not satisfied in general.

First, we introduce the information measures under Assumption 1. In order to define a transition matrix counterpart of (2), let us introduce the following tilted matrix:

$W_\theta(x, y | x', y') := W(x, y | x', y')^{1+\theta} W(y | y')^{-\theta}$.  (20)

Let $\lambda_\theta$ be the Perron-Frobenius eigenvalue of $W_\theta$ and let $P_{\theta, XY}$ be its normalized eigenvector. Then, we define the lower conditional Rényi entropy for $W$ by

$H_{1+\theta}^{\downarrow, W}(X|Y) := -\frac{1}{\theta} \log \lambda_\theta$  (21)

for $\theta \in (-1, 0) \cup (0, \infty)$. For $\theta = 0$, we define the lower conditional Rényi entropy for $W$ by

$H^W(X|Y) = H_1^{\downarrow, W}(X|Y)$  (22)
$:= \lim_{\theta \to 0} H_{1+\theta}^{\downarrow, W}(X|Y)$,  (23)

and we just call it the conditional entropy for $W$. As a counterpart of (6), we also define

$V^W(X|Y) := \lim_{\theta \to 0} \frac{2\left[ H^W(X|Y) - H_{1+\theta}^{\downarrow, W}(X|Y) \right]}{\theta}$.  (24)

Next, we introduce the information measures under Assumption 2. In order to define a transition matrix counterpart of (7), let us introduce the following $|\mathcal{Y}| \times |\mathcal{Y}|$ matrix:

$K_\theta(y | y') := W_\theta(y | y')^{\frac{1}{1+\theta}}$,  (25)

where $W_\theta$ is defined by (19). Let $\kappa_\theta$ be the Perron-Frobenius eigenvalue of $K_\theta$. Then, we define the upper conditional Rényi entropy for $W$ by

$H_{1+\theta}^{\uparrow, W}(X|Y) := -\frac{1+\theta}{\theta} \log \kappa_\theta$  (26)

for $\theta \in (-1, 0) \cup (0, \infty)$. We have the following properties.

Lemma 3 We have

$\lim_{\theta \to 0} H_{1+\theta}^{\uparrow, W}(X|Y) = H^W(X|Y)$  (27)

and

$\lim_{\theta \to 0} \frac{2\left[ H^W(X|Y) - H_{1+\theta}^{\uparrow, W}(X|Y) \right]}{\theta} = V^W(X|Y)$.  (28)

Now, let us introduce a transition matrix counterpart of (14). For this purpose, we introduce the following $|\mathcal{Y}| \times |\mathcal{Y}|$ matrix:

$N_{\theta, \theta'}(y | y') := W_\theta(y | y') W_{\theta'}(y | y')^{-\frac{\theta}{1+\theta'}}$.  (29)

Let $\nu_{\theta, \theta'}$ be the Perron-Frobenius eigenvalue of $N_{\theta, \theta'}$. Then, we define the two-parameter conditional Rényi entropy for $W$ by

$H_{1+\theta, 1+\theta'}^W(X|Y) := -\frac{1}{\theta} \log \nu_{\theta, \theta'} + \frac{\theta'}{1+\theta'} H_{1+\theta'}^{\uparrow, W}(X|Y)$.  (30)

For the information measures introduced in this section, we have the following properties.

Lemma 4
1) The function $\theta \mapsto \theta H_{1+\theta}^{\downarrow, W}(X|Y)$ is a concave function of $\theta$, and it is strictly concave iff $V^W(X|Y) > 0$.
2) $H_{1+\theta}^{\downarrow, W}(X|Y)$ is a monotonically decreasing function of $\theta$.
3) The function $\theta \mapsto \theta H_{1+\theta}^{\uparrow, W}(X|Y)$ is a concave function of $\theta$, and it is strictly concave iff $V^W(X|Y) > 0$.
4) $H_{1+\theta}^{\uparrow, W}(X|Y)$ is a monotonically decreasing function of $\theta$.
5) For every $\theta \in (-1, 0) \cup (0, \infty)$, we have $H_{1+\theta}^{\downarrow, W}(X|Y) \le H_{1+\theta}^{\uparrow, W}(X|Y)$.
6) For fixed $\theta'$, the function $\theta \mapsto \theta H_{1+\theta, 1+\theta'}^W(X|Y)$ is a concave function of $\theta$, and it is strictly concave iff $V^W(X|Y) > 0$.
7) For fixed $\theta'$, $H_{1+\theta, 1+\theta'}^W(X|Y)$ is a monotonically decreasing function of $\theta$.
8) We have $H_{1+\theta, 1}^W(X|Y) = H_{1+\theta}^{\downarrow, W}(X|Y)$.  (31)
9) We have $H_{1+\theta, 1+\theta}^W(X|Y) = H_{1+\theta}^{\uparrow, W}(X|Y)$.  (32)
10) For every $\theta \in (-1, 0) \cup (0, \infty)$, $H_{1+\theta, 1+\theta'}^W(X|Y)$ is maximized at $\theta' = \theta$.
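A minimal sketch of how these transition-matrix quantities can be evaluated numerically (our illustration; the helper names are ours, and a strictly positive, irreducible $W$ stored as a 4-index array is assumed):

```python
import numpy as np

def marginal_transition(W):
    # Under Assumption 1, W(y|y') = sum_x W(x,y|x',y') does not depend on x';
    # we read it off at x' = 0.  W has shape (A, B, A, B): W[x, y, xp, yp].
    return W.sum(axis=0)[:, 0, :]                      # shape (B, B)

def pf_eigenvalue(M):
    # Perron-Frobenius eigenvalue of a nonnegative irreducible matrix:
    # it is real and equals the spectral radius, hence the max real part.
    return np.linalg.eigvals(M).real.max()

def lower_renyi_tm(W, theta):
    # H^{down,W}_{1+theta}(X|Y) = -(1/theta) log lambda_theta, eq. (21), where
    # lambda_theta is the PF eigenvalue of the tilted matrix (20).
    A, B = W.shape[0], W.shape[1]
    Wy = marginal_transition(W)
    Wt = W ** (1 + theta) * Wy[None, :, None, :] ** (-theta)
    return -np.log(pf_eigenvalue(Wt.reshape(A * B, A * B))) / theta

def entropy_rate_tm(W, eps=1e-7):
    # H^W(X|Y) as the theta -> 0 limit, eq. (23), taken numerically.
    return lower_renyi_tm(W, eps)

def variance_tm(W, eps=1e-4):
    # V^W(X|Y), eq. (24), via a crude finite difference.
    return 2 * (entropy_rate_tm(W) - lower_renyi_tm(W, eps)) / eps

# Example: W(x,y|x',y') = V(x|x',y') * Wy(y|y') satisfies Assumption 1.
Wy = np.array([[0.7, 0.4], [0.3, 0.6]])                # columns sum to 1
V = np.array([[[0.9, 0.3], [0.2, 0.6]],
              [[0.1, 0.7], [0.8, 0.4]]])               # V[x, x', y']
W = np.einsum('xuv,yv->xyuv', V, Wy)
print(lower_renyi_tm(W, 0.5), entropy_rate_tm(W), variance_tm(W))
```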

Next, we consider the extreme cases of $H_{1+\theta}^{\downarrow, W}(X|Y)$. Let $\lambda_{-1}$ be the Perron-Frobenius eigenvalue of the matrix

$1[W(x, y | x', y') > 0] \, W(y | y')$.  (33)

Then, we define

$H_0^{\downarrow, W}(X|Y) := \log \lambda_{-1}$.  (34)

On the other hand, let $G_W = (\mathcal{X} \times \mathcal{Y}, E_W)$ be the graph such that $((x', y'), (x, y)) \in E_W$ iff $W(x, y | x', y') > 0$. Then, for each $(x, y) \in \mathcal{X} \times \mathcal{Y}$, let $C_{(x,y)}$ be the set of all Hamiltonian cycles from $(x, y)$ to itself. Then, we define

$H_\infty^{\downarrow, W}(X|Y) := -\log \max_{(\bar{x}, \bar{y})} \max_{c \in C_{(\bar{x}, \bar{y})}} \left[ \prod_{((x', y'), (x, y)) \in c} \frac{W(x, y | x', y')}{W(y | y')} \right]^{1/|c|}$.  (35)

Lemma 5 We have

$\lim_{\theta \to -1} H_{1+\theta}^{\downarrow, W}(X|Y) = H_0^{\downarrow, W}(X|Y)$,  (37)
$\lim_{\theta \to \infty} H_{1+\theta}^{\downarrow, W}(X|Y) = H_\infty^{\downarrow, W}(X|Y)$.  (38)

Next, we consider the extreme cases of $H_{1+\theta}^{\uparrow, W}(X|Y)$. When $W$ satisfies Assumption 2, we note that

$S(y | y') := |\mathrm{supp}(W(\cdot, y | x', y'))|$,  (39)
$T(y | y') := \max_x W(x, y | x', y')$  (40)

are well defined, i.e., the right hand sides of (39) and (40) are independent of $x'$. Let $G_W = (\mathcal{Y}, E_W)$ be the graph such that $(y', y) \in E_W$ iff $W(y | y') > 0$. Then, for each $y \in \mathcal{Y}$, let $C_y$ be the set of all Hamiltonian cycles from $y$ to itself. Then, we define

$H_0^{\uparrow, W}(X|Y) := \log \max_{\bar{y} \in \mathcal{Y}} \max_{c \in C_{\bar{y}}} \left[ \prod_{(y', y) \in c} S(y | y') \right]^{1/|c|}$.  (41)

On the other hand, let $\kappa_\infty$ be the Perron-Frobenius eigenvalue of $1[W(y | y') > 0] \, T(y | y')$. Then, we define

$H_\infty^{\uparrow, W}(X|Y) := -\log \kappa_\infty$.  (42)

Lemma 6 We have

$\lim_{\theta \to -1} H_{1+\theta}^{\uparrow, W}(X|Y) = H_0^{\uparrow, W}(X|Y)$,  (43)
$\lim_{\theta \to \infty} H_{1+\theta}^{\uparrow, W}(X|Y) = H_\infty^{\uparrow, W}(X|Y)$.  (44)

From Statement 1 of Lemma 4, $\frac{d[\theta H_{1+\theta}^{\downarrow, W}(X|Y)]}{d\theta}$ is monotonically decreasing. Thus, we can define the inverse function $\theta(a)$ of $\frac{d[\theta H_{1+\theta}^{\downarrow, W}(X|Y)]}{d\theta}$ by

$\left. \frac{d[\theta H_{1+\theta}^{\downarrow, W}(X|Y)]}{d\theta} \right|_{\theta = \theta(a)} = a$  (45)

for $a_{\min} < a \le a_{\max}$, where $a_{\min} := \lim_{\theta \to \infty} \frac{d[\theta H_{1+\theta}^{\downarrow, W}(X|Y)]}{d\theta}$ and $a_{\max} := \lim_{\theta \to -1} \frac{d[\theta H_{1+\theta}^{\downarrow, W}(X|Y)]}{d\theta}$. Let

$R(a) := (1 + \theta(a)) a - \theta(a) H_{1+\theta(a)}^{\downarrow, W}(X|Y)$.  (46)

Since

$R'(a) = 1 + \theta(a)$,  (47)

$R(a)$ is a monotonically increasing function of $a$ for $a_{\min} < a < a_{\max}$. Thus, we can define the inverse function $a(R)$ of $R(a)$ by

$(1 + \theta(a(R))) a(R) - \theta(a(R)) H_{1+\theta(a(R))}^{\downarrow, W}(X|Y) = R$  (48)

for $R(a_{\min}) < R < H_0^{\downarrow, W}(X|Y)$. For $H_{1+\theta}^{\uparrow, W}(X|Y)$, for the same reason, we can define the inverse function $\bar{\theta}(a)$ by

$\left. \frac{d[\theta H_{1+\theta}^{\uparrow, W}(X|Y)]}{d\theta} \right|_{\theta = \bar{\theta}(a)} = a$,  (49)

and the inverse function $\bar{a}(R)$ of

$\bar{R}(a) := (1 + \bar{\theta}(a)) a - \bar{\theta}(a) H_{1+\bar{\theta}(a)}^{\uparrow, W}(X|Y)$  (50)

by

$(1 + \bar{\theta}(\bar{a}(R))) \bar{a}(R) - \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y) = R$  (51)

for $\bar{R}(a_{\min}) < R < H_0^{\uparrow, W}(X|Y)$.

C. Information Measures for the Markov Chain

Let $(\mathbf{X}, \mathbf{Y})$ be the Markov chain induced by the transition matrix $W$ and some initial distribution $P_{X_1 Y_1}$. Now, we show how the information measures introduced in Section II-B are related to the conditional Rényi entropy rates. First, we introduce the following lemma, which gives finite upper and lower bounds on the lower conditional Rényi entropy.

Lemma 7 Suppose that the transition matrix $W$ satisfies Assumption 1. Let $v_\theta$ be the eigenvector of $W_\theta^T$ with respect to the Perron-Frobenius eigenvalue $\lambda_\theta$, normalized so that $\min_{x,y} v_\theta(x, y) = 1$. Let $w_\theta(x, y) := P_{X_1 Y_1}(x, y)^{1+\theta} P_{Y_1}(y)^{-\theta}$. Then, we have

$(n-1) H_{1+\theta}^{\downarrow, W}(X|Y) + \underline{\delta}(\theta) \le H_{1+\theta}^{\downarrow}(X^n | Y^n) \le (n-1) H_{1+\theta}^{\downarrow, W}(X|Y) + \bar{\delta}(\theta)$,  (52)-(54)

where

$\bar{\delta}(\theta) := -\frac{1}{\theta} \log \langle v_\theta, w_\theta \rangle + \frac{1}{\theta} \log \max_{x,y} v_\theta(x, y)$,  (55)
$\underline{\delta}(\theta) := -\frac{1}{\theta} \log \langle v_\theta, w_\theta \rangle$.  (56)

From Lemma 7, we have the following.

Theorem 1 Suppose that the transition matrix $W$ satisfies Assumption 1. For any initial distribution, we have

$\lim_{n \to \infty} \frac{1}{n} H_{1+\theta}^{\downarrow}(X^n | Y^n) = H_{1+\theta}^{\downarrow, W}(X|Y)$,  (57)
$\lim_{n \to \infty} \frac{1}{n} H(X^n | Y^n) = H^W(X|Y)$,  (58)
$\lim_{n \to \infty} \frac{1}{n} H_0^{\downarrow}(X^n | Y^n) = H_0^{\downarrow, W}(X|Y)$,  (59)
$\lim_{n \to \infty} \frac{1}{n} H_\infty^{\downarrow}(X^n | Y^n) = H_\infty^{\downarrow, W}(X|Y)$.  (60)
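Lemma 7 and Theorem 1 can be checked numerically: under Assumption 1 the sum defining $H_{1+\theta}^{\downarrow}(X^n|Y^n)$ telescopes into a product of tilted matrices, so the exact finite-$n$ entropy reduces to a matrix power. A sketch (ours; it reuses marginal_transition, lower_renyi_tm, and the example W from the previous snippet, and assumes an initial distribution P1 with full support):

```python
def block_lower_renyi(W, P1, theta, n):
    # Exact H^{down}_{1+theta}(X^n|Y^n) for the Markov chain: under Assumption 1,
    #   sum_{x^n,y^n} P(x^n,y^n)^{1+theta} P(y^n)^{-theta} = <1, W_theta^{n-1} w_theta>,
    # with w_theta(x,y) = P1(x,y)^{1+theta} * P1_Y(y)^{-theta}, as in Lemma 7.
    A, B = W.shape[0], W.shape[1]
    Wy = marginal_transition(W)
    Wt = (W ** (1 + theta) * Wy[None, :, None, :] ** (-theta)).reshape(A * B, A * B)
    P1_Y = P1.sum(axis=0)
    w = (P1 ** (1 + theta) * P1_Y[None, :] ** (-theta)).reshape(A * B)
    s = np.ones(A * B) @ np.linalg.matrix_power(Wt, n - 1) @ w
    return -np.log(s) / theta

P1 = np.full((2, 2), 0.25)
rate = lower_renyi_tm(W, 0.5)
for n in [2, 8, 32, 128]:
    # (1/n) H^{down}_{1.5}(X^n|Y^n) approaches H^{down,W}_{1.5}(X|Y) (Theorem 1),
    # while the gap from (n-1)*rate stays bounded (Lemma 7).
    print(n, block_lower_renyi(W, P1, 0.5, n) / n, rate)
```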

We also have the following asymptotic evaluation of the variance.

Theorem 2 Suppose that the transition matrix $W$ satisfies Assumption 1. For any initial distribution, we have

$\lim_{n \to \infty} \frac{1}{n} V(X^n | Y^n) = V^W(X|Y)$.  (61)

Theorem 2 is practically important since the limit of the variance can be described by a single letter characterized quantity. A method to calculate $V^W(X|Y)$ can be found in [32].

Next, we show the lemma that gives finite upper and lower bounds on the upper conditional Rényi entropy in terms of the upper conditional Rényi entropy for the transition matrix.

Lemma 8 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $v_\theta$ be the eigenvector of $K_\theta^T$ with respect to the Perron-Frobenius eigenvalue $\kappa_\theta$, normalized so that $\min_y v_\theta(y) = 1$. Let $w_\theta$ be the $|\mathcal{Y}|$-dimensional vector defined by

$w_\theta(y) := \left[ \sum_x P_{X_1 Y_1}(x, y)^{1+\theta} \right]^{\frac{1}{1+\theta}}$.  (62)

Then, we have

$(n-1) H_{1+\theta}^{\uparrow, W}(X|Y) + \underline{\xi}(\theta) \le H_{1+\theta}^{\uparrow}(X^n | Y^n) \le (n-1) H_{1+\theta}^{\uparrow, W}(X|Y) + \bar{\xi}(\theta)$,  (63)-(65)

where

$\bar{\xi}(\theta) := -\frac{1+\theta}{\theta} \log \langle v_\theta, w_\theta \rangle + \frac{1+\theta}{\theta} \log \max_y v_\theta(y)$,  (66)
$\underline{\xi}(\theta) := -\frac{1+\theta}{\theta} \log \langle v_\theta, w_\theta \rangle$.  (67)

From Lemma 8, we have the following.

Theorem 3 Suppose that the transition matrix $W$ satisfies Assumption 2. For any initial distribution, we have

$\lim_{n \to \infty} \frac{1}{n} H_{1+\theta}^{\uparrow}(X^n | Y^n) = H_{1+\theta}^{\uparrow, W}(X|Y)$,  (68)
$\lim_{n \to \infty} \frac{1}{n} H_0^{\uparrow}(X^n | Y^n) = H_0^{\uparrow, W}(X|Y)$,  (69)
$\lim_{n \to \infty} \frac{1}{n} H_\infty^{\uparrow}(X^n | Y^n) = H_\infty^{\uparrow, W}(X|Y)$.  (70)

Finally, we show the lemma that gives finite upper and lower bounds on the two-parameter conditional Rényi entropy in terms of the two-parameter conditional Rényi entropy for the transition matrix.

Lemma 9 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $v_{\theta, \theta'}$ be the eigenvector of $N_{\theta, \theta'}^T$ with respect to the Perron-Frobenius eigenvalue $\nu_{\theta, \theta'}$, normalized so that $\min_y v_{\theta, \theta'}(y) = 1$. Let $w_{\theta, \theta'}$ be the $|\mathcal{Y}|$-dimensional vector defined by

$w_{\theta, \theta'}(y) := \left[ \sum_x P_{X_1 Y_1}(x, y)^{1+\theta} \right] \left[ \sum_x P_{X_1 Y_1}(x, y)^{1+\theta'} \right]^{-\frac{\theta}{1+\theta'}}$.  (71)-(72)

Then, we have

$(n-1) H_{1+\theta, 1+\theta'}^W(X|Y) + \underline{\zeta}(\theta, \theta') \le H_{1+\theta, 1+\theta'}(X^n | Y^n) \le (n-1) H_{1+\theta, 1+\theta'}^W(X|Y) + \bar{\zeta}(\theta, \theta')$,  (73)-(75)

where

$\bar{\zeta}(\theta, \theta') := -\frac{1}{\theta} \log \langle v_{\theta, \theta'}, w_{\theta, \theta'} \rangle + \frac{1}{\theta} \log \max_y v_{\theta, \theta'}(y) + \bar{\xi}(\theta')$,
$\underline{\zeta}(\theta, \theta') := -\frac{1}{\theta} \log \langle v_{\theta, \theta'}, w_{\theta, \theta'} \rangle + \underline{\xi}(\theta')$

for $\theta > 0$, and

$\bar{\zeta}(\theta, \theta') := -\frac{1}{\theta} \log \langle v_{\theta, \theta'}, w_{\theta, \theta'} \rangle + \frac{1}{\theta} \log \max_y v_{\theta, \theta'}(y) + \underline{\xi}(\theta')$,
$\underline{\zeta}(\theta, \theta') := -\frac{1}{\theta} \log \langle v_{\theta, \theta'}, w_{\theta, \theta'} \rangle + \bar{\xi}(\theta')$

for $\theta < 0$.

From Lemma 9, we have the following.

Theorem 4 Suppose that the transition matrix $W$ satisfies Assumption 2. For any initial distribution, we have

$\lim_{n \to \infty} \frac{1}{n} H_{1+\theta, 1+\theta'}(X^n | Y^n) = H_{1+\theta, 1+\theta'}^W(X|Y)$.  (76)

III. SOURCE CODING WITH FULL SIDE-INFORMATION

A. Problem Formulation

A code $\Psi = (e, d)$ consists of an encoder $e : \mathcal{X} \to \{1, \ldots, M\}$ and a decoder $d : \{1, \ldots, M\} \times \mathcal{Y} \to \mathcal{X}$. The decoding error probability is defined by

$P_e(\Psi) := \Pr\{ X \neq d(e(X), Y) \}$.  (77)

For notational convenience, we introduce the infimum of the error probabilities under the condition that the message size is $M$:

$P_e(M) := \inf_\Psi P_e(\Psi)$.  (78)

When we construct a source code, we often use a two-universal hash family $\mathcal{F}$ and a random function $F$ on $\mathcal{F}$. Then, we bound the error probability $P_e(\Psi(F))$ averaged over the random function by only using the property of two-universality. For this reason, it is convenient to introduce the quantity

$\bar{P}_e(M) := \sup_{\mathcal{F}} \mathbb{E}[ P_e(\Psi(F)) ]$,  (79)

where the supremum is taken over all two-universal hash families from $\mathcal{X}$ to $\{1, \ldots, M\}$. From the definition, we obviously have $P_e(M) \le \bar{P}_e(M)$.

Footnote: A family of functions is said to be two-universal if $\Pr\{F(x) = F(\bar{x})\} \le \frac{1}{M}$ for any distinct $x$ and $\bar{x}$ [33].
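For concreteness, one standard two-universal family (our illustrative choice; any family with the collision property works) consists of random linear maps over GF(2): for a uniformly random binary matrix $A$, distinct inputs collide with probability exactly $2^{-m} \le 1/M$, in line with the Wegman-Carter construction [33]. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_hash(k, m):
    # F(x) = A x over GF(2), with A uniform on {0,1}^{m x k}.  For distinct x, x',
    # each row of A hits <a, x - x'> = 0 with probability 1/2 independently, so
    # Pr_A[A x = A x'] = 2^{-m}: a two-universal family with M = 2^m outputs.
    A = rng.integers(0, 2, size=(m, k))
    return lambda x: tuple(A @ x % 2)

k, m, trials = 12, 5, 50000
x = rng.integers(0, 2, size=k)
x2 = x.copy()
x2[0] ^= 1                                   # a distinct input
collisions = 0
for _ in range(trials):
    f = draw_hash(k, m)                      # fresh random member of the family
    collisions += f(x) == f(x2)
print(collisions / trials, "vs bound", 2.0 ** (-m))
```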

When we consider the $n$-fold extension, the source code and related quantities are denoted with the subscript $n$. Instead of evaluating the error probability $P_e(M_n)$ (or $\bar{P}_e(M_n)$) for a given $M_n$, we are also interested in evaluating, for a given $0 \le \varepsilon < 1$,

$M(n, \varepsilon) := \inf\{ M_n : P_e(M_n) \le \varepsilon \}$,  (80)
$\bar{M}(n, \varepsilon) := \inf\{ M_n : \bar{P}_e(M_n) \le \varepsilon \}$.  (81)

B. Finite Bounds for the Markov Source

Under Assumption 1, we have the following achievability and converse bounds.

Theorem 5 Suppose that the transition matrix $W$ satisfies Assumption 1. Let $R := \frac{1}{n} \log M_n$. Then we have

$\log \bar{P}_e(M_n) \le \sup_{0 \le \theta \le 1} \theta \left[ -nR + (n-1) H_{1-\theta}^{\downarrow, W}(X|Y) + \bar{\delta}(-\theta) \right]$.

Theorem 6 Suppose that the transition matrix $W$ satisfies Assumption 1. Let $R := \frac{1}{n} \log M_n$. For any $H^W(X|Y) < R < H_0^{\downarrow, W}(X|Y)$, we have

$\log P_e(M_n) \ge -\inf_{s > 0, \, -1 < \theta < \theta(a(R))} \left[ \frac{(1+s)\theta}{s} (n-1) \left( H_{1+\theta}^{\downarrow, W}(X|Y) - H_{1+(1+s)\theta}^{\downarrow, W}(X|Y) \right) + \delta_1 - \frac{1+s}{s} \log\left( 1 - \left[ 2 e^{-(n-1) E(R, \theta) + \delta_2} \right]^{1/s} \right) \right]$,

where

$E(R, \theta) := (\theta - \theta(a(R))) a(R) + \theta(a(R)) H_{1+\theta(a(R))}^{\downarrow, W}(X|Y) - \theta H_{1+\theta}^{\downarrow, W}(X|Y)$,

$\theta(a)$ and $a(R)$ are the inverse functions defined by (45) and (48) respectively, and

$\delta_1 := (1+s) \underline{\delta}(\theta) - \bar{\delta}((1+s)\theta)$,
$\delta_2 := \frac{(\theta(a(R)) - \theta) R + (1+\theta) \bar{\delta}(\theta(a(R))) + (1+\theta(a(R))) \bar{\delta}(\theta)}{1 + \theta(a(R))}$.

Next, we derive tighter achievability and converse bounds under Assumption 2.

Theorem 7 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $R := \frac{1}{n} \log M_n$. Then we have

$\log \bar{P}_e(M_n) \le \sup_{0 \le \theta \le 1} \theta \left[ -nR + (n-1) H_{\frac{1}{1+\theta}}^{\uparrow, W}(X|Y) + \bar{\xi}\!\left( -\tfrac{\theta}{1+\theta} \right) \right]$.

Theorem 8 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $R := \frac{1}{n} \log M_n$. For any $H^W(X|Y) < R < H_0^{\uparrow, W}(X|Y)$, we have

$\log P_e(M_n) \ge -\inf_{s > 0, \, -1 < \theta < \bar{\theta}(\bar{a}(R))} \left[ \frac{(1+s)\theta}{s} (n-1) \left( H_{1+\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y) - H_{1+(1+s)\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y) \right) + \delta_1 - \frac{1+s}{s} \log\left( 1 - \left[ 2 e^{-(n-1) E(R, \theta) + \delta_2} \right]^{1/s} \right) \right]$,

where

$E(R, \theta) := (\theta - \bar{\theta}(\bar{a}(R))) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y) - \theta H_{1+\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y)$,

$\bar{\theta}(a)$ and $\bar{a}(R)$ are the inverse functions defined by (49) and (51) respectively, and

$\delta_1 := (1+s) \underline{\zeta}(\theta, \bar{\theta}(\bar{a}(R))) - \bar{\zeta}((1+s)\theta, \bar{\theta}(\bar{a}(R)))$,
$\delta_2 := \frac{(\bar{\theta}(\bar{a}(R)) - \theta) R + (1+\theta) \bar{\zeta}(\bar{\theta}(\bar{a}(R)), \bar{\theta}(\bar{a}(R))) + (1+\bar{\theta}(\bar{a}(R))) \bar{\zeta}(\theta, \bar{\theta}(\bar{a}(R)))}{1 + \bar{\theta}(\bar{a}(R))}$.

C. Large Deviation

From Theorem 5 and Theorem 6, we have the following.

Theorem 9 Suppose that the transition matrix $W$ satisfies Assumption 1. For $H^W(X|Y) < R$, we have

$\liminf_{n \to \infty} -\frac{1}{n} \log \bar{P}_e(e^{nR}) \ge \sup_{0 \le \theta \le 1} \theta \left[ R - H_{1-\theta}^{\downarrow, W}(X|Y) \right]$.

On the other hand, for $H^W(X|Y) < R < H_0^{\downarrow, W}(X|Y)$, we have

$\limsup_{n \to \infty} -\frac{1}{n} \log P_e(e^{nR}) \le -\theta(a(R)) a(R) + \theta(a(R)) H_{1+\theta(a(R))}^{\downarrow, W}(X|Y)$.
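As a rough numerical illustration of the achievability exponent in Theorem 9 (in the form reconstructed above; the grid search and helper names are ours, building on lower_renyi_tm and entropy_rate_tm from the sketches in Section II-B):

```python
import numpy as np

def achievability_exponent(W, R, grid=2000):
    # sup_{0 < theta <= 1} theta * (R - H^{down,W}_{1-theta}(X|Y)),
    # evaluated by brute-force grid search; it becomes positive once the
    # rate R exceeds the conditional entropy rate H^W(X|Y).
    thetas = np.linspace(1e-4, 1.0 - 1e-4, grid)   # keep the order 1-theta in (0,1)
    return max(th * (R - lower_renyi_tm(W, -th)) for th in thetas)

H = entropy_rate_tm(W)
for R in [H + 0.02, H + 0.05, H + 0.1]:
    print(R, achievability_exponent(W, R))
```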

Under Assumption 2, from Theorem 7 and Theorem 8, we have the following tighter bounds.

Theorem 10 Suppose that the transition matrix $W$ satisfies Assumption 2. For $H^W(X|Y) < R$, we have

$\liminf_{n \to \infty} -\frac{1}{n} \log \bar{P}_e(e^{nR}) \ge \sup_{0 \le \theta \le 1} \theta \left[ R - H_{\frac{1}{1+\theta}}^{\uparrow, W}(X|Y) \right]$.

On the other hand, for $H^W(X|Y) < R < H_0^{\uparrow, W}(X|Y)$, we have

$\limsup_{n \to \infty} -\frac{1}{n} \log P_e(e^{nR}) \le -\bar{\theta}(\bar{a}(R)) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y)$.

Remark 1 For $R \le R_{cr}$ (cf. (50) for the definition of $\bar{R}(a)$), where

$R_{cr} := \bar{R}\!\left( \left. \frac{d[\theta H_{1+\theta}^{\uparrow, W}(X|Y)]}{d\theta} \right|_{\theta = -\frac{1}{2}} \right)$  (82)

is the critical rate, we can rewrite the lower bound in Theorem 10 as

$\sup_{0 \le \theta \le 1} \theta \left[ R - H_{\frac{1}{1+\theta}}^{\uparrow, W}(X|Y) \right] = -\bar{\theta}(\bar{a}(R)) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y)$.

Thus, the lower bound and the upper bound coincide up to the critical rate.

D. Moderate Deviation

From Theorem 5 and Theorem 6, we have the following.

Theorem 11 Suppose that the transition matrix $W$ satisfies Assumption 1. For arbitrary $t \in (0, 1/2)$ and $\delta > 0$, we have

$\lim_{n \to \infty} -\frac{1}{n^{1-2t}} \log P_e\!\left( e^{n H^W(X|Y) + n^{1-t} \delta} \right) = \lim_{n \to \infty} -\frac{1}{n^{1-2t}} \log \bar{P}_e\!\left( e^{n H^W(X|Y) + n^{1-t} \delta} \right) = \frac{\delta^2}{2 V^W(X|Y)}$.

E. Second Order

By applying the central limit theorem to information spectrum type bounds, and by using Theorem 2, we have the following.

Theorem 12 Suppose that the transition matrix $W$ satisfies Assumption 1. For arbitrary $\varepsilon \in (0, 1)$, we have

$\lim_{n \to \infty} \frac{\log M(n, \varepsilon) - n H^W(X|Y)}{\sqrt{n}} = \lim_{n \to \infty} \frac{\log \bar{M}(n, \varepsilon) - n H^W(X|Y)}{\sqrt{n}} = \sqrt{V^W(X|Y)} \, \Phi^{-1}(1 - \varepsilon)$,

where $\Phi$ denotes the cumulative distribution function of the standard Gaussian.

IV. UNIFORM RANDOM NUMBER GENERATION WITH SIDE-INFORMATION

In this section, we consider the uniform random number generation, which is also known as the privacy amplification.

A. Problem Formulation

The privacy amplification is conducted by a function $f : \mathcal{X} \to \{1, \ldots, M\}$. The security of the generated key is evaluated by

$\Delta(f) := \frac{1}{2} \left\| P_{f(X) Y} - P_{\bar{U}} \times P_Y \right\|_1$,  (83)

where $\bar{U}$ is the uniform random variable on $\{1, \ldots, M\}$ and $\|\cdot\|_1$ is the variational distance. For notational convenience, we introduce the infimum of the security criterion under the condition that the range size is $M$:

$\Delta(M) := \inf_f \Delta(f)$.  (84)

When we construct a function for the privacy amplification, we often use a two-universal hash family $\mathcal{F}$ and a random function $F$ on $\mathcal{F}$. Then, we bound the security criterion averaged over the random function by only using the property of two-universality. For this reason, it is convenient to introduce the quantity

$\bar{\Delta}(M) := \sup_{\mathcal{F}} \mathbb{E}[ \Delta(F) ]$,  (85)

where the supremum is taken over all two-universal hash families from $\mathcal{X}$ to $\{1, \ldots, M\}$. From the definition, we obviously have

$\Delta(M) \le \bar{\Delta}(M)$.  (86)

When we consider the $n$-fold extension, the security criteria are denoted by $\Delta(M_n)$ and $\bar{\Delta}(M_n)$. As in Section III, we also introduce the quantities $M(n, \varepsilon)$ and $\bar{M}(n, \varepsilon)$.

B. Finite Bounds for the Markov Source

Under Assumption 1, we have the following achievability and converse bounds.

Theorem 13 Suppose that the transition matrix $W$ satisfies Assumption 1. Let $R := \frac{1}{n} \log M_n$. Then we have

$\log \bar{\Delta}(M_n) \le -\sup_{0 \le \theta \le 1} \frac{\theta}{1+\theta} \left[ (n-1) H_{1+\theta}^{\downarrow, W}(X|Y) + \underline{\delta}(\theta) - nR \right] + \log \frac{3}{2}$.
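The second-order statements (Theorem 12 above, and Theorem 20 below) give a simple normal approximation for finite $n$. A sketch (ours; it assumes the sign conventions used in our reconstruction, with $\Phi$ the standard normal CDF, and reuses entropy_rate_tm and variance_tm from the Section II-B snippet):

```python
import numpy as np
from scipy.stats import norm

def source_coding_log_size(n, H, V, eps):
    # Theorem 12: log M(n, eps) ~ n*H + sqrt(n*V) * Phi^{-1}(1 - eps)
    #                           = n*H - sqrt(n*V) * Phi^{-1}(eps)
    return n * H - np.sqrt(n * V) * norm.ppf(eps)

def key_log_length(n, H, V, eps):
    # Theorem 20: log M(n, eps) ~ n*H + sqrt(n*V) * Phi^{-1}(eps)
    return n * H + np.sqrt(n * V) * norm.ppf(eps)

H, V = entropy_rate_tm(W), variance_tm(W)
for n in [100, 1000, 10000]:
    # per-symbol compression rate (source coding) and key rate (privacy
    # amplification) at error / security level 1e-3
    print(n, source_coding_log_size(n, H, V, 1e-3) / n, key_log_length(n, H, V, 1e-3) / n)
```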

Theorem 14 Suppose that the transition matrix $W$ satisfies Assumption 1. Let $R := \frac{1}{n} \log \frac{M_n}{2}$. For any $a_{\min} < R < H^W(X|Y)$, we have

$\log \Delta(M_n) \ge -\inf_{s > 0, \, \theta > \theta(R)} \left[ \frac{(1+s)\theta}{s} (n-1) \left( H_{1+\theta}^{\downarrow, W}(X|Y) - H_{1+(1+s)\theta}^{\downarrow, W}(X|Y) \right) + \delta_1 - \frac{1+s}{s} \log\left( 1 - \left[ e^{-(n-1) E(R, \theta) + \delta_2} \right]^{1/s} \right) \right]$,

where

$E(R, \theta) := (\theta - \theta(R)) R + \theta(R) H_{1+\theta(R)}^{\downarrow, W}(X|Y) - \theta H_{1+\theta}^{\downarrow, W}(X|Y)$,

$\theta(a)$ is the inverse function defined by (45) (here evaluated at $a = R$), and

$\delta_1 := (1+s) \underline{\delta}(\theta) - \bar{\delta}((1+s)\theta)$,
$\delta_2 := (\theta(R) - \theta) R + \bar{\delta}(\theta(R)) + \bar{\delta}(\theta)$.

Next, we derive tighter achievability and converse bounds under Assumption 2.

Theorem 15 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $R := \frac{1}{n} \log M_n$. Then we have

$\log \bar{\Delta}(M_n) \le -\sup_{0 \le \theta \le 1} \frac{\theta}{1+\theta} \left[ (n-1) H_{1+\theta}^{\uparrow, W}(X|Y) + \underline{\xi}(\theta) - nR \right] + \log \frac{3}{2}$.

Theorem 16 Suppose that the transition matrix $W$ satisfies Assumption 2. Let $R$ be such that

$(n-1) R + \left\{ (1 + \bar{\theta}(\bar{a}(R))) \bar{a}(R) - \bar{\xi}(\bar{\theta}(\bar{a}(R))) \right\} = \log \frac{M_n}{2}$.

If $\bar{R}(a_{\min}) < R < H^W(X|Y)$, then we have

$\log \Delta(M_n) \ge -\inf_{s > 0, \, \theta > \bar{\theta}(\bar{a}(R))} \left[ \frac{(1+s)\theta}{s} (n-1) \left( H_{1+\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y) - H_{1+(1+s)\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y) \right) + \delta_1 - \frac{1+s}{s} \log\left( 1 - \left[ e^{-(n-1) E(R, \theta) + \delta_2} \right]^{1/s} \right) \right] + \log 2$,

where

$E(R, \theta) := (\theta - \bar{\theta}(\bar{a}(R))) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y) - \theta H_{1+\theta, 1+\bar{\theta}(\bar{a}(R))}^W(X|Y)$,

$\bar{\theta}(a)$ and $\bar{a}(R)$ are the inverse functions defined by (49) and (51) respectively, and

$\delta_1 := (1+s) \underline{\zeta}(\theta, \bar{\theta}(\bar{a}(R))) - \bar{\zeta}((1+s)\theta, \bar{\theta}(\bar{a}(R)))$,
$\delta_2 := (\bar{\theta}(\bar{a}(R)) - \theta) \bar{a}(R) + \bar{\zeta}(\bar{\theta}(\bar{a}(R)), \bar{\theta}(\bar{a}(R))) + \bar{\zeta}(\theta, \bar{\theta}(\bar{a}(R)))$.

C. Large Deviation

From Theorem 13 and Theorem 14, we have the following.

Theorem 17 Suppose that the transition matrix $W$ satisfies Assumption 1. For $R < H^W(X|Y)$, we have

$\liminf_{n \to \infty} -\frac{1}{n} \log \bar{\Delta}(e^{nR}) \ge \sup_{0 \le \theta \le 1} \frac{\theta}{1+\theta} \left[ H_{1+\theta}^{\downarrow, W}(X|Y) - R \right]$.

On the other hand, for $a_{\min} < R < H^W(X|Y)$, we have

$\limsup_{n \to \infty} -\frac{1}{n} \log \Delta(e^{nR}) \le -\theta(R) R + \theta(R) H_{1+\theta(R)}^{\downarrow, W}(X|Y)$.

Under Assumption 2, from Theorem 15 and Theorem 16, we have the following tighter bounds.

Theorem 18 Suppose that the transition matrix $W$ satisfies Assumption 2. For $R < H^W(X|Y)$, we have

$\liminf_{n \to \infty} -\frac{1}{n} \log \bar{\Delta}(e^{nR}) \ge \sup_{0 \le \theta \le 1} \frac{\theta}{1+\theta} \left[ H_{1+\theta}^{\uparrow, W}(X|Y) - R \right]$.

On the other hand, for $\bar{R}(a_{\min}) < R < H^W(X|Y)$, we have

$\limsup_{n \to \infty} -\frac{1}{n} \log \Delta(e^{nR}) \le -\bar{\theta}(\bar{a}(R)) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y)$.

Remark 2 For $R_{cr} \le R$ (cf. (50) for the definition of $\bar{R}(a)$), where

$R_{cr} := \bar{R}\!\left( \left. \frac{d[\theta H_{1+\theta}^{\uparrow, W}(X|Y)]}{d\theta} \right|_{\theta = 1} \right)$

is the critical rate, we can rewrite the lower bound in Theorem 18 as

$\sup_{0 \le \theta \le 1} \frac{\theta}{1+\theta} \left[ H_{1+\theta}^{\uparrow, W}(X|Y) - R \right] = -\bar{\theta}(\bar{a}(R)) \bar{a}(R) + \bar{\theta}(\bar{a}(R)) H_{1+\bar{\theta}(\bar{a}(R))}^{\uparrow, W}(X|Y)$.

Thus, the lower bound and the upper bound coincide up to the critical rate.

D. Moderate Deviation

From Theorem 13 and Theorem 14, we have the following.

Theorem 19 Suppose that the transition matrix $W$ satisfies Assumption 1. For arbitrary $t \in (0, 1/2)$ and $\delta > 0$, we have

$\lim_{n \to \infty} -\frac{1}{n^{1-2t}} \log \Delta\!\left( e^{n H^W(X|Y) - n^{1-t} \delta} \right) = \lim_{n \to \infty} -\frac{1}{n^{1-2t}} \log \bar{\Delta}\!\left( e^{n H^W(X|Y) - n^{1-t} \delta} \right) = \frac{\delta^2}{2 V^W(X|Y)}$.

E. Second Order

By applying the central limit theorem to information spectrum bounds, and by using Theorem 2, we have the following.

Theorem 20 Suppose that the transition matrix $W$ satisfies Assumption 1. For arbitrary $\varepsilon \in (0, 1)$, we have

$\lim_{n \to \infty} \frac{\log M(n, \varepsilon) - n H^W(X|Y)}{\sqrt{n}} = \lim_{n \to \infty} \frac{\log \bar{M}(n, \varepsilon) - n H^W(X|Y)}{\sqrt{n}} = \sqrt{V^W(X|Y)} \, \Phi^{-1}(\varepsilon)$.

ACKNOWLEDGMENT

MH is partially supported by a MEXT Grant-in-Aid for Scientific Research (A). He is also partially supported by the National Institute of Information and Communications Technology (NICT), Japan. The Centre for Quantum Technologies is funded by the Singapore Ministry of Education and the National Research Foundation as part of the Research Centres of Excellence programme.

REFERENCES

[1] Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Trans. Inform. Theory, vol. 56, no. 5, pp. 2307-2359, May 2010.
[2] M. Hayashi, "Information spectrum approach to second-order coding rate in channel coding," IEEE Trans. Inform. Theory, vol. 55, no. 11, pp. 4947-4966, November 2009.
[3] C. H. Bennett, G. Brassard, and J. M. Robert, "Privacy amplification by public discussion," SIAM Journal on Computing, vol. 17, no. 2, pp. 210-229, April 1988.
[4] C. H. Bennett, G. Brassard, C. Crépeau, and U. Maurer, "Generalized privacy amplification," IEEE Trans. Inform. Theory, vol. 41, no. 6, pp. 1915-1923, November 1995.
[5] S. Verdú and T. S. Han, "A general formula for channel capacity," IEEE Trans. Inform. Theory, vol. 40, no. 4, pp. 1147-1157, July 1994.
[6] T. S. Han, Information-Spectrum Methods in Information Theory. Springer, 2003.
[7] M. Hayashi and H. Nagaoka, "General formulas for capacity of classical-quantum channels," IEEE Trans. Inform. Theory, vol. 49, no. 7, pp. 1753-1768, July 2003.
[8] L. Wang and R. Renner, "One-shot classical-quantum capacity and hypothesis testing," Phys. Rev. Lett., vol. 108, no. 20, p. 200501, May 2012.
[9] R. G. Gallager, "A simple derivation of the coding theorem and some applications," IEEE Trans. Inform. Theory, vol. 11, no. 1, pp. 3-18, January 1965.
[10] Y. Polyanskiy, "Channel coding: Non-asymptotic fundamental limits," Ph.D. dissertation, Princeton University, November 2010.
[11] M. Tomamichel and M. Hayashi, "A hierarchy of information quantities for finite block length analysis of quantum tasks," IEEE Trans. Inform. Theory, vol. 59, no. 11, pp. 7693-7710, November 2013.
[12] W. Matthews and S. Wehner, "Finite blocklength converse bounds for quantum channels," 2012, arXiv preprint.
[13] R. G. Gallager, Information Theory and Reliable Communication. John Wiley & Sons, 1968.
[14] R. Renner, "Security of quantum key distribution," Ph.D. dissertation, ETH Zurich, Switzerland, 2005.
[15] M. Hayashi, "Tight exponential analysis of universally composable privacy amplification and its applications," IEEE Trans. Inform. Theory, vol. 59, no. 11, pp. 7728-7746, November 2013.
[16] S. Watanabe and M. Hayashi, "Non-asymptotic analysis of privacy amplification via Rényi entropy and inf-spectral entropy," in Proc. IEEE Int. Symp. Inf. Theory 2013, Istanbul, Turkey, 2013.
[17] M. Hayashi, "Second-order asymptotics in fixed-length source coding and intrinsic randomness," IEEE Trans. Inform. Theory, vol. 54, no. 10, pp. 4619-4637, October 2008.
[18] Y. Altug and A. B. Wagner, "Moderate deviation analysis of channel coding: Discrete memoryless case," in Proceedings of IEEE International Symposium on Information Theory, Austin, Texas, USA, June 2010.
[19] D. He, L. A. Lastras-Montaño, E. Yang, A. Jagmohan, and J. Chen, "On the redundancy of Slepian-Wolf coding," IEEE Trans. Inform. Theory, vol. 55, no. 12, pp. 5607-5627, December 2009.
[20] V. Y. F. Tan, "Moderate-deviations of lossy source coding for discrete and Gaussian sources," in Proc. IEEE Int. Symp. Inf. Theory 2012, Cambridge, MA, 2012.
[21] S. Kuzuoka, "A simple technique for bounding the redundancy of source coding with side information," in Proc. IEEE Int. Symp. Inf. Theory 2012, Cambridge, MA, 2012.
[22] A. Teixeira, A. Matos, and L. Antunes, "Conditional Rényi entropies," IEEE Trans. Inform. Theory, vol. 58, no. 7, pp. 4273-4277, July 2012.
[23] M. Iwamoto and J. Shikata, "Information theoretic security for encryption based on conditional Rényi entropies," 2013.
[24] M. Hayashi, "Exponential decreasing rate of leaked information in universal random privacy amplification," IEEE Trans. Inform. Theory, vol. 57, no. 6, pp. 3989-4001, June 2011.
[25] S. Arimoto, "Information measures and capacity of order α for discrete memoryless channels," Colloquia Mathematica Societatis János Bolyai, 16. Topics in Information Theory, pp. 41-52, 1977.
[26] M. Hayashi and S. Watanabe, "Information geometry approach to Markov chains."
[27] M. Tomamichel and V. Y. F. Tan, "ε-capacities and second-order coding rates for channels with general state," 2013, arXiv preprint.
[28] J. G. Kemeny and J. L. Snell, Finite Markov Chains. Springer, 1976.
[29] M. Hayashi and S. Watanabe, "Non-asymptotic and asymptotic analyses on Markov chains in several problems," 2013, arXiv preprint.
[30] M. Tomamichel, M. Berta, and M. Hayashi, "A duality relation connecting different quantum generalizations of the conditional Rényi entropy," 2013, arXiv preprint.
[31] M. Hayashi, "Large deviation analysis for classical and quantum security via approximate smoothing," 2012, arXiv preprint.
[32] S. Watanabe and M. Hayashi, "Finite-length analysis on tail probability and simple hypothesis testing for Markov chain."
[33] M. N. Wegman and J. L. Carter, "New hash functions and their use in authentication and set equality," Journal of Computer and System Sciences, vol. 22, pp. 265-279, 1981.


More information

Bounded Expected Delay in Arithmetic Coding

Bounded Expected Delay in Arithmetic Coding Bounded Expected Delay in Arithmetic Coding Ofer Shayevitz, Ram Zamir, and Meir Feder Tel Aviv University, Dept. of EE-Systems Tel Aviv 69978, Israel Email: {ofersha, zamir, meir }@eng.tau.ac.il arxiv:cs/0604106v1

More information

Upper Bounds to Error Probability with Feedback

Upper Bounds to Error Probability with Feedback Upper Bounds to Error robability with Feedbac Barış Naiboğlu Lizhong Zheng Research Laboratory of Electronics at MIT Cambridge, MA, 0239 Email: {naib, lizhong }@mit.edu Abstract A new technique is proposed

More information

A Formula for the Capacity of the General Gel fand-pinsker Channel

A Formula for the Capacity of the General Gel fand-pinsker Channel A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore

More information

Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel

Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Jun Chen Dept. of Electrical and Computer Engr. McMaster University Hamilton, Ontario, Canada Chao Tian AT&T Labs-Research 80 Park

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

The Fading Number of a Multiple-Access Rician Fading Channel

The Fading Number of a Multiple-Access Rician Fading Channel The Fading Number of a Multiple-Access Rician Fading Channel Intermediate Report of NSC Project Capacity Analysis of Various Multiple-Antenna Multiple-Users Communication Channels with Joint Estimation

More information

Entropy Accumulation in Device-independent Protocols

Entropy Accumulation in Device-independent Protocols Entropy Accumulation in Device-independent Protocols QIP17 Seattle January 19, 2017 arxiv: 1607.01796 & 1607.01797 Rotem Arnon-Friedman, Frédéric Dupuis, Omar Fawzi, Renato Renner, & Thomas Vidick Outline

More information

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9 Katalin Marton Abbas El Gamal Stanford University Withits 2010 A. El Gamal (Stanford University) Katalin Marton Withits 2010 1 / 9 Brief Bio Born in 1941, Budapest Hungary PhD from Eötvös Loránd University

More information

Classical and Quantum Channel Simulations

Classical and Quantum Channel Simulations Classical and Quantum Channel Simulations Mario Berta (based on joint work with Fernando Brandão, Matthias Christandl, Renato Renner, Joseph Renes, Stephanie Wehner, Mark Wilde) Outline Classical Shannon

More information

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview

AN INTRODUCTION TO SECRECY CAPACITY. 1. Overview AN INTRODUCTION TO SECRECY CAPACITY BRIAN DUNN. Overview This paper introduces the reader to several information theoretic aspects of covert communications. In particular, it discusses fundamental limits

More information

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 12, DECEMBER

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 12, DECEMBER IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 53, NO 12, DECEMBER 2007 4457 Joint Source Channel Coding Error Exponent for Discrete Communication Systems With Markovian Memory Yangfan Zhong, Student Member,

More information

Extremal properties of the variance and the quantum Fisher information; Phys. Rev. A 87, (2013).

Extremal properties of the variance and the quantum Fisher information; Phys. Rev. A 87, (2013). 1 / 24 Extremal properties of the variance and the quantum Fisher information; Phys. Rev. A 87, 032324 (2013). G. Tóth 1,2,3 and D. Petz 4,5 1 Theoretical Physics, University of the Basque Country UPV/EHU,

More information

Common Randomness Principles of Secrecy

Common Randomness Principles of Secrecy Common Randomness Principles of Secrecy Himanshu Tyagi Department of Electrical and Computer Engineering and Institute of Systems Research 1 Correlated Data, Distributed in Space and Time Sensor Networks

More information

On the Rate-Limited Gelfand-Pinsker Problem

On the Rate-Limited Gelfand-Pinsker Problem On the Rate-Limited Gelfand-Pinsker Problem Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 74 ravit@umd.edu ulukus@umd.edu Abstract

More information

Capacity Upper Bounds for the Deletion Channel

Capacity Upper Bounds for the Deletion Channel Capacity Upper Bounds for the Deletion Channel Suhas Diggavi, Michael Mitzenmacher, and Henry D. Pfister School of Computer and Communication Sciences, EPFL, Lausanne, Switzerland Email: suhas.diggavi@epfl.ch

More information

An Outer Bound for the Gaussian. Interference channel with a relay.

An Outer Bound for the Gaussian. Interference channel with a relay. An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il

More information

Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2

Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2 Multiplicativity of Maximal p Norms in Werner Holevo Channels for 1 < p 2 arxiv:quant-ph/0410063v1 8 Oct 2004 Nilanjana Datta Statistical Laboratory Centre for Mathematical Sciences University of Cambridge

More information

WE study the capacity of peak-power limited, single-antenna,

WE study the capacity of peak-power limited, single-antenna, 1158 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 3, MARCH 2010 Gaussian Fading Is the Worst Fading Tobias Koch, Member, IEEE, and Amos Lapidoth, Fellow, IEEE Abstract The capacity of peak-power

More information

Generalized Writing on Dirty Paper

Generalized Writing on Dirty Paper Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland

More information

The Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels

The Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels The Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels Parastoo Sadeghi National ICT Australia (NICTA) Sydney NSW 252 Australia Email: parastoo@student.unsw.edu.au Predrag

More information

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages

Degrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege

More information

Asymptotic Distortion Performance of Source-Channel Diversity Schemes over Relay Channels

Asymptotic Distortion Performance of Source-Channel Diversity Schemes over Relay Channels Asymptotic istortion Performance of Source-Channel iversity Schemes over Relay Channels Karim G. Seddik 1, Andres Kwasinski 2, and K. J. Ray Liu 1 1 epartment of Electrical and Computer Engineering, 2

More information

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University)

Performance-based Security for Encoding of Information Signals. FA ( ) Paul Cuff (Princeton University) Performance-based Security for Encoding of Information Signals FA9550-15-1-0180 (2015-2018) Paul Cuff (Princeton University) Contributors Two students finished PhD Tiance Wang (Goldman Sachs) Eva Song

More information

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122 Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122 M Definitions and Problem Setup 2 X n Y n Encoder p(y x) Decoder ˆM Message Channel Estimate Definition 11. Discrete Memoryless Channel

More information

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

Quantum Information Processing with Finite Resources

Quantum Information Processing with Finite Resources Marco Tomamichel arxiv:1504.00233v3 [quant-ph] 18 Oct 2015 Quantum Information Processing with Finite Resources Mathematical Foundations October 18, 2015 Springer Acknowledgements I was introduced to

More information

Quantum Technologies for Cryptography

Quantum Technologies for Cryptography University of Sydney 11 July 2018 Quantum Technologies for Cryptography Mario Berta (Department of Computing) marioberta.info Quantum Information Science Understanding quantum systems (e.g., single atoms

More information

Lecture 7 Introduction to Statistical Decision Theory

Lecture 7 Introduction to Statistical Decision Theory Lecture 7 Introduction to Statistical Decision Theory I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 20, 2016 1 / 55 I-Hsiang Wang IT Lecture 7

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

An Extended Fano s Inequality for the Finite Blocklength Coding

An Extended Fano s Inequality for the Finite Blocklength Coding An Extended Fano s Inequality for the Finite Bloclength Coding Yunquan Dong, Pingyi Fan {dongyq8@mails,fpy@mail}.tsinghua.edu.cn Department of Electronic Engineering, Tsinghua University, Beijing, P.R.

More information

Computation of Information Rates from Finite-State Source/Channel Models

Computation of Information Rates from Finite-State Source/Channel Models Allerton 2002 Computation of Information Rates from Finite-State Source/Channel Models Dieter Arnold arnold@isi.ee.ethz.ch Hans-Andrea Loeliger loeliger@isi.ee.ethz.ch Pascal O. Vontobel vontobel@isi.ee.ethz.ch

More information