Secret Key and Private Key Constructions for Simple Multiterminal Source Models

arXiv:cs/05050v [cs.IT] 3 Nov 2005

Chunxuan Ye
Department of Electrical and Computer Engineering
and Institute for Systems Research
University of Maryland, College Park, MD 20742, USA
cxye@eng.umd.edu

Prakash Narayan
Department of Electrical and Computer Engineering
and Institute for Systems Research
University of Maryland, College Park, MD 20742, USA
prakash@eng.umd.edu

Abstract: This work is motivated by recent results of Csiszár and Narayan (IEEE Trans. on Inform. Theory, Dec. 2004), which highlight innate connections between secrecy generation by multiple terminals and multiterminal Slepian-Wolf near-lossless data compression (sans secrecy restrictions). We propose a new approach for constructing secret and private keys based on the long-known Slepian-Wolf code for sources connected by a virtual additive noise channel, due to Wyner (IEEE Trans. on Inform. Theory, Jan. 1974). Explicit procedures for such constructions, and their substantiation, are provided.

I. INTRODUCTION

The problem of secret key generation by multiple terminals, based on their observations of distinct correlated signals followed by public communication among themselves, has been investigated by several authors ([9], [1], among others). It has been shown that these terminals can generate common randomness which is kept secret from an eavesdropper privy to the public interterminal communication. Of particular relevance to us are recent results in [5] for models with an arbitrary number of terminals, each of which observes a distinct component of a discrete memoryless multiple source (DMMS). Unrestricted public communication is allowed between these terminals. All the transmissions are observed by all the terminals and by the eavesdropper. Two models considered in [5] are directly relevant to our work, and these are first briefly described below.

(i) Suppose that d terminals observe n i.i.d. repetitions of the random variables (rvs) X_1, ..., X_d, with the corresponding n-length observation sequences denoted by X_1, ..., X_d, respectively. A secret key (SK) generated by these terminals consists of common randomness, based on public interterminal communication, which is concealed from an eavesdropper with access to this communication. The largest (entropy) rate of such a SK is termed the SK-capacity, denoted by C_SK, and is shown in [5] to equal

    C_SK = H(X_1, ..., X_d) - R_min,                                 (1)

where

    R_min = min_{(R_1, ..., R_d) ∈ R} Σ_{i=1}^d R_i,

with

    R = { (R_1, ..., R_d) : Σ_{i ∈ B} R_i ≥ H({X_j, j ∈ B} | {X_j, j ∈ B^c}), ∅ ≠ B ⊊ {1, ..., d} },

where B^c = {1, ..., d} \ B.

(ii) For a given subset A ⊆ {1, ..., d}, a private key (PK) for the terminals in A, private from the terminals in A^c, is a SK generated by the terminals in A (with the possible help of the terminals in A^c), which is concealed from an eavesdropper with access to the public interterminal communication and also from the helper terminals in A^c (and, hence, private). The largest (entropy) rate of such a PK is termed the PK-capacity, denoted by C_PK(A). It is shown in [5] that

    C_PK(A) = H({X_i, i ∈ A} | {X_i, i ∈ A^c}) - R_min(A),           (2)

where

    R_min(A) = min_{{R_i, i ∈ A} ∈ R(A)} Σ_{i ∈ A} R_i,

with

    R(A) = { {R_i, i ∈ A} : Σ_{i ∈ B} R_i ≥ H({X_j, j ∈ B} | {X_j, j ∈ B^c}), ∅ ≠ B ⊊ A }.

The results above afford the following interpretation. The SK-capacity C_SK, i.e., the largest rate at which all the d terminals can generate a SK, is obtained by subtracting from the maximum rate of shared common randomness achievable by these terminals, viz. H(X_1, ..., X_d), the smallest sum-rate R_min of the data-compressed interterminal communication which enables each of the terminals to acquire this maximal common randomness. A similar interpretation holds for the PK-capacity C_PK(A) as well, with the difference that the terminals in A^c, which act as helpers but must not be privy to the secrecy generated, can simply reveal their observations. Hence, the entropy terms in (1) are replaced in (2) with additional conditioning on {X_i, i ∈ A^c}. It should be noted that R_min and R_min(A) are obtained as solutions to Slepian-Wolf (SW) multiterminal near-lossless data compression problems not involving any secrecy constraints. This characterization of the

SK-capacity and PK-capacity in terms of the decompositions above also mirrors the consecutive stages in the random coding arguments used in establishing these results. For instance, and loosely speaking, to generate a SK, the d terminals first generate common randomness (without any secrecy restrictions), say a rv L of entropy rate (1/n) H(L) > 0, through SW-compressed interterminal communication F. This means that all the d terminals acquire the rv L with probability close to 1. The next step entails an extraction from L of a SK K = g(L) of entropy rate (1/n) H(L | F), by means of a suitable operation performed identically at each terminal on the acquired common randomness L. When the common randomness first acquired by the d terminals is maximal, i.e., L = (X_1, ..., X_d) with probability close to 1, the corresponding SK K = g(L) has the best rate C_SK given by (1). A similar approach is used to generate a PK of rate given by (2).

The discussion above suggests that techniques for multiterminal SW data compression could be used for the construction of SKs and PKs. Next, in SW coding, the existence of linear data compression codes with rates arbitrarily close to the SW bound has been long known [3]. In particular, when the i.i.d. sequences observed at the terminals are related to each other through virtual communication channels characterized by independent additive noises, such linear data compression codes can be obtained in terms of the cosets of linear error-correction codes for these virtual channels, a fact first illustrated in [13] for the special case of d = 2 terminals connected by a virtual binary symmetric channel (BSC). This fact, exploited by most known linear constructions of SW codes (cf. e.g., [2], [7], [8], [11]), can enable us to translate these constructions, and other significant recent developments in capacity-achieving linear codes, into new SK and PK constructions. (See also recent independent work [10] for related existence results, as also [12].) Motivated by these considerations, we seek to devise new constructive schemes for secrecy generation. The main technical contribution of this work is the following: we consider four simple models of secrecy generation and show how a new class of secret and private keys can be constructed, based on the SW data compression code from [13]. While we do not specify exactly the linear capacity-achieving channel codes used in the SW step of the procedure, these can be chosen, for instance, from the class of LDPC codes [8] and turbo codes [7] that have attracted wide attention.

II. PRELIMINARIES

Consider a DMMS with d components and corresponding generic rvs X_1, ..., X_d taking values in finite alphabets X_1, ..., X_d, respectively. Let X_i = (X_{i,1}, ..., X_{i,n}), i ∈ M = {1, ..., d}, be n i.i.d. repetitions of rv X_i. Terminals 1, ..., d, with respective observations X_1, ..., X_d, represent the d users who wish to generate a SK by public communication. These terminals can communicate with each other through broadcasts over a noiseless public channel, possibly interactively in many rounds. In general, a transmission from a terminal is allowed to be any function of its observations and of all previous transmissions. Let F denote collectively all the public transmissions.

Given ε > 0, the rv K_S represents an ε-secret key (ε-SK) for the terminals in M, achieved with communication F, if there exist rvs K_i = K_i(X_i, F), i ∈ M, with K_i and K_S taking values in the same finite set K_S, such that K_S satisfies

    the common randomness condition: Pr(K_i = K_S, i ∈ M) ≥ 1 - ε;
    the secrecy condition: (1/n) I(K_S ∧ F) ≤ ε;
    the uniformity condition: (1/n) H(K_S) ≥ (1/n) log |K_S| - ε.

Let A ⊆ M be an arbitrary subset of terminals. The rv K_P(A) represents an ε-private key (ε-PK) for the terminals in A, private from the terminals in A^c = M \ A, achieved with communication F, if there exist rvs K_i = K_i(X_i, F), i ∈ A, with K_i and K_P(A) taking values in the same finite set K_P(A), such that K_P(A) satisfies

    the common randomness condition: Pr(K_i = K_P(A), i ∈ A) ≥ 1 - ε;
    the secrecy condition: (1/n) I(K_P(A) ∧ ({X_i, i ∈ A^c}, F)) ≤ ε;
    the uniformity condition: (1/n) H(K_P(A)) ≥ (1/n) log |K_P(A)| - ε.

Definition [5]: A nonnegative number R is called an achievable SK rate if an ε_n-SK K_S^{(n)} is achievable with suitable communication (with the number of rounds possibly depending on n), such that ε_n → 0 and (1/n) H(K_S^{(n)}) → R. The largest achievable SK rate is called the SK-capacity, denoted by C_SK. The PK-capacity for the terminals in A, denoted by C_PK(A), is similarly defined. An achievable SK rate (resp. PK rate) will be called strongly achievable if ε_n above can be taken to vanish exponentially in n. The corresponding capacities are termed strong capacities.

Single-letter characterizations have been provided for C_SK in the case of d = 2 terminals in [9], [1], and for d ≥ 2 in [5]; and for C_PK(A) in the case of d = 3 in [1] and for d ≥ 3 in [5]. The proofs of the achievability parts exploit the close connection between secrecy generation and SW data compression. For instance, common randomness, without any secrecy restrictions, is first generated through SW-compressed interterminal communication. This means that all the d terminals acquire a rv with probability close to 1. In the next step, secrecy is then extracted from this common randomness by means of a suitable identical operation performed at each terminal on the acquired common randomness. When the common randomness first acquired by the d terminals is

maximal, the corresponding secret key has the best rate C_SK given by (1).

In this work, we consider four simple models for which we illustrate the construction of appropriate strong secret or private keys, relying on suitable SW codes. The SW codes of interest will rely on the following result concerning the existence of good linear channel codes for a BSC. Hereafter, a BSC with crossover probability p, 0 < p < 1/2, will be denoted by BSC(p). Let h_b(p) be the binary entropy function.

Lemma 1 [6]: For each ε > 0, 0 < p < 1/2, and for all n sufficiently large, there exists a binary linear (n, n - m) code for the BSC(p), where m < n[h_b(p) + ε], such that the average error probability of maximum likelihood decoding is less than 2^{-nη}, for some η > 0.

III. MAIN RESULTS

MODEL 1: Let the terminals 1 and 2 observe, respectively, n i.i.d. repetitions of the correlated rvs X_1 and X_2, where X_1, X_2 are {0, 1}-valued rvs with joint probability mass function (pmf)

    P_{X_1 X_2}(x_1, x_2) = ((1 - p)/2) δ_{x_1 x_2} + (p/2) (1 - δ_{x_1 x_2}),  0 < p < 1/2,   (3)

with δ being the Kronecker delta function. These two terminals wish to generate a strong SK of maximal rate. The SK-capacity for this model is [9], [1], [5]

    C_SK = I(X_1 ∧ X_2) = 1 - h_b(p) bit/symbol.

In the following, we show a simple scheme for both terminals to generate a SK with rate close to 1 - h_b(p), which relies on Wyner's well-known method for SW data compression [13]. The SW problem of interest entails terminal 2 reconstructing the sequence x_1 observed at terminal 1 from the SW codeword for x_1 and its own observed sequence x_2.

(i) SW data compression [13]: Let C be the linear (n, n - m) code specified in Lemma 1, with parity check matrix P. Both terminals know C and P. Terminal 1 transmits the syndrome P x_1^t to terminal 2. The maximum likelihood estimate of x_1 at terminal 2 is

    x̂_1(2) = x_2 ⊕ f_P(P x_1^t ⊕ P x_2^t),

where f_P(P x_1^t ⊕ P x_2^t) is the most likely n-sequence v with syndrome P v^t = P x_1^t ⊕ P x_2^t, with ⊕ denoting addition modulo 2 and t denoting transposition. The probability of decoding error at terminal 2 is given by

    Pr(X̂_1(2) ≠ X_1) = Pr(X_2 ⊕ f_P(P X_1^t ⊕ P X_2^t) ≠ X_1).

Under the given joint pmf (3), X_1 can be considered as an input to a virtual BSC(p), with X_2 the corresponding output; i.e., we can write X_2 = X_1 ⊕ V, where V = (V_1, ..., V_n) is an i.i.d. sequence of {0, 1}-valued rvs, independent of X_1, with Pr(V_i = 1) = p, 1 ≤ i ≤ n. It readily follows that

    Pr(X̂_1(2) ≠ X_1) = Pr(f_P(P V^t) ≠ V).

Therefore, it follows from Lemma 1 that for some η > 0,

    Pr(X̂_1(2) ≠ X_1) < 2^{-nη}

for all n sufficiently large.

(ii) SK construction: Consider a (common) standard array for C known to both terminals. Denote by a_{i,j} the element in the i-th row and the j-th column of the standard array, 1 ≤ i ≤ 2^m, 1 ≤ j ≤ 2^{n-m}. Terminal 1 sets K_1 = j if X_1 equals a_{i,j} in the standard array. Terminal 2 sets K_2 = j if X̂_1(2) equals a_{i,j} in the same standard array.

(iii) SK criteria: The following theorem shows that K_1 constitutes a strongly achievable SK with rate approaching the SK-capacity.

Theorem 1: The pair of rvs (K_1, K_2) generated above, with (common) range K (say), satisfy

    Pr(K_1 ≠ K_2) < 2^{-nη};
    I(K_1 ∧ F) = 0;
    H(K_1) = log |K|;
    (1/n) H(K_1) > 1 - h_b(p) - ε.

Remark: The probability of K_1 being different from K_2 exactly equals the average error probability of maximum likelihood decoding when C is used on a BSC(p). Furthermore, the gap between the rate of the generated SK and the SK-capacity is as wide as the gap between the rate of C and the channel capacity. Therefore, if a better channel code for the BSC(p) is applied, i.e., one whose rate is closer to the channel capacity and whose average error probability of maximum likelihood decoding is smaller, then a better SK is generated at the two terminals, in the sense that its rate is closer to the SK-capacity and the probability that the keys generated at the two terminals disagree is smaller.

MODEL 2: Let the terminals 1 and 2 observe, respectively, n i.i.d. repetitions of the correlated rvs X_1 and X_2, where X_1, X_2 are {0, 1}-valued rvs with joint pmf

    P_{X_1 X_2}(0, 0) = (1 - p)(1 - q),
    P_{X_1 X_2}(0, 1) = pq,
    P_{X_1 X_2}(1, 0) = p(1 - q),
    P_{X_1 X_2}(1, 1) = q(1 - p),

where 0 < p < 1/2 and 0 < q < 1. These two terminals wish to generate a strong SK of maximal rate. Note that Model 1 is a special case of Model 2 for q = 1/2. We show below a scheme for both terminals to generate a SK

with rate close to the SK-capacity for this model [9], [1], [5], which is

    C_SK = I(X_1 ∧ X_2) = h_b(p + q - 2pq) - h_b(p) bit/symbol.

(i) SW data compression: This step is identical to step (i) for Model 1.

(ii) SK construction: Suppose that both terminals know the linear (n, n - m) code C specified in Lemma 1, and a (common) standard array for C. Let {e_i : 1 ≤ i ≤ 2^m} denote the set of coset leaders for all the cosets of C. Given a (generic) {0, 1}-valued rv X, a sequence x ∈ {0, 1}^n is called X-typical with constant ξ, the set of such sequences being denoted T^n_{X,ξ}, if

    2^{-n[H(X)+ξ]} ≤ P^n_X(x) ≤ 2^{-n[H(X)-ξ]}.

Denote by A_i the set of T^n_{X_1,ξ}-sequences in the coset of C with coset leader e_i, 1 ≤ i ≤ 2^m. If the number of sequences of the same type (cf. [4]) in A_i is more than 2^{n[I(X_1 ∧ X_2) - ε']}, where ε' > ξ + ε, then collect arbitrarily 2^{n[I(X_1 ∧ X_2) - ε']} such sequences to compose a subset, which we call a regular subset (as it consists of sequences of the same type). Continue this procedure until the number of sequences of every type in A_i is less than 2^{n[I(X_1 ∧ X_2) - ε']}. Let N_i denote the number of distinct regular subsets of A_i. Enumerate (in any way) the sequences in each regular subset. Let b_{i,j,k}, where 1 ≤ i ≤ 2^m, 1 ≤ j ≤ N_i, 1 ≤ k ≤ 2^{n[I(X_1 ∧ X_2) - ε']}, denote the k-th sequence of the j-th regular subset in the i-th coset (i.e., the coset with coset leader e_i). Terminal 1 sets K_1 = k if X_1 equals b_{i,j,k}; otherwise, K_1 is set to be uniformly distributed on {1, ..., 2^{n[I(X_1 ∧ X_2) - ε']}} and independent of (X_1, X_2). Terminal 2 sets K_2 = k if X̂_1(2) equals b_{i,j,k}; otherwise, K_2 is set to be uniformly distributed on {1, ..., 2^{n[I(X_1 ∧ X_2) - ε']}}, independent of (X_1, X_2, K_1).

(iii) SK criteria: The following theorem shows that K_1 constitutes a strongly achievable SK with rate approaching the SK-capacity.

Theorem 2: For some η' = η'(η, ξ, ε, ε') > 0, the pair of rvs (K_1, K_2) generated above, with range K (say), satisfy

    Pr(K_1 ≠ K_2) < 2^{-nη'};
    I(K_1 ∧ F) = 0;
    H(K_1) = log |K|;
    (1/n) H(K_1) = I(X_1 ∧ X_2) - ε'.

MODEL 3: Let the terminals 1, ..., d observe, respectively, n i.i.d. repetitions of the {0, 1}-valued rvs X_1, ..., X_d, which form a Markov chain X_1 -> X_2 -> ... -> X_d with joint pmf P_{X_1 ... X_d} given by: for 1 ≤ i ≤ d - 1,

    P_{X_i X_{i+1}}(x_i, x_{i+1}) = ((1 - p_i)/2) δ_{x_i x_{i+1}} + (p_i/2) (1 - δ_{x_i x_{i+1}}),  0 < p_i < 1/2.

These d terminals wish to generate a strong SK of maximal rate. Note that Model 1 is a special case of Model 3 for d = 2. Without any loss of generality, let p_j = max_{1 ≤ i ≤ d-1} p_i. Then the SK-capacity for this model is [5]

    C_SK = I(X_j ∧ X_{j+1}) = 1 - h_b(p_j) bit/symbol.

We show below how to extract a SK with rate close to 1 - h_b(p_j) by using a SW data compression scheme for reconstructing x_j at all the terminals.

(i) SW data compression: Let C be the linear (n, n - m) code specified in Lemma 1 for the BSC(p_j), with parity check matrix P. Terminals i, 1 ≤ i ≤ d, transmit the syndromes P x_i^t, respectively. Let x̂_i(j) denote the maximum likelihood estimate of x_j at terminal i. For 1 ≤ i ≤ j - 1, terminal i, with the knowledge of (P x_{i+1}^t, ..., P x_j^t, x_i), forms the following successive maximum likelihood estimates:

    x̂_i(i + 1) = x_i ⊕ f_P(P x_i^t ⊕ P x_{i+1}^t),
    x̂_i(i + 2) = x̂_i(i + 1) ⊕ f_P(P x_{i+1}^t ⊕ P x_{i+2}^t),
    ...,
    x̂_i(j) = x̂_i(j - 1) ⊕ f_P(P x_{j-1}^t ⊕ P x_j^t).

For j + 1 ≤ i ≤ d, terminal i, with the knowledge of (P x_j^t, ..., P x_{i-1}^t, x_i), forms the following successive maximum likelihood estimates:

    x̂_i(i - 1) = x_i ⊕ f_P(P x_{i-1}^t ⊕ P x_i^t),
    x̂_i(i - 2) = x̂_i(i - 1) ⊕ f_P(P x_{i-2}^t ⊕ P x_{i-1}^t),
    ...,
    x̂_i(j) = x̂_i(j + 1) ⊕ f_P(P x_j^t ⊕ P x_{j+1}^t).

It can be shown that for some η' = η'(η, d) > 0,

    Pr(X̂_i(j) = X_j, 1 ≤ i ≠ j ≤ d) > 1 - 2^{-nη'}.

(ii) SK construction: Consider a (common) standard array for C known to all the terminals. Denote by a_{l,k} the element in the l-th row and the k-th column of the standard array, 1 ≤ l ≤ 2^m, 1 ≤ k ≤ 2^{n-m}. Terminal j sets K_j = k_j if X_j equals a_{l,k_j} in the standard array. Terminal i, 1 ≤ i ≠ j ≤ d, sets K_i = k_i if X̂_i(j) equals a_{l,k_i} in the same standard array.

(iii) SK criteria: The following theorem shows that K_j constitutes a strongly achievable SK with rate approaching the SK-capacity.

Theorem 3: The set of rvs (K_1, ..., K_d) generated above, with range K (say), satisfy

    Pr(K_1 = ... = K_d) > 1 - 2^{-nη'};
    I(K_j ∧ F) = 0;
    H(K_j) = log |K|;

    (1/n) H(K_j) > 1 - h_b(p_j) - ε.

MODEL 4: Let the terminals 1, 2 and 3 observe, respectively, n i.i.d. repetitions of the correlated rvs X_1, X_2, X_3, where X_1, X_2, X_3 are {0, 1}-valued rvs with joint pmf

    P_{X_1 X_2 X_3}(0, 0, 0) = P_{X_1 X_2 X_3}(0, 1, 1) = (1 - p)(1 - q)/2,
    P_{X_1 X_2 X_3}(0, 0, 1) = P_{X_1 X_2 X_3}(0, 1, 0) = pq/2,
    P_{X_1 X_2 X_3}(1, 0, 0) = P_{X_1 X_2 X_3}(1, 1, 1) = p(1 - q)/2,
    P_{X_1 X_2 X_3}(1, 0, 1) = P_{X_1 X_2 X_3}(1, 1, 0) = q(1 - p)/2,

where 0 < p < 1/2 and 0 < q < 1. Terminals 1 and 2 wish to generate a strong PK of maximal rate, which is concealed from the helper terminal 3. Note that under the given joint pmf of X_1, X_2, X_3, we can write X_1 = X_2 ⊕ X_3 ⊕ V, where V = (V_1, ..., V_n) is an i.i.d. sequence of {0, 1}-valued rvs, independent of (X_2, X_3), with Pr(V_i = 1) = p, 1 ≤ i ≤ n. We show below a scheme for terminals 1 and 2 to generate a PK with rate close to the PK-capacity for this model [1], [5], which is

    C_PK({1, 2}) = I(X_1 ∧ X_2 | X_3) = h_b(p + q - 2pq) - h_b(p) bit/symbol.

The preliminary step of this scheme entails terminal 3 simply revealing its observations x_3 to both terminals 1 and 2. Then, Wyner's SW data compression scheme is used for reconstructing x_1 at terminal 2 from the SW codeword for x_1 and x_2 ⊕ x_3.

(i) SW data compression: This step is identical to step (i) for Model 1.

(ii) PK construction: Suppose that terminals 1 and 2 know the linear (n, n - m) code C specified in Lemma 1, and a (common) standard array for C. Let {e_i : 1 ≤ i ≤ 2^m} denote the set of coset leaders for all the cosets of C. Given (generic) {0, 1}-valued rvs X, Y, a pair of sequences (x, y) ∈ {0, 1}^n × {0, 1}^n is called XY-typical with constant ξ, the set of such pairs being denoted T^n_{XY,ξ}, if x ∈ T^n_{X,ξ}, y ∈ T^n_{Y,ξ}, and

    2^{-n[H(X,Y)+ξ]} ≤ P^n_{XY}(x, y) ≤ 2^{-n[H(X,Y)-ξ]}.

For every y ∈ {0, 1}^n, a sequence x ∈ {0, 1}^n is called X|Y-typical with respect to y with constant ξ, the set of such sequences being denoted T^n_{X|Y,ξ}(y), if (x, y) ∈ T^n_{XY,ξ}. Note that T^n_{X|Y,ξ}(y) is an empty set if y ∉ T^n_{Y,ξ}. For a sequence x_3 ∈ {0, 1}^n, denote by A_i(x_3) the set of T^n_{X_1|X_3,ξ}(x_3)-sequences in the coset of C with coset leader e_i, 1 ≤ i ≤ 2^m. If the number of sequences of the same joint type (cf. [4]) with x_3 in A_i(x_3) is more than 2^{n[I(X_1 ∧ X_2 | X_3) - ε']}, where ε' > ξ + ε, then collect arbitrarily 2^{n[I(X_1 ∧ X_2 | X_3) - ε']} such sequences to compose a regular subset. Continue this procedure until the number of sequences of every joint type with x_3 in A_i(x_3) is less than 2^{n[I(X_1 ∧ X_2 | X_3) - ε']}. Let N_i(x_3) denote the number of distinct regular subsets of A_i(x_3). For a given sequence x_3, enumerate (in any way) the sequences in each regular subset. Let b_{i,j,k}(x_3), where 1 ≤ i ≤ 2^m, 1 ≤ j ≤ N_i(x_3), 1 ≤ k ≤ 2^{n[I(X_1 ∧ X_2 | X_3) - ε']}, denote the k-th sequence of the j-th regular subset in the i-th coset. Terminal 1 sets K_1 = k if X_1 equals b_{i,j,k}(X_3); otherwise, K_1 is set to be uniformly distributed on {1, ..., 2^{n[I(X_1 ∧ X_2 | X_3) - ε']}}, independent of (X_1, X_2, X_3). Terminal 2 sets K_2 = k if X̂_1(2) equals b_{i,j,k}(X_3); otherwise, K_2 is set to be uniformly distributed on {1, ..., 2^{n[I(X_1 ∧ X_2 | X_3) - ε']}}, independent of (X_1, X_2, X_3, K_1).

(iii) PK criteria: The following theorem shows that K_1 constitutes a strongly achievable PK with rate approaching the PK-capacity.

Theorem 4: For some η' = η'(η, ξ, ε, ε') > 0, the pair of rvs (K_1, K_2) generated above, with range K (say), satisfy

    Pr(K_1 ≠ K_2) < 2^{-nη'};
    I(K_1 ∧ (X_3, F)) = 0;
    H(K_1) = log |K|;
    (1/n) H(K_1) = I(X_1 ∧ X_2 | X_3) - ε'.

REFERENCES

[1] R. Ahlswede and I. Csiszár, "Common randomness in information theory and cryptography, Part I: Secret sharing," IEEE Trans. Inform. Theory, vol. 39, pp. 1121-1132, July 1993.
[2] T. P. Coleman, A. H. Lee, M. Médard, and M. Effros, "On some new approaches to practical Slepian-Wolf compression inspired by channel coding," Proc. IEEE Data Compression Conference, Snowbird, UT, March 2004.
[3] I. Csiszár, "Linear codes for sources and source networks: Error exponents, universal coding," IEEE Trans. Inform. Theory, vol. 28, no. 4, pp. 585-592, July 1982.
[4] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic, New York, NY, 1981.
[5] I. Csiszár and P. Narayan, "Secrecy capacities for multiple terminals," IEEE Trans. Inform. Theory, vol. 50, pp. 3047-3061, Dec. 2004.
[6] P. Elias, "Coding for noisy channels," IRE Convention Record, Part 4, pp. 37-46, 1955.
[7] J. Garcia-Frias and Y. Zhao, "Compression of correlated binary sources using turbo codes," IEEE Commun. Lett., vol. 5, pp. 417-419, Oct. 2001.
[8] A. D. Liveris, Z. Xiong, and C. N. Georghiades, "Compression of binary sources with side information at the decoder using LDPC codes," IEEE Commun. Lett., vol. 6, pp. 440-442, Oct. 2002.
[9] U. M. Maurer, "Secret key agreement by public discussion from common information," IEEE Trans. Inform. Theory, vol. 39, pp. 733-742, May 1993.
[10] J. Muramatsu, "Secret key agreement from correlated source outputs using LDPC matrices," IEICE Trans. Fundamentals, vol. E87-A, 2004.
[11] S. S. Pradhan and K. Ramchandran, "Distributed source coding using syndromes (DISCUS): Design and construction," IEEE Trans. Inform. Theory, vol. 49, pp. 626-643, March 2003.
[12] A. Thangaraj, S. Dihidar, A. R. Calderbank, S. McLaughlin, and J. M. Merolla, "Capacity achieving codes for the wiretap channel with applications to quantum key distribution," e-print cs.IT/04003, 2004.
[13] A. D. Wyner, "Recent results in the Shannon theory," IEEE Trans. Inform. Theory, vol. 20, pp. 2-10, Jan. 1974.
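To make the Model 1 procedure concrete, the following is a small end-to-end sketch in Python (not from the paper): the [7,4] Hamming code stands in for the capacity-approaching code C of Lemma 1, and the helper names (syndrome, key_of, leaders) are illustrative. Terminal 1 publicly reveals only its syndrome P x_1^t; terminal 2 recovers x_1, and both terminals read off the standard-array column index of x_1 as the key.

```python
import itertools
import random

# Parity-check matrix of the [7,4] Hamming code. It stands in for the
# capacity-approaching linear code C of Lemma 1; the paper points to LDPC
# or turbo codes for that role, and the small Hamming code just keeps the
# sketch readable.
P = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
n = 7

def xor(a, b):
    return [ai ^ bi for ai, bi in zip(a, b)]

def syndrome(x):
    # P x^t over GF(2)
    return tuple(sum(p * xi for p, xi in zip(row, x)) % 2 for row in P)

# f_P of the paper: map each syndrome to the minimum-weight sequence with
# that syndrome (the coset leader), found by brute force over rising weight.
leaders = {}
for wt in range(n + 1):
    for pos in itertools.combinations(range(n), wt):
        v = [1 if i in pos else 0 for i in range(n)]
        leaders.setdefault(syndrome(v), v)

# The codewords of C index the columns of the standard array.
codewords = sorted(c for c in itertools.product([0, 1], repeat=n)
                   if syndrome(c) == (0, 0, 0))
column = {c: j for j, c in enumerate(codewords)}

def key_of(x):
    # Standard-array column index of x: strip off x's coset leader.
    return column[tuple(xor(x, leaders[syndrome(x)]))]

random.seed(1)
x1 = [random.randint(0, 1) for _ in range(n)]  # terminal 1's observation
v = [0, 0, 0, 1, 0, 0, 0]                      # one BSC crossover (correctable)
x2 = xor(x1, v)                                # terminal 2's observation

# (i) SW step: terminal 1 broadcasts only the 3-bit syndrome P x1^t.
s1 = syndrome(x1)
s = tuple(a ^ b for a, b in zip(s1, syndrome(x2)))  # = P v^t
x1_hat = xor(x2, leaders[s])                   # terminal 2 recovers x1
assert x1_hat == x1

# (ii) SK step: both terminals read off the standard-array column index,
# a common 4-bit key.
k1, k2 = key_of(x1), key_of(x1_hat)
assert k1 == k2
```

With this toy code, 3 bits are made public and 4 key bits are produced; since every coset of C contains each standard-array column exactly once, the column index is uniformly distributed and independent of the revealed syndrome, mirroring the condition I(K_1 ∧ F) = 0 of Theorem 1.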


More information

1 Background on Information Theory

1 Background on Information Theory Review of the book Information Theory: Coding Theorems for Discrete Memoryless Systems by Imre Csiszár and János Körner Second Edition Cambridge University Press, 2011 ISBN:978-0-521-19681-9 Review by

More information

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels

Representation of Correlated Sources into Graphs for Transmission over Broadcast Channels Representation of Correlated s into Graphs for Transmission over Broadcast s Suhan Choi Department of Electrical Eng. and Computer Science University of Michigan, Ann Arbor, MI 80, USA Email: suhanc@eecs.umich.edu

More information

Secret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi Himanshu Tyagi Shun Watanabe

Secret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi Himanshu Tyagi Shun Watanabe Secret Key Agreement: General Capacity and Second-Order Asymptotics Masahito Hayashi Himanshu Tyagi Shun Watanabe Two party secret key agreement Maurer 93, Ahlswede-Csiszár 93 X F Y K x K y ArandomvariableK

More information

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results

Information Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel

More information

Source and Channel Coding for Correlated Sources Over Multiuser Channels

Source and Channel Coding for Correlated Sources Over Multiuser Channels Source and Channel Coding for Correlated Sources Over Multiuser Channels Deniz Gündüz, Elza Erkip, Andrea Goldsmith, H. Vincent Poor Abstract Source and channel coding over multiuser channels in which

More information

Interactive Communication for Data Exchange

Interactive Communication for Data Exchange Interactive Communication for Data Exchange Himanshu Tyagi Indian Institute of Science, Bangalore Joint work with Pramod Viswanath and Shun Watanabe The Data Exchange Problem [ElGamal-Orlitsky 84], [Csiszár-Narayan

More information

Secret Key Agreement Using Asymmetry in Channel State Knowledge

Secret Key Agreement Using Asymmetry in Channel State Knowledge Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,

More information

Distributed Functional Compression through Graph Coloring

Distributed Functional Compression through Graph Coloring Distributed Functional Compression through Graph Coloring Vishal Doshi, Devavrat Shah, Muriel Médard, and Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology

More information

Hash Property and Fixed-rate Universal Coding Theorems

Hash Property and Fixed-rate Universal Coding Theorems 1 Hash Property and Fixed-rate Universal Coding Theorems Jun Muramatsu Member, IEEE, Shigeki Miyake Member, IEEE, Abstract arxiv:0804.1183v1 [cs.it 8 Apr 2008 The aim of this paper is to prove the achievability

More information

Capacity of a channel Shannon s second theorem. Information Theory 1/33

Capacity of a channel Shannon s second theorem. Information Theory 1/33 Capacity of a channel Shannon s second theorem Information Theory 1/33 Outline 1. Memoryless channels, examples ; 2. Capacity ; 3. Symmetric channels ; 4. Channel Coding ; 5. Shannon s second theorem,

More information

Network coding for multicast relation to compression and generalization of Slepian-Wolf

Network coding for multicast relation to compression and generalization of Slepian-Wolf Network coding for multicast relation to compression and generalization of Slepian-Wolf 1 Overview Review of Slepian-Wolf Distributed network compression Error exponents Source-channel separation issues

More information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information

Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information 204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener

More information

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels

Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:

More information

ECE 4400:693 - Information Theory

ECE 4400:693 - Information Theory ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential

More information

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 1. Cascade of Binary Symmetric Channels The conditional probability distribution py x for each of the BSCs may be expressed by the transition probability

More information

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory

Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I

More information

Lecture 3: Channel Capacity

Lecture 3: Channel Capacity Lecture 3: Channel Capacity 1 Definitions Channel capacity is a measure of maximum information per channel usage one can get through a channel. This one of the fundamental concepts in information theory.

More information

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014

Common Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014 Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2

More information

On Network Functional Compression

On Network Functional Compression On Network Functional Compression Soheil Feizi, Student Member, IEEE, Muriel Médard, Fellow, IEEE arxiv:0.5496v2 [cs.it] 30 Nov 200 Abstract In this paper, we consider different aspects of the problem

More information

Distributed Arithmetic Coding

Distributed Arithmetic Coding Distributed Arithmetic Coding Marco Grangetto, Member, IEEE, Enrico Magli, Member, IEEE, Gabriella Olmo, Senior Member, IEEE Abstract We propose a distributed binary arithmetic coder for Slepian-Wolf coding

More information

Joint Source-Channel Coding for the Multiple-Access Relay Channel

Joint Source-Channel Coding for the Multiple-Access Relay Channel Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il

More information

Lecture 4 Channel Coding

Lecture 4 Channel Coding Capacity and the Weak Converse Lecture 4 Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 15, 2014 1 / 16 I-Hsiang Wang NIT Lecture 4 Capacity

More information

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9

Katalin Marton. Abbas El Gamal. Stanford University. Withits A. El Gamal (Stanford University) Katalin Marton Withits / 9 Katalin Marton Abbas El Gamal Stanford University Withits 2010 A. El Gamal (Stanford University) Katalin Marton Withits 2010 1 / 9 Brief Bio Born in 1941, Budapest Hungary PhD from Eötvös Loránd University

More information

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications

The Duality Between Information Embedding and Source Coding With Side Information and Some Applications IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1159 The Duality Between Information Embedding and Source Coding With Side Information and Some Applications Richard J. Barron, Member,

More information

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels

A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels A Graph-based Framework for Transmission of Correlated Sources over Multiple Access Channels S. Sandeep Pradhan a, Suhan Choi a and Kannan Ramchandran b, a {pradhanv,suhanc}@eecs.umich.edu, EECS Dept.,

More information

Secret Key Agreement: General Capacity and Second-Order Asymptotics

Secret Key Agreement: General Capacity and Second-Order Asymptotics Secret Key Agreement: General Capacity and Second-Order Asymptotics Masahito Hayashi Himanshu Tyagi Shun Watanabe 1 Abstract We revisit the problem of secret key agreement using interactive public communication

More information

arxiv: v1 [cs.it] 19 Aug 2008

arxiv: v1 [cs.it] 19 Aug 2008 Distributed Source Coding using Abelian Group Codes arxiv:0808.2659v1 [cs.it] 19 Aug 2008 Dinesh Krithivasan and S. Sandeep Pradhan, Department of Electrical Engineering and Computer Science, University

More information

Channel Coding for Secure Transmissions

Channel Coding for Secure Transmissions Channel Coding for Secure Transmissions March 27, 2017 1 / 51 McEliece Cryptosystem Coding Approach: Noiseless Main Channel Coding Approach: Noisy Main Channel 2 / 51 Outline We present an overiew of linear

More information

CS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability

More information

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute

(each row defines a probability distribution). Given n-strings x X n, y Y n we can use the absence of memory in the channel to compute ENEE 739C: Advanced Topics in Signal Processing: Coding Theory Instructor: Alexander Barg Lecture 6 (draft; 9/6/03. Error exponents for Discrete Memoryless Channels http://www.enee.umd.edu/ abarg/enee739c/course.html

More information

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes

Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International

More information

Distributed Source Coding Using LDPC Codes

Distributed Source Coding Using LDPC Codes Distributed Source Coding Using LDPC Codes Telecommunications Laboratory Alex Balatsoukas-Stimming Technical University of Crete May 29, 2010 Telecommunications Laboratory (TUC) Distributed Source Coding

More information

Information Masking and Amplification: The Source Coding Setting

Information Masking and Amplification: The Source Coding Setting 202 IEEE International Symposium on Information Theory Proceedings Information Masking and Amplification: The Source Coding Setting Thomas A. Courtade Department of Electrical Engineering University of

More information

On Oblivious Transfer Capacity

On Oblivious Transfer Capacity On Oblivious Transfer Capacity Rudolph Ahlswede 1 and Imre Csiszár 2, 1 University of Bielefeld, Germany 2 Rényi Institute of Mathematics, Budapest, Hungary Abstract. Upper and lower bounds to the oblivious

More information

LECTURE 13. Last time: Lecture outline

LECTURE 13. Last time: Lecture outline LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Variable Length Codes for Degraded Broadcast Channels

Variable Length Codes for Degraded Broadcast Channels Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates

More information

Optimal matching in wireless sensor networks

Optimal matching in wireless sensor networks Optimal matching in wireless sensor networks A. Roumy, D. Gesbert INRIA-IRISA, Rennes, France. Institute Eurecom, Sophia Antipolis, France. Abstract We investigate the design of a wireless sensor network

More information

Hypothesis Testing with Communication Constraints

Hypothesis Testing with Communication Constraints Hypothesis Testing with Communication Constraints Dinesh Krithivasan EECS 750 April 17, 2006 Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 1 / 21 Presentation Outline

More information

Information Leakage of Correlated Source Coded Sequences over a Channel with an Eavesdropper

Information Leakage of Correlated Source Coded Sequences over a Channel with an Eavesdropper Information Leakage of Correlated Source Coded Sequences over a Channel with an Eavesdropper Reevana Balmahoon and Ling Cheng School of Electrical and Information Engineering University of the Witwatersrand

More information

Multiterminal Source Coding with an Entropy-Based Distortion Measure

Multiterminal Source Coding with an Entropy-Based Distortion Measure Multiterminal Source Coding with an Entropy-Based Distortion Measure Thomas Courtade and Rick Wesel Department of Electrical Engineering University of California, Los Angeles 4 August, 2011 IEEE International

More information

LDPC Code Design for Asynchronous Slepian-Wolf Coding

LDPC Code Design for Asynchronous Slepian-Wolf Coding 1 LDPC Code Design for Asynchronous Slepian-Wolf Coding Zhibin Sun, Chao Tian, Member, IEEE, Jun Chen, Member, IEEE, and Kon Max Wong, Fellow, IEEE Abstract We consider asynchronous Slepian-Wolf coding

More information

Equivalence for Networks with Adversarial State

Equivalence for Networks with Adversarial State Equivalence for Networks with Adversarial State Oliver Kosut Department of Electrical, Computer and Energy Engineering Arizona State University Tempe, AZ 85287 Email: okosut@asu.edu Jörg Kliewer Department

More information

Remote Source Coding with Two-Sided Information

Remote Source Coding with Two-Sided Information Remote Source Coding with Two-Sided Information Basak Guler Ebrahim MolavianJazi Aylin Yener Wireless Communications and Networking Laboratory Department of Electrical Engineering The Pennsylvania State

More information

Lecture 11: Polar codes construction

Lecture 11: Polar codes construction 15-859: Information Theory and Applications in TCS CMU: Spring 2013 Lecturer: Venkatesan Guruswami Lecture 11: Polar codes construction February 26, 2013 Scribe: Dan Stahlke 1 Polar codes: recap of last

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

On the Simulatability Condition in Key Generation Over a Non-authenticated Public Channel

On the Simulatability Condition in Key Generation Over a Non-authenticated Public Channel On the Simulatability Condition in Key Generation Over a Non-authenticated Public Channel Wenwen Tu and Lifeng Lai Department of Electrical and Computer Engineering Worcester Polytechnic Institute Worcester,

More information

Secret Key Agreement: General Capacity and Second-Order Asymptotics

Secret Key Agreement: General Capacity and Second-Order Asymptotics Secret Key Agreement: General Capacity and Second-Order Asymptotics Masahito Hayashi Himanshu Tyagi Shun Watanabe Abstract We revisit the problem of secret key agreement using interactive public communication

More information

Aalborg Universitet. Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes

Aalborg Universitet. Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes Aalborg Universitet Bounds on information combining for parity-check equations Land, Ingmar Rüdiger; Hoeher, A.; Huber, Johannes Published in: 2004 International Seminar on Communications DOI link to publication

More information

CONSIDER a joint stationary and memoryless process

CONSIDER a joint stationary and memoryless process 4006 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 55, NO 9, SEPTEMBER 2009 On the Duality Between Slepian Wolf Coding and Channel Coding Under Mismatched Decoding Jun Chen, Member, IEEE, Da-ke He, and

More information

Bounds on Mutual Information for Simple Codes Using Information Combining

Bounds on Mutual Information for Simple Codes Using Information Combining ACCEPTED FOR PUBLICATION IN ANNALS OF TELECOMM., SPECIAL ISSUE 3RD INT. SYMP. TURBO CODES, 003. FINAL VERSION, AUGUST 004. Bounds on Mutual Information for Simple Codes Using Information Combining Ingmar

More information

Cut-Set Bound and Dependence Balance Bound

Cut-Set Bound and Dependence Balance Bound Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel

Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org

More information

Keyless authentication in the presence of a simultaneously transmitting adversary

Keyless authentication in the presence of a simultaneously transmitting adversary Keyless authentication in the presence of a simultaneously transmitting adversary Eric Graves Army Research Lab Adelphi MD 20783 U.S.A. ericsgra@ufl.edu Paul Yu Army Research Lab Adelphi MD 20783 U.S.A.

More information

How to Achieve the Capacity of Asymmetric Channels

How to Achieve the Capacity of Asymmetric Channels How to Achieve the Capacity of Asymmetric Channels Marco Mondelli, S. Hamed Hassani, and Rüdiger Urbanke Abstract arxiv:406.7373v5 [cs.it] 3 Jan 208 We survey coding techniques that enable reliable transmission

More information

Entropies & Information Theory

Entropies & Information Theory Entropies & Information Theory LECTURE I Nilanjana Datta University of Cambridge,U.K. See lecture notes on: http://www.qi.damtp.cam.ac.uk/node/223 Quantum Information Theory Born out of Classical Information

More information

Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche. June 9, 2015

Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche. June 9, 2015 The Individual Secrecy Capacity of Degraded Multi-Receiver Wiretap Broadcast Channels Ahmed S. Mansour, Rafael F. Schaefer and Holger Boche Lehrstuhl für Technische Universität München, Germany Department

More information

The Capacity Region of a Class of Discrete Degraded Interference Channels

The Capacity Region of a Class of Discrete Degraded Interference Channels The Capacity Region of a Class of Discrete Degraded Interference Channels Nan Liu Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 74 nkancy@umd.edu

More information

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 074 pritamm@umd.edu

More information

Practical Coding Scheme for Universal Source Coding with Side Information at the Decoder

Practical Coding Scheme for Universal Source Coding with Side Information at the Decoder Practical Coding Scheme for Universal Source Coding with Side Information at the Decoder Elsa Dupraz, Aline Roumy + and Michel Kieffer, L2S - CNRS - SUPELEC - Univ Paris-Sud, 91192 Gif-sur-Yvette, France

More information

Chain Independence and Common Information

Chain Independence and Common Information 1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than

More information

Proceedings of the DATA COMPRESSION CONFERENCE (DCC 02) /02 $ IEEE

Proceedings of the DATA COMPRESSION CONFERENCE (DCC 02) /02 $ IEEE Compression with Side Information Using Turbo Codes Anne Aaron and Bernd Girod Information Systems Laboratory Stanford Universit y,stanford, CA 94305 amaaron@stanford.e du, bgir o d@stanford.e du Abstract

More information

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122

Lecture 5: Channel Capacity. Copyright G. Caire (Sample Lectures) 122 Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122 M Definitions and Problem Setup 2 X n Y n Encoder p(y x) Decoder ˆM Message Channel Estimate Definition 11. Discrete Memoryless Channel

More information

The Gallager Converse

The Gallager Converse The Gallager Converse Abbas El Gamal Director, Information Systems Laboratory Department of Electrical Engineering Stanford University Gallager s 75th Birthday 1 Information Theoretic Limits Establishing

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7

Electrical and Information Technology. Information Theory. Problems and Solutions. Contents. Problems... 1 Solutions...7 Electrical and Information Technology Information Theory Problems and Solutions Contents Problems.......... Solutions...........7 Problems 3. In Problem?? the binomial coefficent was estimated with Stirling

More information

SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003

SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003 SOURCE CODING WITH SIDE INFORMATION AT THE DECODER (WYNER-ZIV CODING) FEB 13, 2003 SLEPIAN-WOLF RESULT { X i} RATE R x ENCODER 1 DECODER X i V i {, } { V i} ENCODER 0 RATE R v Problem: Determine R, the

More information