Secret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi, Himanshu Tyagi, Shun Watanabe

Two party secret key agreement [Maurer '93, Ahlswede-Csiszár '93]

[Figure: terminals observing $X$ and $Y$ communicate over a public channel with transcript $F$, and output key estimates $K_x$ and $K_y$.]

A random variable $K$ constitutes an $(\epsilon, \delta)$-SK if:
$$P(K_x = K_y = K) \ge 1 - \epsilon \quad \text{(recoverability)}$$
$$\tfrac{1}{2}\,\| P_{KF} - P_{\mathrm{unif}} \times P_F \|_1 \le \delta \quad \text{(security)}$$

What is the maximum length $S(X, Y)$ of an SK that can be generated?
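
To make the two criteria concrete, here is a minimal numerical sketch (a hypothetical toy distribution of my own, not from the talk) that evaluates the security index, i.e. the variational distance between the joint distribution $P_{KF}$ and the ideal $P_{\mathrm{unif}} \times P_F$:

```python
import numpy as np

def security_index(p_kf: np.ndarray) -> float:
    """Security index (1/2) * ||P_KF - P_unif x P_F||_1.

    p_kf: joint pmf of (key, transcript), shape (|K|, |F|).
    """
    num_keys = p_kf.shape[0]
    p_f = p_kf.sum(axis=0)                       # transcript marginal P_F
    ideal = np.outer(np.full(num_keys, 1.0 / num_keys), p_f)
    return 0.5 * np.abs(p_kf - ideal).sum()

# Toy example: a 1-bit key that is nearly uniform and nearly independent of F.
p_kf = np.array([[0.26, 0.25],
                 [0.24, 0.25]])
print(security_index(p_kf))                      # small delta => good secrecy
```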

Where do we stand?

Maurer '93, Ahlswede-Csiszár '93: $S(X^n, Y^n) = nI(X \wedge Y) + o(n)$ (secret key capacity); converse via Fano's inequality.

Csiszár-Narayan '04: secret key capacity for multiple terminals; converse via Fano's inequality.

Renner-Wolf '03, '05: single-shot bounds on $S(X, Y)$; converse via a potential function method.

Typical construction: $X$ sends a compressed version of itself to $Y$, and the key $K$ is extracted from the shared $X$ using a 2-universal hash family (a minimal sketch of such a family follows below).

Converse??
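
The extraction step above relies on 2-universality: for any $x \ne x'$, a randomly drawn hash collides on them with probability at most $1/|\mathcal{K}|$. A minimal sketch (my own illustration, not the construction of any particular paper) using random linear maps over GF(2):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_linear_hash(n_in: int, n_out: int):
    """Draw a member of the 2-universal family {x -> Ax over GF(2)}."""
    A = rng.integers(0, 2, size=(n_out, n_in), dtype=np.uint8)
    def h(x_bits: np.ndarray) -> np.ndarray:
        return (A @ x_bits) % 2                  # matrix-vector product mod 2
    return h

# Extract a 3-bit key from a 10-bit observation.
h = random_linear_hash(n_in=10, n_out=3)
x = rng.integers(0, 2, size=10, dtype=np.uint8)
print(h(x))
```

For $x \ne x'$, $A(x \oplus x')$ is uniform over $\{0,1\}^{n_{\mathrm{out}}}$, so the collision probability is exactly $2^{-n_{\mathrm{out}}}$, which is the 2-universal property.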

Converse: Conditional independence testing bound

The source of our rekindled excitement about this problem:

Theorem (Tyagi-Watanabe 2014). Given $\epsilon, \delta \ge 0$ with $\epsilon + \delta < 1$ and $0 < \eta < 1 - \epsilon - \delta$, it holds that
$$S_{\epsilon,\delta}(X, Y) \le -\log \beta_{\epsilon+\delta+\eta}(P_{XY}, P_X \times P_Y) + 2 \log(1/\eta),$$
where
$$\beta_\epsilon(P, Q) \triangleq \inf_{T \,:\, P[T] \ge 1-\epsilon} Q[T], \qquad P[T] = \sum_v P(v)\, T(0|v), \qquad Q[T] = \sum_v Q(v)\, T(0|v).$$

In the spirit of the meta-converse of Polyanskiy, Poor, and Verdú.
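
For finite alphabets, the optimal test in $\beta_\epsilon(P, Q)$ is given by the Neyman-Pearson lemma: accept outcomes in decreasing order of the likelihood ratio $P/Q$ until $P$-mass $1-\epsilon$ is covered, randomizing on the boundary outcome. A minimal sketch (my own illustration):

```python
import numpy as np

def beta(eps: float, p: np.ndarray, q: np.ndarray) -> float:
    """Minimum Q-probability of acceptance over tests with P-acceptance >= 1 - eps."""
    order = np.argsort(-(p / q))         # decreasing likelihood ratio P/Q
    p_sorted, q_sorted = p[order], q[order]
    target, covered, cost = 1.0 - eps, 0.0, 0.0
    for pi, qi in zip(p_sorted, q_sorted):
        if covered + pi < target:        # accept this outcome fully
            covered += pi
            cost += qi
        else:                            # randomize on the boundary outcome
            cost += qi * (target - covered) / pi
            break
    return cost

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(beta(0.1, p, q))                   # 0.75 for this toy pair
```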

Single-shot achievability?

Recall the two steps of SK agreement:

Step 1 (aka information reconciliation): a Slepian-Wolf code to send $X$ to $Y$.

Step 2 (aka randomness extraction, or privacy amplification): a random function $K$ to extract uniform random bits from $X$ as $K(X)$.

Example. For $(X, Y) \equiv (X^n, Y^n)$:
rate of communication in Step 1 $= H(X|Y) = H(X) - I(X \wedge Y)$;
rate of randomness extraction in Step 2 $= H(X)$.
The difference is the secret key capacity.

Are we done? Not quite. Let's take a careful look.
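
A quick numerical check of these two rates (a toy example of my own): take $X \sim \mathrm{Bern}(1/2)$ and let $Y$ be $X$ observed through a binary symmetric channel with crossover probability 0.1.

```python
import numpy as np

def h2(p: float) -> float:
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

crossover = 0.1                 # Y = X xor Bern(0.1), X ~ Bern(1/2)
H_X = 1.0                       # H(X)
H_X_given_Y = h2(crossover)     # H(X|Y) for this symmetric source
I_XY = H_X - H_X_given_Y        # I(X ^ Y): the SK capacity

print(f"Step 1 rate  H(X|Y) = {H_X_given_Y:.3f} bits")
print(f"Step 2 rate  H(X)   = {H_X:.3f} bits")
print(f"SK rate      I(X^Y) = {I_XY:.3f} bits")
```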

Step 1: Slepian-Wolf theorem [Miyake-Kanaya '95, Han '03]

Lemma (Slepian-Wolf coding). There exists a code $(e, d)$ of size $M$, with encoder $e : \mathcal{X} \to \{1, \ldots, M\}$ and decoder $d : \{1, \ldots, M\} \times \mathcal{Y} \to \mathcal{X}$, such that
$$P_{XY}(\{(x, y) : x \ne d(e(x), y)\}) \le P_{XY}\big(\{(x, y) : -\log P_{X|Y}(x|y) \ge \log M - \gamma\}\big) + 2^{-\gamma}.$$

Note that $-\log P_{X|Y} = -\log P_X - \log(P_{Y|X}/P_Y)$; compare with $H(X|Y) = H(X) - I(X \wedge Y)$. The second term is a proxy for the mutual information.

The communication rate needed is approximately (a large-probability upper bound on $-\log P_X(X)$) minus $\log(P_{Y|X}/P_Y)$.
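
The right-hand side of the lemma is easy to evaluate for a small source. A minimal sketch (hypothetical toy joint pmf of my own) computing the spectrum term and the residual $2^{-\gamma}$:

```python
import numpy as np

def sw_error_bound(p_xy: np.ndarray, log_m: float, gamma: float) -> float:
    """Upper bound of the lemma: P(-log P_{X|Y} >= log M - gamma) + 2^{-gamma}."""
    p_y = p_xy.sum(axis=0)
    p_x_given_y = p_xy / p_y                 # entry (x, y) is P_{X|Y}(x|y)
    spectrum = -np.log2(p_x_given_y)         # conditional information spectrum
    tail = p_xy[spectrum >= log_m - gamma].sum()
    return tail + 2.0 ** (-gamma)

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(sw_error_bound(p_xy, log_m=1.0, gamma=0.5))
```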

Step 2: Leftover hash lemma

Lesson from Step 1: the communication rate is approximately (a large-probability upper bound on $-\log P_X(X)$) minus $\log(P_{Y|X}/P_Y)$.

Recall that the min-entropy of $X$ is given by
$$H_{\min}(P_X) = -\log \max_x P_X(x).$$

Lemma (Leftover hash) [Impagliazzo et al. '89, Bennett et al. '95, Renner-Wolf '05]. There exists a function $K$ of $X$, taking values in $\mathcal{K}$, such that
$$\| P_{KZ} - P_{\mathrm{unif}} \times P_Z \|_1 \le \sqrt{|\mathcal{K}|\, |\mathcal{Z}|\, 2^{-H_{\min}(P_X)}}.$$

Randomness can thus be extracted at a rate approximately equal to (a large-probability lower bound on $-\log P_X(X)$). The random variable $-\log P_X(X)$ is the information spectrum of $X$, and the gap between the two bounds on it is the loss in SK rate.
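
To see the lemma in numbers, here is a minimal sketch (a toy example of my own) computing $H_{\min}(P_X)$ and the largest key length for which the lemma keeps the security index $\tfrac{1}{2}\|P_{KZ} - P_{\mathrm{unif}} \times P_Z\|_1$ below a target $\delta$:

```python
import numpy as np

def h_min(p_x: np.ndarray) -> float:
    """Min-entropy in bits: -log2 max_x P_X(x)."""
    return -np.log2(p_x.max())

def max_key_bits(h_min_bits: float, n_z: int, delta: float) -> int:
    """Largest k with (1/2) * sqrt(2^k * n_z * 2^{-h_min}) <= delta."""
    return int(np.floor(h_min_bits - np.log2(n_z) + 2 * np.log2(2 * delta)))

p_x = np.full(1 << 20, 2.0 ** -20)       # 20 uniform bits
print(h_min(p_x))                        # 20.0
print(max_key_bits(h_min(p_x), n_z=4, delta=0.01))   # about 18 - 11.3 -> 6
```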

Spectrum slicing

[Figure: the spectrum of $X$, i.e. the distribution of $-\log P_X(X)$, supported between $\lambda_{\min}$ and $\lambda_{\max}$ and cut into slices of width $\Delta$.]

Slice the spectrum of $X$ into $L$ bins of length $\Delta$ and send the bin number to $Y$.
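
A minimal sketch (my own illustration) of the slicing map: compute the spectrum value $-\log P_X(x)$ and report which of the $L$ slices it falls in.

```python
import numpy as np

def slice_index(p_x: np.ndarray, x: int,
                lam_min: float, lam_max: float, L: int) -> int:
    """Bin number of -log P_X(x) among L equal slices of [lam_min, lam_max).

    Returns -1 when the spectrum value falls outside the sliced range
    (the out-of-range error event in the achievability theorem below)."""
    spectrum = -np.log2(p_x[x])
    if not (lam_min <= spectrum < lam_max):
        return -1
    width = (lam_max - lam_min) / L          # slice length Delta
    return int((spectrum - lam_min) // width)

p_x = np.array([0.4, 0.3, 0.2, 0.1])
for x in range(4):
    print(x, slice_index(p_x, x, lam_min=1.0, lam_max=4.0, L=3))
```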

Single-shot achievability

Theorem. For every $\gamma > 0$ and $0 \le \lambda \le \lambda_{\min}$, there exists an $(\epsilon, \delta)$-SK $K$ taking values in $\mathcal{K}$ with
$$\epsilon \le P\left( \log \frac{P_{XY}(X, Y)}{P_X(X) P_Y(Y)} \le \lambda + \gamma \right) + 2^{-\gamma} + P\big( -\log P_X(X) \notin (\lambda_{\min}, \lambda_{\max}) \big) + \frac{1}{L},$$
$$\delta \le \frac{1}{2} \sqrt{|\mathcal{K}|\, 2^{-(\lambda - 2\log L)}}.$$

Secret key capacity for general sources

Consider a sequence of sources $(X_n, Y_n)$. The SK capacity $C$ is defined as
$$C \triangleq \sup_{\{(\epsilon_n, \delta_n)\}} \liminf_{n \to \infty} \frac{1}{n} S_{\epsilon_n, \delta_n}(X_n, Y_n),$$
where the sup is over all $\epsilon_n, \delta_n \ge 0$ such that $\lim_{n \to \infty} (\epsilon_n + \delta_n) = 0$.

The inf-mutual information rate $\underline{I}(\mathbf{X} \wedge \mathbf{Y})$ is defined as
$$\underline{I}(\mathbf{X} \wedge \mathbf{Y}) \triangleq \sup \left\{ \alpha : \lim_{n \to \infty} P(Z_n < \alpha) = 0 \right\},$$
where
$$Z_n = \frac{1}{n} \log \frac{P_{X_n Y_n}(X_n, Y_n)}{P_{X_n}(X_n) P_{Y_n}(Y_n)}.$$
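
For IID sources the inf-mutual information rate reduces to Shannon's mutual information; a short check (standard reasoning, written out here for concreteness):

```latex
% For (X_n, Y_n) = (X^n, Y^n) IID with per-letter distribution P_{XY}:
Z_n = \frac{1}{n} \sum_{i=1}^{n}
      \log \frac{P_{XY}(X_i, Y_i)}{P_X(X_i) P_Y(Y_i)}
      \xrightarrow{\;P\;} I(X \wedge Y)
% by the law of large numbers, since the summands are IID with mean
% I(X ^ Y). For every alpha < I(X ^ Y) we get P(Z_n < alpha) -> 0, while
% for alpha > I(X ^ Y) the probability tends to 1; hence
\underline{I}(\mathbf{X} \wedge \mathbf{Y}) = I(X \wedge Y).
```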

General capacity

Theorem (Secret key capacity). The SK capacity $C$ for a sequence of sources $\{(X_n, Y_n)\}_{n=1}^{\infty}$ is given by
$$C = \underline{I}(\mathbf{X} \wedge \mathbf{Y}).$$

Converse. Follows from our conditional independence testing bound together with:

Lemma (Verdú). For every $\epsilon_n$ such that $\lim_{n \to \infty} \epsilon_n = 0$, it holds that
$$\liminf_{n \to \infty} -\frac{1}{n} \log \beta_{\epsilon_n}(P_{X_n Y_n}, P_{X_n} \times P_{Y_n}) \ge \underline{I}(\mathbf{X} \wedge \mathbf{Y}).$$

Achievability. Use the single-shot construction with
$$\lambda_{\max} = n\big(\overline{H}(\mathbf{X}) + \Delta\big), \qquad \lambda_{\min} = n\big(\underline{H}(\mathbf{X}) - \Delta\big), \qquad \lambda = n\big(\underline{I}(\mathbf{X} \wedge \mathbf{Y}) - \Delta\big),$$
where $\overline{H}(\mathbf{X})$ and $\underline{H}(\mathbf{X})$ are the sup- and inf-entropy rates of $\mathbf{X}$ and $\Delta > 0$ is arbitrary.

Towards characterizing finite-blocklength performance

We identify the second term in the asymptotic expansion of $S(X^n, Y^n)$:

Theorem (Second-order asymptotics). For every $0 < \epsilon < 1$ and IID RVs $X^n, Y^n$, we have
$$S_\epsilon(X^n, Y^n) = nI(X \wedge Y) - \sqrt{nV}\, Q^{-1}(\epsilon) + o(\sqrt{n}).$$
The quantity $V$ is given by
$$V = \mathrm{Var}\left[ \log \frac{P_{XY}(X, Y)}{P_X(X) P_Y(Y)} \right].$$

The proof relies on the Berry-Esseen theorem, as in Polyanskiy-Poor-Verdú '10.

What about $S_{\epsilon,\delta}(X^n, Y^n)$?
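
A minimal sketch (my own toy example, reusing the BSC-correlated source from earlier) of the resulting finite-blocklength estimate $nI - \sqrt{nV}\, Q^{-1}(\epsilon)$; here $Q^{-1}$ is the inverse Gaussian complementary cdf, available as `norm.isf`:

```python
import numpy as np
from scipy.stats import norm

# Doubly symmetric binary source: X ~ Bern(1/2), Y = X xor Bern(0.1).
p = 0.1
p_xy = np.array([[(1 - p) / 2, p / 2],
                 [p / 2, (1 - p) / 2]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

density = np.log2(p_xy / np.outer(p_x, p_y))     # log P_XY / (P_X P_Y)
I = (p_xy * density).sum()                       # mutual information (bits)
V = (p_xy * (density - I) ** 2).sum()            # its variance

n, eps = 10_000, 0.01
approx = n * I - np.sqrt(n * V) * norm.isf(eps)  # Q^{-1}(eps) = norm.isf(eps)
print(f"S_eps(X^n, Y^n) ~ {approx:.0f} bits at n = {n}")
```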

Looking ahead...

What if the eavesdropper has side information $Z$? The best known converse bound on the SK capacity is due to Gohari-Anantharam '08, and recently we obtained a single-shot version of this bound:

Tyagi and Watanabe, "Converses for Secret Key Agreement and Secure Computing," preprint, arXiv:1404.5715, 2014.

Also, we have a single-shot achievability scheme that is asymptotically tight when $X, Y, Z$ form a Markov chain.