Information-theoretic Secrecy: A Cryptographic Perspective


Stefano Tessaro, UC Santa Barbara. WCS 2017, April 30, 2017. Based on joint works with M. Bellare and A. Vardy.

Cryptography: computational assumptions (CRYPTO, EUROCRYPT, STOC, FOCS); information-theoretic cryptography: key agreement, privacy amplification, multi-party protocols.
Information theory & coding: physical assumptions, e.g., noise (ISIT, ITW, IEEE Trans. on IT); physical-layer security. Work continues to date.

Parallel dimensions: the curious cryptographer; generic constructions.

This talk in a nutshell: a cryptographic view on the wiretap channel model. (Though really, this extends to information-theoretic secrecy more broadly.)

M. Bellare, S. Tessaro and A. Vardy. Semantic Security for the Wiretap Channel. CRYPTO 2012.
M. Bellare, S. Tessaro and A. Vardy. A Cryptographic Treatment of the Wiretap Channel. Cryptology ePrint Archive 2012/15.
M. Bellare and S. Tessaro. Polynomial-time, Semantically-Secure Encryption Achieving the Secrecy Capacity. Cryptology ePrint Archive 2012/22.

Physical-layer Security. [Figure: a very low-power, very short-distance transmission (e.g., a credit card number, 010110...) arrives only as a degraded signal at a large distance.]

Wyner's Wiretap Channel [W75, CK78]

M → ENC → C → ChR (P_{Y|X}) → C′ → DEC → M′
           C → ChA (P_{Z|X}) → Z(M), where ChA is noisier than ChR

Message privacy: Z(M) gives no information about M.
Correctness: M′ = M with very high probability.

Example: Binary Symmetric Channels

A BSC_e flips each transmitted bit independently with probability e.
ChR = BSC_p, ChA = BSC_q, with p < q:
M → ENC → C → BSC_p → DEC → M′;  C → BSC_q → Z(M)

Other examples: BECs, Gaussian channels, ...
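A BSC is easy to simulate. The following sketch (my own illustration, not from the talk) passes the same bit string through a receiver channel BSC_p and a noisier adversary channel BSC_q:

```python
import random

def bsc(bits, p, rng):
    # Binary symmetric channel: flip each bit independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(1)
msg = [rng.randrange(2) for _ in range(10_000)]
received = bsc(msg, 0.05, rng)   # ChR = BSC_p with p = 0.05
tapped   = bsc(msg, 0.25, rng)   # ChA = BSC_q with q = 0.25 > p

err_r = sum(a != b for a, b in zip(msg, received)) / len(msg)
err_a = sum(a != b for a, b in zip(msg, tapped)) / len(msg)
```

With these parameters the empirical flip rates concentrate near 0.05 and 0.25, so the adversary's view is strictly noisier than the receiver's.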

Rate and capacity

M → ENC → C → ChR → DEC → M′;  C → ChA → Z(M)

Goal: maximize the rate R = |M| / |C|.
Capacity = best achievable rate.
Asymptotic setting (parameter = message length m): ChR / ChA have a finite alphabet and are used c(m) times.

Rate and capacity, cont'd

I(X; Y) = Σ_{x,y} P_XY(x,y) log( P_XY(x,y) / (P_X(x) P_Y(y)) ) = H(X) − H(X|Y)

If for all X we have I(X; ChR(X)) ≥ I(X; ChA(X)), then
C = max_{P_X} [ I(X; ChR(X)) − I(X; ChA(X)) ]

Issues: existential result + weak security [W75, CK78].
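The mutual information above can be computed directly from a joint distribution. A small sketch (my own illustration) checks it against the closed form I(X; Y) = 1 − h(q) for a BSC_q with uniform input:

```python
import math

def mutual_information(p_xy):
    # I(X;Y) = sum_{x,y} P(x,y) * log2( P(x,y) / (P(x) * P(y)) ), in bits.
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

# Uniform bit X sent through BSC_q with q = 0.1: joint law of (input, output).
q = 0.1
joint = {(0, 0): 0.5 * (1 - q), (0, 1): 0.5 * q,
         (1, 0): 0.5 * q,       (1, 1): 0.5 * (1 - q)}
# For a BSC with uniform input, I(X;Y) = 1 - h(q) ≈ 0.531 bits.
```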

Outline
1. Security metrics for the wiretap channel
2. Generic construction of a capacity-achieving scheme
3. Open directions

Secrecy Metrics

Traditional secrecy notions: based on Shannon metrics and asymptotic. M_m = uniform m-bit message; ENC works on arbitrary-length messages.

Weak secrecy: lim_{m→∞} I(M_m; Z(M_m)) / m = 0   (a weak notion: 1/m vs 2^{−m})
Strong secrecy: lim_{m→∞} I(M_m; Z(M_m)) = 0

How secure is a scheme?

Many cryptographers take a quantitative approach to security: an advantage, a real number.
Example: Adv^mis-r(ENC; ChA) = I(M; Z(M))

From now on: one-shot. The advantage could later depend on a security parameter (e.g., the message length); security then means the advantage is small as a function of that parameter.
Next: which quantity is most suitable? Later: what about M being uniform?

Statistical distance

Definition. The statistical distance of X and Y is
SD(X, Y) = (1/2) Σ_x |P_X(x) − P_Y(x)| = (1/2) ||P_X − P_Y||_1

Distinguishing advantage: SD(X, Y) = max_D |Pr[D(X) = 1] − Pr[D(Y) = 1]|, over distinguishers D with 0/1 output.
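The definition translates directly into code. A minimal sketch (my own illustration) over finite distributions given as dictionaries:

```python
def statistical_distance(p, q):
    # SD(X, Y) = (1/2) * sum_x |P_X(x) - P_Y(x)|,
    # i.e. the best possible distinguishing advantage.
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

p = {'a': 0.5, 'b': 0.5}
q = {'a': 0.8, 'b': 0.1, 'c': 0.1}
# SD = 0.5 * (0.3 + 0.4 + 0.1) = 0.4; the optimal distinguisher outputs 1
# exactly on the set where q puts more mass, here {'a', 'c'}.
```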

RDS security

KL(X ‖ Y) = Σ_x P_X(x) log( P_X(x) / P_Y(x) )

M′ = an independent uniform message.
Adv^rds(ENC; ChA) = SD( (M, Z(M)); (M, Z(M′)) )
Adv^mis-r(ENC; ChA) = KL( (M, Z(M)) ‖ (M, Z(M′)) )
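KL divergence in bits, computed the same way (my own illustrative sketch; it assumes supp(P) ⊆ supp(Q)):

```python
import math

def kl_divergence(p, q):
    # KL(P || Q) = sum_x P(x) * log2( P(x) / Q(x) ), in bits.
    # Terms with P(x) = 0 contribute 0; requires supp(P) ⊆ supp(Q).
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

p = {'a': 0.5, 'b': 0.5}
q = {'a': 0.75, 'b': 0.25}
# KL(p||q) = 0.5*log2(2/3) + 0.5*log2(2) ≈ 0.2075 bits.
```

Unlike SD, KL is not symmetric: KL(p‖q) and KL(q‖p) differ.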

Example: Guessing

With Z(M): the adversary outputs a guess M̃, and p_g = Pr[M̃ = M].
Without Z(M), i.e., given Z(M′) for an independent uniform M′: p′_g = Pr[M̃ = M] = 2^{−m}.

p_g − p′_g ≤ Adv^rds(ENC; ChA) ≤ δ, hence p_g ≤ δ + 2^{−m}.

Semantic security, first contact: for any f, guessing f(M) from Z(M) is not (substantially) easier than without knowing Z(M)! Examples of f: the identity; the first or last bit of the message; a subset of the message bits.

What about MIS-R security?

p_g = Pr[M̃ = M] is hard to estimate from I(M; Z(M)) ≤ δ.
Fano's inequality, H(M | Z(M)) ≤ h(p_e) + p_e log(2^m − 1), gives p_g ≤ (1 + δ)/m.
(Better estimates are possible, but hard to work with.)

Relations [BTV12]

Theorem (Pinsker's inequality). Adv^rds(ENC; ChA) ≤ sqrt( Adv^mis-r(ENC; ChA) )
Caveat: generally not tight! Exponents matter.

Theorem. For δ = Adv^rds(ENC; ChA):  Adv^mis-r(ENC; ChA) ≤ 2δ log(2^c / δ)   (tight)
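The Pinsker direction can be sanity-checked numerically. With base-2 logarithms the standard form is SD ≤ sqrt(KL · ln 2 / 2), which implies the weaker SD ≤ sqrt(KL) stated above; a small sketch (my own illustration) checks the sharper form on a few distribution pairs:

```python
import math

def sd(p, q):
    # Statistical distance between two finite distributions.
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in set(p) | set(q))

def kl(p, q):
    # KL divergence in bits (supp(P) ⊆ supp(Q) assumed).
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

pairs = [({'a': 0.5,  'b': 0.5},  {'a': 0.6, 'b': 0.4}),
         ({'a': 0.9,  'b': 0.1},  {'a': 0.5, 'b': 0.5}),
         ({'a': 0.99, 'b': 0.01}, {'a': 0.9, 'b': 0.1})]
# Pinsker with base-2 logs: SD(P, Q) <= sqrt(KL(P||Q) * ln(2) / 2).
pinsker_holds = all(sd(p, q) <= math.sqrt(kl(p, q) * math.log(2) / 2) + 1e-12
                    for p, q in pairs)
```

On the first pair the two sides are close (0.100 vs ≈ 0.101), showing the bound can be nearly tight; on the others it is loose, which is the "exponents matter" caveat.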

Proof of the 2nd theorem

First, show that for any c-bit X, Y with SD(X, Y) ≤ ε:  |H(X) − H(Y)| ≤ 2ε log(2^c / ε).

Then, note:  I(M; Z(M)) = (1/2^m) Σ_{m ∈ {0,1}^m} ( H(Z(M)) − H(Z(m)) ).

Let ε_m = SD(Z(M), Z(m)) and ε = (1/2^m) Σ_m ε_m.
Easy to see:  H(Z(M)) − H(Z(m)) ≤ 2 ε_m log(2^c / ε_m).
Then, by concavity:  I(M; Z(M)) ≤ (1/2^m) Σ_m 2 ε_m log(2^c / ε_m) ≤ 2ε log(2^c / ε).

Lessons learnt

The above only advocates SD-based metrics as a target. MIS security is asymptotically a good privacy metric, but substantial quantitative losses are possible.

Note: sometimes Shannon entropy / KL divergence are valuable tools, even when stating end results in terms of SD. E.g., for independent pairs: KL(X_4 X_5 ‖ Y_4 Y_5) = KL(X_4 ‖ Y_4) + KL(X_5 ‖ Y_5).

Random plaintext distribution

In Adv^rds and Adv^mis-r, M is random and uniform.

Common argument: "If data isn't uniform, then just run a compression algorithm to reduce it to a random string with length equal to its entropy!" Not true:
- Data may not have entropy to start with!
- Universal compression is not possible.

Goldwasser-Micali, 1982: security must hold for all distributions of the plaintext.

Issues with RDS security

Enc′(M) = Enc(M) if M ∉ {0^m, 1^m};  Enc′(0^m) = 0^n;  Enc′(1^m) = 1^n.

Enc RDS secure ⇒ Enc′ RDS secure. What if we only ever encrypt 0^m and 1^m?
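The counterexample is concrete enough to code. A toy sketch (my own illustration, ignoring decodability details): Enc is a one-time pad, and Enc′ special-cases the two corner messages. A uniformly random message hits the special case with probability only 2^{1−m}, yet encryptions of 0^m and 1^m are trivially distinguishable:

```python
import random

M_BITS = 16
ALL_ONES = (1 << M_BITS) - 1

def enc_prime(msg, key):
    # Toy Enc': a one-time pad, except the two corner messages
    # are sent in the clear.
    if msg == 0:
        return 0
    if msg == ALL_ONES:
        return ALL_ONES
    return msg ^ key

rng = random.Random(0)
c0 = enc_prime(0, rng.randrange(1 << M_BITS))
c1 = enc_prime(ALL_ONES, rng.randrange(1 << M_BITS))
# c0 and c1 reveal the message exactly, so distinguishing security fails
# badly, even though a uniform message leaks with probability 2/2^16 only.
```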

Distinguishing and semantic security

Adv^ds(ENC; ChA) = max_{M_0, M_1} SD( Z(M_0); Z(M_1) )

Equivalent to semantic security: for all f and distributions P_M, computing f(M) given Z(M) is not easier than computing f(M) without Z(M), where M ← P_M.

Adv^ss(Enc; ChA) = max_{f, P_M} [ 2^{−H_∞(f(M) | Z(M))} − 2^{−H_∞(f(M))} ]

Theorem. Adv^ss(Enc; ChA) ≤ Adv^ds(Enc; ChA) ≤ 2 · Adv^ss(Enc; ChA)

MIS security

Adv^mis(ENC; ChA) = max_{P_M} I(M; Z(M))

Theorem. Adv^ds(ENC; ChA) ≤ sqrt( Adv^mis(ENC; ChA) )
Theorem. For δ = Adv^ds(ENC; ChA):  Adv^mis(ENC; ChA) ≤ 2δ log(2^c / δ)

From RDS to DS security

Key agreement: a random session key K is sent via ENC over ChR; the receiver decodes K, the adversary sees Z(K).
One-time pad: M ⊕ K is encoded with a good code (ECC) for ChR; the receiver decodes M ⊕ K and recovers M, while the adversary sees Z(M ⊕ K).

Problem: worse rate than in the RDS case!
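The one-time-pad step is the classical construction. A minimal sketch (my own illustration), with a locally sampled key standing in for the session key K from the key-agreement phase:

```python
import random

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    # One-time pad: with a uniform key as long as the message, the
    # ciphertext distribution is independent of the message.
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

rng = random.Random(42)                              # deterministic demo only
key = bytes(rng.randrange(256) for _ in range(11))   # stands in for session key K
ct = otp_encrypt(b"attack at 9", key)
pt = otp_encrypt(ct, key)                            # decryption is the same XOR
```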

Constructions

Next: a construction
- Generic construction: the analysis does not depend on details of the underlying ECC (unlike, e.g., [MV10])
- Admits poly-time encryption and decryption
- Achieves SS/DS security
- Achieves capacity in interesting scenarios
- Generalizes previous constructions, which had no proofs of DS security [W75, HM10]

First semantically-secure capacity-achieving construction with efficient poly-time encryption + decryption.

Seeded encryption

SeedGen outputs a public seed S, available to all parties: M → ENC_S → C → ChR → DEC_S → M′;  C → ChA → Z(M).
The seed can be recycled, and sent as part of the ciphertext.

Seeded-encryption scheme

ENC_S(M): the m-bit message M is padded with k − m random bits to a k-bit block X, which is then encoded by E into the n-bit ciphertext C.

Abstraction: invert a randomness extractor on seed S and output M; here the extractor is multiplication by the public seed S in GF(2^k). E is poly-time, injective, and linear.
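A toy rendition of the invert-the-extractor idea (my own illustration), shrunk to GF(2^8) with the AES reduction polynomial; the actual scheme uses a large k and additionally encodes X with the ECC E, and placing M in the top bits is my assumption:

```python
import random

K, M_BITS = 8, 4              # toy sizes: field GF(2^8), 4-bit messages
POLY = 0x11B                  # x^8 + x^4 + x^3 + x + 1 (AES polynomial)

def gf_mul(a, b):
    # Carry-less multiply with interleaved reduction mod POLY.
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << K):
            a ^= POLY
        b >>= 1
    return r

def gf_inv(a):
    # a^(2^K - 2) = a^{-1} in GF(2^K), for a != 0 (square-and-multiply).
    r, e = 1, (1 << K) - 2
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

def ext(s, x):
    # Seeded extractor: top M_BITS bits of S * X in GF(2^K).
    return gf_mul(s, x) >> (K - M_BITS)

def enc_map(s, msg, rng):
    # "Invert the extractor": pick X uniformly with ext(s, X) = msg.
    r = rng.randrange(1 << (K - M_BITS))
    return gf_mul(gf_inv(s), (msg << (K - M_BITS)) | r)

rng = random.Random(3)
s = rng.randrange(1, 1 << K)  # public nonzero seed
x = enc_map(s, 0b1010, rng)   # ext(s, x) recovers 0b1010
```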

Conditional min-entropy

H_∞(X | Z) = −log( Σ_z max_x Pr[X = x ∧ Z = z] )

Example: ChA = BSC_q^n:  H_∞(X | Z) ≥ k − n (1 − log(1/(1−q)))
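The definition evaluated on a toy joint distribution (my own illustration): a single uniform bit tapped through BSC_0.1, i.e. k = n = 1:

```python
import math

def cond_min_entropy(p_xz):
    # H_inf(X | Z) = -log2( sum_z max_x Pr[X = x, Z = z] )
    best = {}
    for (x, z), p in p_xz.items():
        best[z] = max(best.get(z, 0.0), p)
    return -math.log2(sum(best.values()))

# X a uniform bit, Z = BSC_0.1(X): joint law of (X, Z).
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
# sum_z max_x = 0.45 + 0.45 = 0.9, so H_inf(X|Z) = -log2(0.9) ≈ 0.152 bits,
# which matches k - n*(1 - log2(1/(1-q))) with equality in this tiny case.
```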

Smooth min-entropy [RW04]

H_∞^ε(X | Z) = sup_{X′Z′ : SD(X′Z′; XZ) ≤ ε} H_∞(X′ | Z′)

Example: ChA = BSC_q^n:  H_∞^ε(X | Z) ≥ k − n (1 − h(q) + o(1))  for ε = 2^{−O(√n)}

Note: 1 − h(q) ≤ 1 − log(1/(1−q)), so this improves the previous bound.

Smooth min-entropy, cont'd. [Figure omitted: a derivation sketch involving the quantity nq + o(1).]

Seeded encryption security

Theorem. [BT12, BTV12] If ChA is symmetric and H_∞^ε(X | Z) ≥ m + 2 log(1/ε), then Adv^ds(ENC; ChA) = O(ε).

For ChA = BSC_q^n, ChR = BSC_p^n, and some ε = 2^{−O(√n)}:
- Best possible k to allow for decryption over ChR:  k = (1 − h(p) − o(1)) n
- Then  H_∞^ε(X | Z) ≥ n (h(q) − h(p) − o(1))
- Largest possible message size:  m = (h(q) − h(p) − o(1)) n  — optimal rate!
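Plugging in concrete numbers (my own illustrative sketch): for ChR = BSC_0.05 and ChA = BSC_0.25, the achievable rate h(q) − h(p) evaluates to about 0.52 message bits per channel use:

```python
import math

def h(x):
    # Binary entropy function, in bits.
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

p, q = 0.05, 0.25            # ChR = BSC_p (receiver), ChA = BSC_q (adversary)
secrecy_rate = h(q) - h(p)   # message bits per channel use, up to o(1) terms
```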

Proof: two steps
1. Prove RDS security:  SD( (Z(M), S, M); (Z(M), S, M′) ) ≤ O(ε)
2. From RDS to DS security

Proof: RDS security

Real world: M → S⁻¹ → X → E → C → ChA → Z.  Ideal world: the same pipeline on an independent uniform M′.
By the Leftover Hash Lemma [BBR88, ILL89, BBCM95]: since H_∞^ε(X | Z) ≥ m + 2 log(1/δ), the extracted m bits are δ-close to uniform given (S, Z).

From RDS to DS security

In general: random-message security does not imply DS security.

Lemma. If ChA is symmetric, then ENC is DS secure.

Proof idea: Z_S(M) is symmetric, and for all S and M: SD( Z_S(M); Z_S($) ) ≤ Δ, where $ is a uniform message; hence the leakage distributions of any two messages are within 2Δ.

Extensions

The above only achieves capacity for limited channels: those with ChA($) = $, i.e., mapping uniform inputs to uniform outputs.
- Extension to arbitrary symmetric channels [TV13]
- Alternative: better estimates of smooth min-entropy?
- [C15] A new soft-covering lemma, used to obtain an existential proof that the rate is achievable in the semantic-security regime!

Conclusions and Open questions

Open questions: a crypto wish list

Concrete parameters. Given ChA, ChR, message length m, and security level ε, find Enc with the smallest possible ciphertext length n such that Adv^ds(Enc; ChA) ≤ ε.

Cryptanalysis. Do physical assumptions really hold?

Thank you! Merci!