Lecture 9 - One Way Permutations


Boaz Barak

October 17, 2007

"From time immemorial, humanity has gotten frequent, often cruel, reminders that many things are easier to do than to reverse." -- Leonid Levin

Quick review. Homework 2.

Minimizing assumptions

Up to now we have always assumed the following PRG Axiom: there exists a pseudorandom generator mapping {0,1}^n to {0,1}^{n+1}. In other words, we believe that there is an algorithm G for which the following task cannot be done: distinguish G(U_n) from U_{n+1}. Since we still do not have a proof that this cannot be done, our only evidence that the task is hard is that many people have tried many approaches to solve it and did not succeed. The problem is that while people have been studying algorithms in one form or another for thousands of years, far less attention has been devoted to distinguishing the output of a function from the uniform distribution. Therefore, we want an assumption stating that a more natural task, such as computing a function, is hard to do.

Definition. We say that a function g : {0,1}^* -> {0,1}^* is hard to compute (in the average case) if for every polynomial-time algorithm A, polynomially-bounded ε, and large enough n,

    \Pr_{x \leftarrow_R \{0,1\}^n}[A(x) = g(x)] < \epsilon(n).

Claim: There exists a hard-to-compute function g such that g maps {0,1}^n to {0,1}^n for every n.

Proof: Just pick g at random. For every particular 2^{n/2}-time algorithm A, the expected number of inputs on which A(x) = g(x) is one, and the probability that A computes g successfully on at least 2^n/10 of the total 2^n inputs can be shown to be less than 2^{-2^n/2}. But a 2^{n/2}-time algorithm can be described by about 2n · 2^{n/2} bits, and so the total number of such algorithms is much smaller than 2^{2^n/2}. Taking a union bound over all of them shows that, with high probability over the choice of g, no such algorithm computes g on more than a 1/10 fraction of the inputs.
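To spell out the counting behind this proof, the union bound can be written as follows. This is only a sketch under the parameter choices made above (2^{n/2}-time algorithms, success on at least 2^n/10 inputs), and the constants are deliberately loose:

    \Pr_g\Bigl[\exists A \text{ of size } 2^{n/2} \text{ with } A(x)=g(x) \text{ for at least } 2^n/10 \text{ inputs } x\Bigr]
        \le 2^{2n \cdot 2^{n/2}} \cdot \binom{2^n}{2^n/10} \cdot 2^{-n \cdot 2^n/10}
        \le 2^{\,2n \cdot 2^{n/2} + 2^n - n \cdot 2^n/10} \ll 1

for large enough n, since the exponent is dominated by the negative term -n · 2^n/10.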

One-way permutations

Of course, the mere existence of a hard function is not useful for cryptography. But the following assumption will be useful:

The OWP Axiom: There exists a polynomial-time computable function f : {0,1}^* -> {0,1}^* such that for every n, f is a permutation over {0,1}^n (i.e., it maps {0,1}^n to {0,1}^n in a one-to-one and onto way) and the function g = f^{-1} is hard to compute. Such an f is called a one-way permutation.

An equivalent condition is that for every poly-time A, poly-bounded ε, and large enough n,

    \Pr_{x \leftarrow_R \{0,1\}^n}[A(f(x)) = x] < \epsilon(n)

(can you see why these are equivalent?).

We will prove the following theorem:

Theorem 1. The OWP Axiom implies the PRG Axiom.

This places the PRG Axiom on a much more solid foundation, since (as alluded to by Levin's quote) inverting a computation is the kind of task people have tried and failed to do for centuries. (Note that in cryptography we actually put our failures to good use!)

One-way functions

It is known that the PRG Axiom is implied by an even weaker assumption: the existence of a one-way function, defined as a polynomial-time computable function f (not necessarily a permutation) such that for every poly-time A, poly-bounded ε, and large enough n,

    \Pr_{x \leftarrow_R \{0,1\}^n}[A(f(x)) = w \text{ such that } f(w) = f(x)] < \epsilon(n).

The assumption that one-way functions exist is minimal for many cryptographic tasks: it can be shown that the existence of pseudorandom generators, of encryption with keys shorter than the message, or of message authentication codes implies the existence of one-way functions.
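As a concrete illustration of what the one-wayness condition asks of an adversary, here is a minimal Python sketch of the inversion experiment. The names `f`, `adversary`, and the integer encoding of bit strings are placeholders of this sketch, not objects defined in the notes.

```python
import secrets

def inversion_experiment(f, adversary, n):
    """One run of the inversion game for a length-preserving function f on n-bit inputs.

    The adversary sees only f(x) and wins if it returns any preimage of f(x);
    when f is a permutation, the only preimage is x itself.
    """
    x = secrets.randbits(n)        # x chosen uniformly from {0,1}^n
    y = f(x, n)                    # the adversary is given f(x), never x
    w = adversary(y, n)            # the adversary's attempted preimage
    return f(w, n) == y            # success event: f(w) = f(x)
```

Averaging the outcome of this experiment over many independent runs estimates the success probability that the OWP and OWF assumptions require to be smaller than every polynomially-bounded ε(n).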

Proof of Theorem 1

Theorem 1 will follow from the following two theorems:

Theorem 2 (Yao's Theorem). A distribution X over {0,1}^m is pseudorandom if and only if it is unpredictable, where the latter means that for every i in [m], poly-time A, poly-bounded ε, and large enough n,

    \Pr_{x \leftarrow_R X}[A(x_1, \ldots, x_{i-1}) = x_i] \le 1/2 + \epsilon(n).

Theorem 3 (Goldreich-Levin). Let f be a one-way permutation. Then the following distribution is unpredictable:

    (f(x), r, \langle x, r \rangle)

where x, r \leftarrow_R {0,1}^n and \langle x, r \rangle \stackrel{\text{def}}{=} \sum_i x_i r_i \pmod 2.

Theorems 2 and 3 together imply that if f is a one-way permutation, then the function (x, r) \mapsto (f(x), r, \langle x, r \rangle) is a pseudorandom generator mapping {0,1}^{2n} to {0,1}^{2n+1}.

Proof of Theorem 2

One direction (pseudorandomness implies unpredictability) is easy and left as an exercise. For the other direction, to show that if X is unpredictable then it is pseudorandom, we define the following m+1 hybrid distributions: H_i is the first i bits of X concatenated with m-i uniform random bits. It suffices to prove that for every i, H_{i-1} is computationally indistinguishable from H_i. We do this by reduction, using the following claim:

Claim 4. Suppose that there is an algorithm D such that

    \bigl| \Pr[D(H_i) = 1] - \Pr[D(H_{i-1}) = 1] \bigr| \ge \epsilon.     (1)

Then there is an algorithm P, with almost the same running time, such that

    \Pr_{x \leftarrow_R X}[P(x_1, \ldots, x_{i-1}) = x_i] \ge 1/2 + \epsilon.

The claim clearly proves the theorem (exercise).

Proof of Claim 4. We can drop the absolute value in (1) without loss of generality, since if the quantity inside the absolute value is negative for D, then it is positive for the algorithm that outputs the complement of D's answer. The algorithm P will do the following on input x_1, ..., x_{i-1}:

1. Guess a value b for x_i, and choose m-i random bits y_{i+1}, ..., y_m.

2. Let z = D(x_1, ..., x_{i-1}, b, y_{i+1}, ..., y_m).

3. If z = 1 then output b; otherwise, output 1-b.
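The reduction is short enough to write down directly. Here is a hedged Python sketch of the predictor P built from a distinguisher D; the function names and the bit-list interface are choices of this sketch, not of the notes.

```python
import secrets

def predictor_from_distinguisher(D, i, m):
    """Builds the predictor P of Claim 4 from a distinguisher D for H_{i-1} versus H_i.

    D takes a list of m bits and returns 0 or 1; P takes the first i-1 bits of a
    sample from X and outputs a guess for the i-th bit.
    """
    def P(prefix):                                            # prefix = [x_1, ..., x_{i-1}]
        b = secrets.randbits(1)                               # step 1: guess a value b for x_i
        suffix = [secrets.randbits(1) for _ in range(m - i)]  # ... and uniform y_{i+1}, ..., y_m
        z = D(list(prefix) + [b] + suffix)                    # step 2: run the distinguisher
        return b if z == 1 else 1 - b                         # step 3: keep b iff D outputs 1
    return P
```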

Analysis: The intuition is that if we guessed b correctly then we are in the H_i situation, where D is more likely to output z = 1. The actual analysis is the following. Let p be the probability that D(H_{i-1}) = 1 and p + ε the probability that D(H_i) = 1. Since b and y_{i+1}, ..., y_m are uniform, the input fed to D is distributed exactly as H_{i-1}, so Pr[z = 1] = p. On the other hand, conditioned on b = x_i the input is distributed as H_i, so Pr[z = 1 | b = x_i] = p + ε. This means that

    p = \Pr[z = 1] = \tfrac{1}{2}\Pr[z = 1 \mid b = x_i] + \tfrac{1}{2}\Pr[z = 1 \mid b = 1 - x_i],

implying that Pr[z = 1 | b = 1 - x_i] = p - ε. So the probability that we output a correct answer is

    \tfrac{1}{2}\Pr[z = 1 \mid b = x_i] + \tfrac{1}{2}\bigl(1 - \Pr[z = 1 \mid b = 1 - x_i]\bigr) = \tfrac{1}{2}(p + \epsilon) + \tfrac{1}{2}\bigl(1 - (p - \epsilon)\bigr) = \tfrac{1}{2} + \epsilon.

Proof of Theorem 3

Theorem 3 will follow from the following lemma (exercise):

Lemma 5. There is a poly(n, 1/ε)-time algorithm that, given oracle access to an oracle A that computes the function r \mapsto \langle x, r \rangle with probability 1/2 + ε over the choice of r, outputs x with probability at least (ε/100n)^2.

Proof of Lemma 5

The proof of Lemma 5 is a bit involved, so we will do it step by step.

The errorless case

Suppose that we had a perfect oracle A that computed r \mapsto \langle x, r \rangle with probability 1. Then we could recover the first bit of x by outputting A(r) \oplus A(r \oplus e_1) for some r (where e_1 is the vector that is all zeroes except at the first location). Note that

    A(r) \oplus A(r \oplus e_1) = \langle x, r \rangle \oplus \langle x, r \oplus e_1 \rangle = \sum_i x_i r_i + \sum_i x_i (r_i \oplus e^1_i) = \sum_i x_i r_i + \sum_i x_i r_i + \sum_i x_i e^1_i = x_1 \pmod 2.
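The errorless case translates directly into code. The following Python sketch recovers every bit of x using the identity above; the helper names and the toy self-test at the end are assumptions of this sketch, not part of the notes.

```python
import secrets

def inner_prod(x, r):
    """<x, r> = sum_i x_i * r_i (mod 2), for bit lists x and r of equal length."""
    return sum(xi & ri for xi, ri in zip(x, r)) % 2

def recover_x_errorless(A, n):
    """If A(r) = <x, r> for every r, then x_j = A(r) XOR A(r XOR e_j) for any fixed r."""
    r = [secrets.randbits(1) for _ in range(n)]   # any r works when the oracle is perfect
    x = []
    for j in range(n):
        r_flipped = list(r)
        r_flipped[j] ^= 1                         # r XOR e_j: flip only the j-th coordinate
        x.append(A(r) ^ A(r_flipped))             # <x, r> XOR <x, r XOR e_j> = x_j
    return x

# Toy self-test with a hidden x (illustrative only).
n = 8
hidden_x = [secrets.randbits(1) for _ in range(n)]
A = lambda r: inner_prod(hidden_x, r)             # a "perfect oracle" for this sketch
assert recover_x_errorless(A, n) == hidden_x
```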

The small error case

Suppose that the oracle A is correct with probability 0.9 over the choice of r. Because for a random r the point r \oplus e_1 is also uniformly distributed, we can use the union bound to show that the probability that at least one of the answers A(r) and A(r \oplus e_1) is incorrect is at most 0.2. (Note that these two queries are dependent, but the union bound still works in this case.) Therefore, if we choose a random r, then with probability at least 0.8 the value A(r) \oplus A(r \oplus e_1) gives us the first bit of x, and we can drive the error probability down to 1/(10n) by making 10 log n repetitions with independent choices of r and taking the majority vote (as sketched below). In this way we can recover all of the bits of x with high probability. This works as long as A is correct with probability greater than 3/4; but when A is correct with probability, say, 0.7, this analysis does not help us at all: the union bound only says that we get the right value with probability at least 0.4, which is worse than random guessing!

The full case

The full case is when the oracle A is only guaranteed to be correct with probability 1/2 + ε. We will handle that case on Thursday.
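Returning to the small-error case, here is the majority-vote recovery of a single bit as a hedged Python sketch. The repetition count and function name are choices of this sketch (the notes use 10 log n repetitions), and A is assumed to answer correctly on a 0.9 fraction of random r's.

```python
import secrets

def recover_bit_by_majority(A, j, n, reps=200):
    """Recover x_j from an oracle A that computes r -> <x, r> on a 0.9 fraction of r's.

    Each vote A(r) XOR A(r XOR e_j) equals x_j with probability at least 0.8 (union
    bound over the two queries), so a majority over independent r's is correct with
    high probability.
    """
    votes = 0
    for _ in range(reps):
        r = [secrets.randbits(1) for _ in range(n)]
        r_flipped = list(r)
        r_flipped[j] ^= 1                  # r XOR e_j
        votes += A(r) ^ A(r_flipped)       # a vote for the value of x_j
    return 1 if 2 * votes > reps else 0    # majority vote
```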