The complexity of distributions. Emanuele Viola, Northeastern University. March 2012

Local functions (a.k.a. juntas, NC^0)
f : {0,1}^n → {0,1} is d-local: the output depends on at most d input bits.
Fact: Parity(x) = 1 ⟺ Σ_i x_i = 1 mod 2 is not (n-1)-local.
Proof: flip any input bit and the output flips.

Local generation of (Y, parity(Y))
Theorem [Babai '87; Boppana Lagarias '87]: There is f : {0,1}^n → {0,1}^{n+1}, each output bit 2-local, such that the distribution f(X) equals (Y, parity(Y)) for X, Y ∈ {0,1}^n uniform.
Construction: y_1 = x_1, y_2 = x_1 ⊕ x_2, y_3 = x_2 ⊕ x_3, ..., y_n = x_{n-1} ⊕ x_n; then parity(y) = x_n.
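The construction above can be checked exhaustively for small n; a minimal sketch (the function name `f` and the brute-force check are mine, not from the talk):

```python
from collections import Counter
from itertools import product

def f(x):
    """2-local sampler from the slide: y_1 = x_1, y_i = x_{i-1} XOR x_i,
    and the final output bit is x_n, which equals parity(y)."""
    y = [x[0]] + [x[i - 1] ^ x[i] for i in range(1, len(x))]
    return tuple(y) + (x[-1],)

n = 4
# Distribution of f(X) for uniform X ...
dist_f = Counter(f(x) for x in product([0, 1], repeat=n))
# ... equals the distribution of (Y, parity(Y)) for uniform Y.
dist_target = Counter(tuple(y) + (sum(y) % 2,) for y in product([0, 1], repeat=n))
assert dist_f == dist_target
```

The telescoping sum y_1 ⊕ y_2 ⊕ ... ⊕ y_n = x_n is what makes the appended bit both local and exactly equal to parity(y); the map x ↦ y is a bijection, so Y is uniform.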

Our message
A complexity theory of distributions (as opposed to functions): how hard is it to generate (a.k.a. sample) a distribution D given random bits?
E.g., D = (Y, parity(Y)), or D = W_k := uniform n-bit string with k ones.

Is the message new?
Generating Random Factored Numbers [Bach '85; Kalai]
Random Generation of Combinatorial Structures from a Uniform Distribution [Jerrum Valiant Vazirani '86]
The Quantum Communication Complexity of Sampling [Ambainis Schulman Ta-Shma Vazirani Wigderson]
On the Implementation of Huge Random Objects [Goldreich Goldwasser Nussboim]
Our line of work: (1) first negative results (lower bounds) for local samplers, AC^0, Turing machines, etc.; (2) new connections.

Outline of talk
Lower bound for sampling W_k = uniform weight-k string
Randomness extractors
Local sources
Bounded-depth circuits (AC^0)
Turing machines

Theorem [V.]: If f : {0,1}^n → {0,1}^n is (0.1 log n)-local, then f(X) is at statistical distance > 1 - n^{-Ω(1)} from W_{n/2} = uniform string of weight n/2.
Tight up to the Ω(1) in the exponent: take f(x) = x.
Extends to W_k for k ≤ n/2; tight? Also open: remove the bound on the input length.

Succinct data structures
Problem: store S ⊆ {1, 2, ..., n}, |S| fixed, in u = optimum + r bits b_1 b_2 ... b_u, answering queries "i ∈ S?" by probing d bits.
Connection [V.]: a solution gives a d-local generator for W_{|S|} with statistical distance < 1 - 2^{-r}.
Corollary: need r > Ω(log n) if d = 0.1 log n. First lower bound for |S| = n/2, n/4, ...

Proof
Theorem: Let f : {0,1}^n → {0,1}^n be (0.1 log n)-local. There is a test T ⊆ {0,1}^n such that Pr[f(X) ∈ T] - Pr[W_{n/2} ∈ T] > 1 - n^{-Ω(1)}.
Warm-up scenarios:
f(x) = 000111 (low entropy): take T := {000111}; then Pr[f(X) ∈ T] - Pr[W_{n/2} ∈ T] = 1 - 1/(n choose n/2).
f(x) = x (anti-concentration): take T := {z : Σ_i z_i = n/2}; then Pr[W_{n/2} ∈ T] - Pr[f(X) ∈ T] = 1 - Θ(1)/√n.
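Both warm-up gaps can be computed exactly; a small sketch (the variable names and the concrete n are my choices):

```python
from math import comb

n = 30  # even
support = comb(n, n // 2)  # number of weight-n/2 strings

# Low entropy: f constant, say f(x) = 0^{n/2} 1^{n/2}; T = {0^{n/2} 1^{n/2}}.
# Pr[f(X) in T] = 1, Pr[W_{n/2} in T] = 1/support.
gap_low_entropy = 1 - 1 / support

# Anti-concentration: f(x) = x; T = {z : sum(z) = n/2}.
# Pr[W_{n/2} in T] = 1, Pr[X in T] = support / 2^n = Theta(1/sqrt(n)).
gap_anticoncentration = 1 - support / 2 ** n

assert gap_low_entropy > 1 - 1e-6
assert 0.5 < gap_anticoncentration < 0.95
```

At n = 30 the second gap is already about 0.86, matching the 1 - Θ(1)/√n behavior on the slide.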

Proof
Partition the input bits: X = (X_1, X_2, ..., X_s, H), so that after fixing H, each O(d)-bit output block B_i depends only on bit X_i (plus a remaining block B_H).
Fix H. Many B_i are constant (B_i(0,H) = B_i(1,H)): low entropy. Many B_i depend on X_i (B_i(0,H) ≠ B_i(1,H)).
Idea: independent anti-concentration: the weights can't sum to n/2.

If many pairs B_i(0,H), B_i(1,H) have different sums of bits, use the
Anti-concentration lemma [Littlewood Offord]: for a_1, a_2, ..., a_s ≠ 0 and any c, Pr_{X ∈ {0,1}^s}[Σ_i a_i X_i = c] < 1/√n.
Problem: B_i(0,H) = 100, B_i(1,H) = 010: high entropy but no anti-concentration.
Fix: want many blocks ≠ 000, so that high entropy yields different sums.
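The Littlewood-Offord bound can be verified by exhaustive enumeration for small s; a sketch with hypothetical coefficient vectors (the helper name is mine):

```python
from itertools import product
from math import comb

def max_point_mass(a):
    """max over c of Pr_{X uniform on {0,1}^s}[ sum_i a_i * X_i = c ]."""
    counts = {}
    for x in product([0, 1], repeat=len(a)):
        v = sum(ai * xi for ai, xi in zip(a, x))
        counts[v] = counts.get(v, 0) + 1
    return max(counts.values()) / 2 ** len(a)

# Littlewood-Offord: for nonzero a_i, no single value carries more than
# comb(s, s//2) / 2^s = O(1/sqrt(s)) of the probability mass.
for a in ([1] * 12, [1, -2, 3, 5, 7, -11, 13, 17, 19, 23, -29, 31]):
    s = len(a)
    assert max_point_mass(a) <= comb(s, s // 2) / 2 ** s + 1e-12
```

The all-ones vector attains the bound exactly (the sum concentrates on s/2), which is why the lemma is tight.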

The test T ⊆ {0,1}^n: Pr[f(X_1, ..., X_s, H) ∈ T] ≈ 1 while Pr[W_{n/2} ∈ T] ≈ 0.
z ∈ T iff:
∃ H and X_1, ..., X_s with many blocks B_i fixed such that f(X_1, ..., X_s, H) = z, OR
few of the blocks z_{B_i} are 000, OR
Σ_i z_i ≠ n/2.

Open problem
Understand the complexity of W_k = uniform weight-k string for all choices of: k, model (local, decision tree, etc.), statistical distance, randomness complexity.
Similar problems arise in combinatorics, coding, ergodic theory.
One example: 2-local f : {0,1}^{2n} → {0,1}^n with Distance(f(X), W_{n/4}) ≥ 1 - Θ(1)/√n for input length H(1/4)n + o(n). Is the distance in fact 1 - 2^{-Ω(n)}?

Outline of talk
Lower bound for sampling W_k = uniform weight-k string
Randomness extractors
Local sources
Bounded-depth circuits (AC^0)
Turing machines

Randomness extractors
Want: turn weak randomness (correlations, bias, ...) into close-to-uniform bits.
Extractor for a class S of sources (distributions) on {0,1}^n: a deterministic, efficient map Ext : {0,1}^n → {0,1}^m such that for every D ∈ S, Ext(D) is ε-close to uniform.
Starting with [von Neumann '51], a major line of research.
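The starting point cited on the slide, [von Neumann '51], already fits in a few lines; a minimal sketch (function name mine):

```python
def von_neumann_extract(bits):
    """Von Neumann's extractor for an i.i.d. biased bit stream:
    read the stream in pairs, output the first bit of each unequal
    pair ('01' -> 0, '10' -> 1), and discard '00' and '11'.
    Since Pr[01] = Pr[10] for any fixed bias, the output is uniform."""
    return [b0 for b0, b1 in zip(bits[0::2], bits[1::2]) if b0 != b1]

# '01' and '10' survive, '00' and '11' are dropped:
assert von_neumann_extract([0, 1, 1, 0, 0, 0, 1, 1, 1, 0]) == [0, 1, 1]
```

This handles only i.i.d. sources; the point of the talk is extraction from far richer classes (local, AC^0, Turing-machine sources), where no such trick exists.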

Sources
Independent blocks [Chor Goldreich '88; Barak Bourgain Impagliazzo Kindler Rao Raz Shaltiel Sudakov Wigderson; ...]
Some bits fixed, others uniform and independent [Chor Friedman Goldreich Hastad Rudich Smolensky '85; Cohen Wigderson; Kamp Zuckerman; ...]
One-way, space-bounded algorithms [Blum '86; Vazirani; Koenig Maurer; Kamp Rao Vadhan Zuckerman]
Affine sets [BKSSW; Bourgain; Rao; Ben-Sasson Kopparty; Shaltiel]
Our results: first extractors for circuit sources (local, AC^0) and for Turing-machine sources.

[Trevisan Vadhan 2000]
Sources D with min-entropy k (Pr[D = a] ≤ 2^{-k} for all a) sampled by a small circuit C : {0,1}^* → {0,1}^n given random bits.
Extractor ⟹ lower bound for C (even 1 output bit from k = n-1).
Extractor computable in Time(2^{O(n)}) for circuit size 2^{o(n)}.

[V.] Extractor (1 bit from k = n-1) ⟺ sampling lower bound: for balanced f : {0,1}^n → {0,1}, small circuits cannot sample f^{-1}(0) uniformly, given random bits.
(The lower bound we just saw ⟹ extracting 1 bit, with error < 1, from min-entropy k = n-1, O(1)-local sources.)

Outline of talk
Lower bound for sampling W_k = uniform weight-k string
Randomness extractors
Local sources
Bounded-depth circuits (AC^0)
Turing machines

Extractors for local functions
f : {0,1}^* → {0,1}^n is d-local: each output bit depends on ≤ d input bits.
Theorem [V.]: From a d-local n-bit source with min-entropy k, letting T := k·poly(k/nd), extract T bits with error exp(-T).
E.g., T = k^c from k = n^{1-c}, d = n^c.
Note: one always needs k > d. d = O(1) gives NC^0 sources. Independently [De Watson].

High-level proof
Theorem: A d-local n-bit source with min-entropy k (T := k·poly(k/nd)) is a convex combination of bit-block sources with block size ≤ dn/k, entropy T, error exp(-T).
Bit-block source with entropy T: e.g. (0, 1, X_1, 1-X_5, X_3, X_3, 1-X_2, 0, X_7, 1-X_8, 1, X_1), where X_1, X_2, ..., X_T ∈ {0,1} are uniform and 0 < #occurrences of each X_i < block size = dn/k.
Bit-block sources are a special case of low-weight affine sources: use [Rao '09].
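Sampling from a bit-block source is straightforward once its template is fixed; a hypothetical sketch matching the slide's example format (the template encoding and helper name are mine):

```python
import random

def sample_bit_block(template, T):
    """Sample one output of a bit-block source.
    template entries: 0 or 1 (fixed bits), ('+', i) for X_i,
    or ('-', i) for 1 - X_i, with variables X_0..X_{T-1} uniform."""
    x = [random.randrange(2) for _ in range(T)]
    out = []
    for e in template:
        if e == 0 or e == 1:          # fixed output bit
            out.append(e)
        else:                          # literal of some variable X_i
            sign, i = e
            out.append(x[i] if sign == '+' else 1 - x[i])
    return out

# Prefix of the slide's example (0, 1, X_1, 1-X_5, X_3, X_3, 1-X_2, ...),
# re-indexed from 0:
tmpl = [0, 1, ('+', 0), ('-', 4), ('+', 2), ('+', 2), ('-', 1)]
s = sample_bit_block(tmpl, 5)
assert s[0] == 0 and s[1] == 1   # fixed bits are preserved
assert s[4] == s[5]              # both occurrences of X_3 agree
```

Every output is an affine function of few variables, which is exactly why [Rao '09]'s low-weight affine extractor applies.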

Proof: a d-local n-bit source with min-entropy k is a convex combination of bit-block sources.
[bipartite graph: input bits x_1, ..., x_7 on top, output bits y_1, ..., y_5 below; w.l.o.g. nd/k ≥ d]
Output entropy > k ⟹ some output y_i has variance > k/n ⟹ (isoperimetry) some input x_j has influence > k/nd.
Set the bits of N(N(x_j)) \ {x_j} uniformly at random (N(v) = neighbors of v): with probability > k/nd, N(x_j) becomes a non-constant block of size ≤ nd/k.
Repeat k/|N(N(x_j))| ≥ k²/(nd²) times; expect ≥ k³/(n²d³) blocks.

Open problem
Does the previous result hold for decision-tree sources? One may hope to use the isoperimetric inequality for decision trees of [O'Donnell Saks Schramm Servedio].

Outline of talk
Lower bound for sampling W_k = uniform weight-k string
Randomness extractors
Local sources
Bounded-depth circuits (AC^0)
Turing machines

Bounded-depth circuits (AC^0)
[circuit diagram: random input x feeding a bounded-depth circuit of ∨ (or), ∧ (and), ¬ (not) gates; the n output bits form the sample y]
Theorem [V.]: From an AC^0 n-bit source with min-entropy k, extract k·poly(k/n^{1.001}) bits with error n^{-ω(1)}.

High-level proof
Apply a random restriction [Furst Saxe Sipser; Ajtai; Yao; Hastad]. By the switching lemma the circuit collapses to a d = n^ε-local function; apply the previous extractor for local sources.
Problem: we fix a 1-o(1) fraction of the input variables; what happens to the entropy?

The effect of restrictions on entropy
Theorem: Let f : {0,1}^* → {0,1}^n be such that f(X) has min-entropy k, and let R be a random restriction with Pr[*] = p. With high probability, f_R(X) has min-entropy ≈ pk.
Parameters: k = poly(n), p = 1/√k. After the restriction, both the circuit has collapsed and the min-entropy pk = √k is still poly(n).

The effect of restrictions on entropy
Theorem: Let f : {0,1}^* → {0,1}^n be such that f(X) has min-entropy k, and let R be a random restriction with Pr[*] = p. With high probability, f_R(X) has min-entropy ≈ pk.
Proof: builds on [Lovett V.]. Isoperimetric inequality for noise: for A ⊆ {0,1}^L of density α, and random m, m' where m' is obtained from m by flipping each bit independently with probability p:
α² ≤ Pr[both m ∈ A and m' ∈ A] ≤ α^{1+p}.
Use this to bound the collision probability Pr[f_R(X) = f_R(Y)]. QED
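The noise inequality on this slide can be checked exactly by brute force for tiny L; a sketch (the set A, the parameters, and the helper name are my choices):

```python
from itertools import product

def pr_both_in(A, L, p):
    """Exact Pr[m in A and m' in A], where m is uniform on {0,1}^L and
    m' flips each bit of m independently with probability p."""
    total = 0.0
    for m in A:
        for mp in A:
            d = sum(a != b for a, b in zip(m, mp))  # Hamming distance
            total += p ** d * (1 - p) ** (L - d)    # Pr[m -> mp] under noise
    return total / 2 ** L

L, p = 4, 0.25
A = {m for m in product([0, 1], repeat=L) if sum(m) <= 1}  # small Hamming ball
alpha = len(A) / 2 ** L  # density 5/16
q = pr_both_in(A, L, p)
assert alpha ** 2 <= q <= alpha ** (1 + p)
```

The lower bound holds for any set (the noise operator is positive semidefinite); the upper bound is a small-set-expansion-type statement: noise must leave a sparse set with noticeable probability.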

Bounded-depth circuits (AC^0)
Corollary to the AC^0 extractor: for an explicit boolean f, AC^0 cannot sample (Y, f(Y)); take f := a 1-bit affine extractor for min-entropy k = n^{0.99}.
Note: for entropy rate > 1/2, inner product is a 1-bit affine extractor, and AC^0 can sample (Y, InnerProduct(Y)) [Impagliazzo Naor].
This explains why affine extractors for rate < 1/2 are more complicated.

Open problem
Theorem [V.]: AC^0 can generate (Y, majority(Y)) with error 2^{-|Y|}.
Challenge: error 0?
Related [Lovett V.]: does every bijection {0,1}^n ↔ {x ∈ {0,1}^{n+1} : Σ_i x_i ≤ n/2} have large expected Hamming distortion? (n even)

Outline of talk
Lower bound for sampling W_k = uniform weight-k string
Randomness extractors
Local sources
Bounded-depth circuits (AC^0)
Turing machines

Turing-machine sources
Machines start on a blank (all-zero) tape and have a coin-toss state that writes a random bit on the tape. When the computation is over, the first n bits on the tape are the sample.

Extractors
Theorem [V.]: From a Turing-machine n-bit source running in time n^{1.9} and with min-entropy k ≥ n^{0.9}: extract n^{Ω(1)} bits, error exp(-n^{Ω(1)}).
Proof: a variant of the crossing-sequence technique shows that a TM source is a convex combination of independent-block sources (with no error). Then use e.g. [Kamp Rao Vadhan Zuckerman].

Extractors
Corollary [V.]: A Turing machine running in time n^{1.9} cannot sample (X, Y, InnerProduct(X,Y)) for |X| = |Y| = n.
Proof: as before, but use the two-source extractor of [Chor Goldreich].

Summary
The complexity of distributions is an uncharted research direction, with new connections to data structures, randomness extractors, and various combinatorial problems.
First sampling lower bounds and extractors for local sources, decision trees (not in this talk), AC^0, and Turing machines.