Getting lost while hiking in the Boolean wilderness


Getting lost while hiking in the Boolean wilderness. Renan Gross, Student Probability Day VI, 11/05/17, Weizmann Institute of Science. Joint work with Uri Grupel.

Warning: This presentation shows explicit images of graphs. Viewer discretion is advised.

Scenery reconstruction: Suppose we are given a graph G, colored by some function f(x).

Scenery reconstruction: An agent performs a simple random walk S_t on the graph, reporting the color of each vertex it visits. Reported scenery: W B W B W W B B
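The reporting process is easy to sketch in code. Below is a toy example (not from the talk; names are mine): a simple random walk on a 6-cycle with alternating W/B colors. On this particular coloring the reported scenery is forced to alternate no matter what the walk does, a first hint that the trace can carry much less information than the walk itself.

```python
import random

def scenery_trace(neighbors, coloring, start, steps, rng):
    """Simple random walk on a graph; report only the color of each visited vertex."""
    walk = [start]
    for _ in range(steps):
        walk.append(rng.choice(neighbors[walk[-1]]))
    return "".join(coloring[v] for v in walk)

# A 6-cycle colored alternately W/B.
n = 6
neighbors = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}
coloring = {v: "WB"[v % 2] for v in range(n)}

trace = scenery_trace(neighbors, coloring, start=0, steps=20, rng=random.Random(0))
print(trace)  # alternates WBWB... regardless of the walk
```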

Scenery reconstruction: Question: Suppose we know G. Can we reconstruct the coloring f based only on the trace of the random walk, f(S_t)? Remarks: we are given an infinite random walk, so reconstruction should happen with probability 1, and only up to isomorphisms of the graph.

Scenery reconstruction: The answer is known for a variety of Abelian Cayley graphs and walks. What about the n-dimensional Boolean hypercube?

Boolean scenery: On the hypercube the answer is negative. The way to show this is to exhibit two non-isomorphic colorings which yield the same distribution of sceneries. Example: a coloring for which the process f(S_t) is Bernoulli IID with success probability 1/2!
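The pictured example coloring is not in the transcript; as a stand-in, the sketch below (names are mine) uses g(x) = x_1 x_2 on the 4-cube, a locally 1/2-biased function that appears later in the talk. Since every vertex sees the value 1 on exactly half its neighbors, each reported color is a fair coin flip regardless of history, so the empirical frequency of 1 hovers around 1/2.

```python
import random

def srw_hypercube_scenery(f, n, steps, rng):
    """Walk on {-1,1}^n by flipping a uniformly random coordinate; report f at each step."""
    x = [1] * n
    out = []
    for _ in range(steps):
        i = rng.randrange(n)
        x[i] = -x[i]
        out.append(f(x))
    return out

# g(x) = x1*x2: every vertex of the 4-cube sees the value 1 on exactly
# 2 of its 4 neighbors, so the scenery is IID fair coin flips.
g = lambda x: x[0] * x[1]
trace = srw_hypercube_scenery(g, n=4, steps=50000, rng=random.Random(1))
freq_one = trace.count(1) / len(trace)
print(round(freq_one, 3))  # close to 0.5
```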

Locally biased functions: So we defined locally biased functions, and started investigating them in their own right.

Definition: Let G be a graph. A Boolean function f : G → {−1, 1} is called locally p-biased if for every vertex x ∈ G we have

|{y ~ x : f(y) = 1}| / deg(x) = p.

In words, f is locally p-biased if f takes the value 1 on exactly a p-fraction of the neighbors of every vertex x. The existence of two non-isomorphic locally p-biased functions implies that the scenery reconstruction problem cannot be solved.
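On the hypercube the definition can be checked mechanically by brute force; a minimal sketch (the function name is mine):

```python
from itertools import product

def is_locally_p_biased(f, n, p):
    """On the n-dim hypercube {-1,1}^n, check that every vertex sees the
    value 1 on exactly a p-fraction of its n neighbors."""
    for x in product((-1, 1), repeat=n):
        ones = 0
        for i in range(n):
            y = list(x)
            y[i] = -y[i]  # flip coordinate i to get a neighbor
            ones += f(tuple(y)) == 1
        if ones != p * n:
            return False
    return True

print(is_locally_p_biased(lambda x: x[0] * x[1], 4, 0.5))  # g = x1*x2: True
print(is_locally_p_biased(lambda x: 1, 4, 1.0))            # constant 1: True
print(is_locally_p_biased(lambda x: x[0], 4, 0.5))         # dictator: False
```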

Locally biased functions can be defined for any graph.

Theorems: We found:

Theorem (characterization): Let n ∈ N be a natural number and p ∈ [0, 1]. There exists a locally p-biased function f : {−1, 1}^n → {−1, 1} if and only if p = b/2^k for some integers b ≥ 0, k ≥ 0 such that 2^k divides n.

Theorem (size): Let n be even, and let B^n_{1/2} be a maximal class of non-isomorphic locally 1/2-biased functions, i.e. every two functions in B^n_{1/2} are non-isomorphic to each other. Then |B^n_{1/2}| ≥ C · 2^√n / n^(1/4), where C > 0 is a universal constant.
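For tiny dimensions the characterization can be confirmed by exhausting all 2^(2^n) colorings; a brute-force sketch (names are mine). For n = 2 the theorem predicts exactly p ∈ {0, 1/2, 1}, and for n = 3 only k = 0 works, so p ∈ {0, 1}.

```python
from itertools import product

def biased_p_values(n):
    """Brute-force the set of p for which a locally p-biased function exists on {-1,1}^n."""
    vertices = list(product((-1, 1), repeat=n))
    found = set()
    for values in product((-1, 1), repeat=len(vertices)):
        f = dict(zip(vertices, values))
        fracs = set()
        for x in vertices:
            # number of neighbors of x on which f takes the value 1
            ones = sum(f[x[:i] + (-x[i],) + x[i+1:]] == 1 for i in range(n))
            fracs.add(ones)
        if len(fracs) == 1:  # same fraction at every vertex
            found.add(fracs.pop() / n)
    return found

print(biased_p_values(2))  # {0.0, 0.5, 1.0}
print(biased_p_values(3))  # {0.0, 1.0}
```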

Proof of theorem 1: We start with locally 1/n-biased functions. [Pictured: a tile for n = 4, and a tile for n = 8.]

Proof of theorem 1: Solution: find a half-tiling. We have a 1/n tiling!

Proof of theorem 1: To get m/n instead of 1/n, combine several disjoint tilings.

Proof of theorem 1: Where do we get half-tilings from? Perfect codes. Where do we get disjoint half-tilings from? Hamming perfect codes.
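The locally 1/n-biased case amounts to finding a set A of vertices such that every vertex has exactly one neighbor in A (the indicator of A is then locally 1/n-biased). For tiny n this can be brute-forced directly; a sketch (names are mine). A counting constraint already rules out n = 3 (we would need |A| · 3 = 8), while a set exists for n = 4, matching the theorem since 1/3 is not of the form b/2^k with 2^k | 3, but 1/4 = 1/2^2 with 2^2 | 4.

```python
from itertools import combinations, product

def exactly_one_neighbor_set_exists(n):
    """Is there A in {0,1}^n so that every vertex has exactly one neighbor in A?"""
    vertices = list(product((0, 1), repeat=n))
    if (2 ** n) % n != 0:   # double counting: |A| * n = 2^n is necessary
        return False
    size = 2 ** n // n
    for A in combinations(vertices, size):
        A = set(A)
        ok = all(
            sum(x[:i] + (1 - x[i],) + x[i+1:] in A for i in range(n)) == 1
            for x in vertices
        )
        if ok:
            return True
    return False

print(exactly_one_neighbor_set_exists(3), exactly_one_neighbor_set_exists(4))
```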

A word on the other direction: Let x be a uniformly random element of the cube. Then f(x) = 1 with probability l/2^n, where l = |{x ∈ {−1, 1}^n : f(x) = 1}|. Let y be a uniformly random neighbor of x. Then f(y) = 1 with probability p, by definition. Since both x and y are uniformly distributed vertices, P(f(x) = 1) = P(f(y) = 1). Denoting p = m/n for some m ∈ {0, 1, ..., n}, this gives p = l/2^n = m/n. Writing n = c · 2^k with c odd, the equality m · 2^n = l · n forces c to divide m, so p = (m/c)/2^k, which is the desired form.
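Spelled out, the last step is the following divisibility argument, a sketch consistent with the slide's notation:

```latex
% p is both the global density and the local bias:
% p = l/2^n and p = m/n for some m \in \{0, 1, \dots, n\}.
\[
\frac{l}{2^n} = \frac{m}{n}
\quad\Longrightarrow\quad
m \cdot 2^n = l \cdot n .
\]
% Write n = c \cdot 2^k with c odd. Then
\[
m \cdot 2^{\,n-k} = l \cdot c ,
\]
% and since c is odd, c must divide m, so
\[
p = \frac{m}{n} = \frac{m/c}{2^k},
\qquad 2^k \mid n .
\]
```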

Proof of theorem 2:

g(x_1, x_2, x_3, x_4) = x_1 x_2

h(x_1, x_2, x_3, x_4) = (1/2)(x_1 x_2 + x_2 x_3 − x_3 x_4 + x_1 x_4)
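The sign pattern in h (reconstructed here; the minus sign placement is one consistent choice, and any single flipped product term gives an isomorphic function) can be verified by brute force over all 16 points of the 4-cube: h is genuinely ±1-valued, and both g and h are locally 1/2-biased.

```python
from itertools import product

g = lambda x: x[0] * x[1]
# One consistent sign choice for the building block h:
h = lambda x: (x[0]*x[1] + x[1]*x[2] - x[2]*x[3] + x[0]*x[3]) // 2

cube4 = list(product((-1, 1), repeat=4))

# h is Boolean (+-1) on the whole 4-cube: the sum inside is always +-2.
assert all(h(x) in (-1, 1) for x in cube4)

def neighbor_ones(f, x):
    """Number of neighbors of x (one coordinate flipped) on which f equals 1."""
    return sum(f(x[:i] + (-x[i],) + x[i+1:]) == 1 for i in range(4))

# Both g and h are locally 1/2-biased: every vertex sees 1 on exactly 2 of 4 neighbors.
assert all(neighbor_ones(g, x) == 2 for x in cube4)
assert all(neighbor_ones(h, x) == 2 for x in cube4)
print("g and h are locally 1/2-biased on the 4-cube")
```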

Proof of theorem 2:

g_n(x_1, ..., x_n) = x_1 x_2 ⋯ x_{n/2}

h_k = h( ∏_{i=0}^{k} x_{1+4i}, ..., ∏_{i=0}^{k} x_{4+4i} )

Proof of theorem 2:

Fact: Let f_i : {−1, 1}^{n_i} → {−1, 1} be locally 1/2-biased functions for i = 1, 2, where n_1 + n_2 = n. Then f(x) = f_1(x_1, ..., x_{n_1}) f_2(x_{n_1+1}, ..., x_n) is a locally 1/2-biased function on {−1, 1}^n.

We can then build up locally 1/2-biased functions from the building blocks h_0, h_1, ... and g_0, g_2, g_4, ....

Every time we pick h_0, we use 4 bits. Every time we pick h_1, we use 8 bits. Every time we pick h_2, we use 12 bits. And so on; the rest of the bits are filled with g_k.

The number of different combinations is the same as the number of solutions to

4a_1 + 8a_2 + ⋯ + 4k a_k ≤ n,

which is at least C · 2^√n / n^(1/4) for some constant C > 0.
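The counting step can be sanity-checked: a solution of 4a_1 + 8a_2 + ⋯ ≤ n is the same as a partition of some integer m ≤ n/4 (with a_j the multiplicity of the part j). A short sketch using the standard partition-counting recurrence (function names are mine):

```python
def partition_count(t):
    """Number of integer partitions of t, with p(0) = 1 (coin-change style DP)."""
    ways = [1] + [0] * t
    for part in range(1, t + 1):
        for total in range(part, t + 1):
            ways[total] += ways[total - part]
    return ways[t]

def num_combinations(n):
    """Solutions of 4*a_1 + 8*a_2 + ... + 4k*a_k <= n: one per partition
    of each integer m <= n//4."""
    return sum(partition_count(m) for m in range(n // 4 + 1))

print([num_combinations(n) for n in (4, 8, 16)])  # [2, 4, 12]
```

Since partition numbers grow like exp(c·√m), the total count grows faster than any polynomial in n, which is where the 2^√n-type lower bound comes from.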

The end: There are other methods of showing that cube scenery cannot be reconstructed. But locally biased functions are still cool! Try them out for your favorite graphs!