U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 6 Luca Trevisan September 12, 2017


Scribed by Theo McKenzie

Lecture 6

In which we study the spectrum of random graphs.

1 Overview

When attempting to find in polynomial time an upper bound certificate on the max cut and the maximum independent set of a graph, we have used the following property.

Proposition 1 If $G \sim G_{n,1/2}$, then with high probability $\|A - E(A)\| \le O(\sqrt{n})$, where $\|\cdot\|$ is the spectral norm. More generally, if $G \sim G_{n,p}$ and $p > \frac{\log n}{n}$, then w.h.p. $\|A - E(A)\| \le O(\sqrt{np})$.

Today we will show how to obtain the bound of Proposition 1 with an extra factor of $\sqrt{\log n}$, and we will outline the method by which the bound of Proposition 1 itself is found. We will also show how this bound breaks down when $p$ is small, namely that when $p = \Theta(1/n)$,

$$\|A - E(A)\| \ge \Omega\left(\sqrt{\frac{\log n}{\log\log n}}\right).$$

2 Introducing the Trace

Henceforth $M^k_{ij}$ signifies $(M^k)_{ij}$. Take $M$ symmetric and real. All eigenvalues of such a matrix are real, and we can enumerate them $\lambda_1, \lambda_2, \ldots, \lambda_n$ so that $|\lambda_1| \ge |\lambda_2| \ge \cdots \ge |\lambda_n|$.

Definition 2 The trace of an $n \times n$ matrix $A$ is $\mathrm{Tr}(A) = \sum_{i=1}^n A_{ii}$.

Moreover we know the following.

Theorem $\mathrm{Tr}(A) = \sum_{i=1}^n \lambda_i$.
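The trace identity above, and the way even powers of the trace pin down the spectral norm, can be checked numerically. This is a minimal sketch on an arbitrary random symmetric matrix; the sizes $n$ and $k$ are illustrative choices, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 200, 10  # illustrative sizes; k large and even

B = rng.standard_normal((n, n))
M = (B + B.T) / 2  # symmetric, so all eigenvalues are real

eigvals = np.linalg.eigvalsh(M)

# Tr(M^k) equals the sum of the k-th powers of the eigenvalues.
trace_power = np.trace(np.linalg.matrix_power(M, k))
print(np.isclose(trace_power, np.sum(eigvals ** k), rtol=1e-6))  # True

# For even k, ||M||^k <= Tr(M^k) <= n * ||M||^k, so the k-th root of the
# trace approximates the spectral norm within a factor of n^(1/k).
norm = np.abs(eigvals).max()
est = trace_power ** (1.0 / k)
print(norm <= est <= n ** (1.0 / k) * norm)  # True
```

For $k > \log n$ the factor $n^{1/k}$ is at most $e$, which is the constant-factor approximation used below.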

If we take $k$ large and even, the eigenvalues of $M^k$ are $\lambda_1^k \ge \lambda_2^k \ge \cdots \ge \lambda_n^k \ge 0$. Therefore we have

$$\|M\|^k = \lambda_1^k \le \mathrm{Tr}(M^k) = \sum_{i=1}^n \lambda_i^k \le n \lambda_1^k = n \|M\|^k.$$

This gives us an estimate of the norm: $\|M\| \le (\mathrm{Tr}(M^k))^{1/k} \le n^{1/k} \|M\|$, which for $k > \log n$ gives a constant-factor approximation of $\|M\|$.

3 Using the Trace to Bound the Spectral Norm

Assume that $G \sim G_{n,1/2}$ and that $A$ is the adjacency matrix of $G$. We will prove the following.

Theorem 3 $E(\mathrm{Tr}((A - E(A))^k))$ is bounded above by $2^{O(k)} \cdot n^{1+k/2} \cdot k^{k/2}$.

If $k > \log n$, taking the $k$th root yields a bound of $O(\sqrt{n \log n})$ on $\|A - E(A)\|$.

3.1 Expected Value of Matrix Entries

First we examine the matrix $M = A - E(A)$. We have $M_{ii} = 0$, and for $i \ne j$, $M_{ij} \in \{\pm 1/2\}$, each with probability $1/2$. Moreover $M_{ij} = M_{ji}$. For $i \ne j$, $E(M_{ij}^k) = 0$ if $k$ is odd and $E(M_{ij}^k) = 2^{-k}$ if $k$ is even. By linearity of expectation and the symmetry between the entries, $E(\mathrm{Tr}(M^k)) = n \cdot E(M^k_{11})$.

We evaluate $M^k_{11}$:

$$M^k_{11} = \sum_{i_1, \ldots, i_{k-1}} M_{1 i_1} M_{i_1 i_2} \cdots M_{i_{k-1} 1},$$

where $i_1, \ldots, i_{k-1}$ are the intermediate steps on a path that starts at vertex 1 and returns to 1; for example, $M^2_{11} = \sum_i M_{1i} M_{i1}$. Note that edges can repeat along such paths. By linearity of expectation,

$$E(M^k_{11}) = \sum_{i_1, \ldots, i_{k-1}} E(M_{1 i_1} M_{i_1 i_2} \cdots M_{i_{k-1} 1}).$$

If some pair $\{i, j\}$ occurs an odd number $\ell$ of times in the sequence of pairs $\{1, i_1\}, \{i_1, i_2\}, \ldots, \{i_{k-1}, 1\}$, then, since $M_{ij}^{\ell}$ is independent of all the other factors and $E(M_{ij}^{\ell}) = 0$ for odd $\ell$, we get $E(M_{1 i_1} \cdots M_{i_{k-1} 1}) = 0$. If every pair occurs an even number of times, the expectation of the product is $2^{-k}$. Therefore $2^k \cdot E(M^k_{11})$ is the number of sequences $i_1, \ldots, i_{k-1} \in V^{k-1}$ such that, in the sequence of pairs $\{1, i_1\}, \{i_1, i_2\}, \ldots, \{i_{k-1}, 1\}$, each pair occurs an even number of times.
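The walk-counting identity of Section 3.1 can be verified by brute force on a tiny instance (the sizes below are chosen only so that exhaustive enumeration is feasible):

```python
import itertools
import numpy as np

n, k = 4, 4  # tiny instance so exhaustive enumeration is feasible

# Exact E(M^k_{11}) for G(n, 1/2): average over all 2^C(n,2) graphs.
pairs = list(itertools.combinations(range(n), 2))
total = 0.0
for bits in itertools.product((0, 1), repeat=len(pairs)):
    M = np.zeros((n, n))
    for (i, j), b in zip(pairs, bits):
        M[i, j] = M[j, i] = b - 0.5  # M_ij = A_ij - 1/2 = +/- 1/2
    total += np.linalg.matrix_power(M, k)[0, 0]
exact = total / 2 ** len(pairs)

# Count closed walks of length k from vertex 0 back to vertex 0, with no
# self-loops, in which every edge occurs an even number of times.
count = 0
for steps in itertools.product(range(n), repeat=k - 1):
    verts = (0,) + steps + (0,)
    edges = [frozenset(e) for e in zip(verts, verts[1:])]
    if any(len(e) == 1 for e in edges):  # uses a diagonal entry M_ii = 0
        continue
    if all(edges.count(e) % 2 == 0 for e in set(edges)):
        count += 1

# Each surviving walk contributes 2^{-k}, so E(M^k_{11}) = count / 2^k.
print(exact, count / 2 ** k)  # both equal 0.9375 for n = 4, k = 4
```

Walks that step along the diagonal are skipped explicitly here, since $M_{ii} = 0$ kills those terms; the encoding bound below covers them anyway.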

3.2 Encoding argument

In order to give an upper bound on the number of such sequences, we show how to encode a sequence in which there are $m$ distinct edges. In the sequence $i_1, \ldots, i_{k-1}$, the element $i_j$ is represented either as $(0, i_j)$, which takes $1 + \log n$ bits, if $i_j$ appears for the first time in the sequence at location $j$, or as $(1, l)$ otherwise, where $l < j$ is such that $i_l = i_j$, which requires $1 + \log k$ bits. Notice that, if $i_j$ occurs for the first time at location $j$, then the pair $\{i_{j-1}, i_j\}$ also occurs for the first time at locations $j-1$ and $j$. Thus the number of times that we encounter a vertex for the first time is at most the number of distinct edges. If there are $t$ distinct vertices (other than vertex 1), then we use $k + t \log n + (k - t) \log k$ bits; for $k < n$ this value increases with $t$, but $t \le m \le k/2$ (because every edge has to appear an even number of times, so there can be at most $k/2$ distinct edges). This means that we use at most $k + \frac{k}{2} \log n + \frac{k}{2} \log k$ bits in the encoding. The number of strings that can be encoded using at most $L$ bits is $2^{L+1}$. Assuming $k < n$, we obtain

$$2^k \, E(M^k_{11}) \le 2^{k+1} \, k^{k/2} \, n^{k/2},$$

meaning

$$E(\mathrm{Tr}(M^k)) = n \cdot E(M^k_{11}) \le 2 \, n^{1+k/2} \, k^{k/2},$$

which proves Theorem 3. Using suitable $k$ and $t$ we then achieve our bound on $\|M\|$. For example, choose $k$ even with $k = \Theta(\log n)$ and $t = 10\sqrt{n \log n}$. Markov's inequality gives

$$P(\|M\| \ge t) \le P(\mathrm{Tr}(M^k) \ge t^k) \le \frac{E(\mathrm{Tr}(M^k))}{t^k} \le 2n \left(\frac{\sqrt{nk}}{t}\right)^k \le e^{-\Omega(\log n)} \to 0.$$

4 Tightening the Bound

To obtain the sharper bound of $O(\sqrt{n})$, we need to count the sequences more sharply and remove the $k^{k/2}$ term, that is, improve the way we account for repetitions. Here we give an outline of how to find the tighter bound.

The worst case in the above analysis is when the number of distinct vertices (not counting vertex 1) is maximal, namely $k/2$. In that case, the number of distinct edges $\{i_j, i_{j+1}\}$ is $k/2$, and they must form a connected graph over $1 + k/2$ vertices, that is, they have to form a tree. Furthermore, each edge is repeated exactly twice in the closed walk; otherwise we would not have enough distinct edges to connect $1 + k/2$ distinct vertices.
If the pairs form a tree, then the only way to have a closed walk in which every edge is repeated twice is for the closed walk to be a depth-first visit of the tree. In this case, we can improve our encoding in the following way. In a depth-first visit of a tree, only two events are possible at each step: either we discover a new vertex, or we backtrack along the edge between the current node and its parent. Thus we only need to pay $1 + \log n$ bits to encode a new node in the sequence and $1$ bit to encode an already-seen node, and we obtain a bound of $2^{k + \frac{k}{2} \log n + \frac{k}{2}} = 2^{3k/2} \, n^{k/2}$ on the number of such walks. By taking the $k$th root we obtain a bound on $\|M\|$ of $O(\sqrt{n})$.

5 Generalizing to any p

Now assume $G \sim G_{n,p}$, where $A$ is the adjacency matrix of $G$, and assume $p < 1/2$. Again define $M = A - E(A)$. In this matrix $M_{ii} = 0$ and, for $i \ne j$, $M_{ij} = 1 - p$ with probability $p$ and $M_{ij} = -p$ with probability $1 - p$. Therefore $E(M_{ij}) = 0$ and $E(M_{ij}^2) = p(1 - p) \le p$; in fact $E(|M_{ij}|^k) \le p$ for all $k \ge 2$. From this we see that we need to sum over sequences in which each pair occurs at least twice, since if any pair occurs exactly once the expectation is 0. Therefore the bound is

$$E(M^k_{11}) \le \sum_{i_1, \ldots, i_{k-1}} p^{\ell},$$

where $\ell$ is the number of distinct pairs and the sum is taken over sequences in which each pair occurs at least twice. For large $\ell$, the number of sequences with $\ell$ distinct pairs, each occurring at least twice, is approximately $2^{O(k)} n^{\ell}$. This would give us

$$\sum_{i_1, \ldots, i_{k-1}} p^{\ell} \le \sum_{\ell \le k/2} p^{\ell} \, 2^{O(k)} \, n^{\ell} \le 2^{O(k)} \, p^{k/2} \, n^{k/2},$$

so the bound on $\|M\|$ is $O(\sqrt{np})$. However, the bound on the number of sequences with $\ell$ distinct pairs breaks down when $\ell$ is much smaller than $k/2$, and a full proof requires considerably more complicated calculations.

6 Problems with sparse graphs

Theorem 4 If $p = \Theta(1/n)$, then w.h.p.

$$\|A - E(A)\| \ge \Omega\left(\sqrt{\frac{\log n}{\log\log n}}\right).$$

This breaks the nice bound we obtained in Section 5, and it follows from the irregularity of sparse graphs: there will be isolated vertices as well as vertices of degree much higher than average.

Lemma If $p = \Theta(1/n)$, then w.h.p. the highest degree of a vertex of $G$ is $\Theta\left(\frac{\log n}{\log\log n}\right)$.

Proposition 5 If $G$ has a node of degree $\ge d$, then for every $p < \frac{1}{4\sqrt{d}}$ we have $\lambda_{\max}(A - pJ) \ge \Omega(\sqrt{d})$. This implies that for $0 < p < \frac{1}{4\sqrt{d}}$, $\|A - pJ\| \ge \Omega(\sqrt{d})$.

Proof: We have

$$\lambda_{\max}(A - pJ) = \max_{x \ne 0} \frac{x^T (A - pJ) x}{\|x\|^2},$$

where the maximum is taken over all nonzero vectors $x$. Call $v$ a node of degree $\ge d$ and let $u_1, \ldots, u_d$ be $d$ of its neighbors.

Consider the vector $x$ such that $x_v = 1$, $x_{u_i} = \frac{1}{\sqrt{d}}$ for $i = 1, \ldots, d$, and $x_w = 0$ for all other vertices $w$. We have

$$x^T A x \ge 2\sqrt{d}, \qquad x^T (pJ) x = p \left( \sum_i x_i \right)^2 = p \left(1 + \sqrt{d}\right)^2 \le 4pd, \qquad \|x\|^2 = 2.$$

Therefore, if $p \le \frac{1}{4\sqrt{d}}$,

$$\frac{x^T (A - pJ) x}{\|x\|^2} \ge \frac{2\sqrt{d} - 4pd}{2} \ge \frac{2\sqrt{d} - \sqrt{d}}{2} = \Omega(\sqrt{d}),$$

yielding the desired bound. $\square$

Theorem 4 follows immediately from Proposition 5 and the Lemma, since $E(A) = p(J - I)$ and hence $\|A - E(A)\| \ge \|A - pJ\| - p$.
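The Rayleigh-quotient computation in the proof can be checked numerically on a star graph, where the hub plays the role of the high-degree vertex $v$. A minimal sketch; the sizes $d$, $n$ and the choice $p = \frac{1}{8\sqrt{d}}$ are illustrative.

```python
import numpy as np

d, n = 64, 100  # illustrative sizes: hub of degree d among n vertices
A = np.zeros((n, n))
A[0, 1:d + 1] = A[1:d + 1, 0] = 1.0   # star: vertex 0 has degree d

p = 1.0 / (8 * np.sqrt(d))            # satisfies p < 1/(4*sqrt(d))
J = np.ones((n, n))

# The test vector from the proof: 1 at the hub, 1/sqrt(d) at neighbors.
x = np.zeros(n)
x[0] = 1.0
x[1:d + 1] = 1.0 / np.sqrt(d)

rq = x @ (A - p * J) @ x / (x @ x)    # Rayleigh quotient of x
lam_max = np.linalg.eigvalsh(A - p * J).max()

print(rq >= np.sqrt(d) / 2)           # True: the proof's lower bound
print(lam_max >= rq - 1e-9)           # True: variational principle
```

Here $x^T A x = 2\sqrt{d}$ exactly, since the only edges touch the hub; the Rayleigh quotient lower-bounds $\lambda_{\max}(A - pJ)$, as the proof uses.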