Lecture 7: Advanced Coupling & Mixing Time via Eigenvalues

Counting and Sampling, Fall 2017

Lecture 7: Advanced Coupling & Mixing Time via Eigenvalues

Lecturer: Shayan Oveis Gharan, October 18

Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.

In this lecture we first discuss [HV05] to prove that the Glauber dynamics generates a random coloring of a graph $G$ with maximum degree $\Delta$ using $q \geq 1.764\Delta$ colors in $O(n \log n)$ steps. Our main motivation is to introduce additional, more technical tools in coupling, beyond the path coupling technique. We prove the following theorem.

Theorem 7.1. Let $\alpha$ satisfy $\alpha = e^{1/\alpha}$, so $\alpha \approx 1.763$. If $G$ is triangle-free with maximum degree $\Delta$, $q \geq (\alpha + \epsilon)\Delta$, and $\Delta \geq \epsilon^{-2} \log n$, then the heat-bath chain mixes in $O(\frac{n}{\epsilon} \log n)$ time.

The main idea behind the proof is a local uniformity property which holds with high probability at the stationary distribution: for any triangle-free graph, an easy lower bound on the number of colors available to every vertex holds with high probability at stationarity. We use the following notation throughout this section and the next one. For a coloring $X$ and a vertex $v$, let $A(X, v)$ be the set of colors available to $v$; in other words, $A(X, v)$ is the set of colors that do not appear on any neighbor of $v$ in $X$.

Lemma 7.2. Let $G$ be a triangle-free graph with maximum degree $\Delta$, let $\epsilon > 0$, and let $q > \Delta + 2$. For a uniformly random $q$-coloring $X$ of $G$, we have
$$ \mathbb{P}\left[ \exists v : |A(X, v)| \leq (1 - \epsilon)\, q e^{-\Delta/q} \right] \leq n\, e^{-\Omega(\epsilon^2 q)}. $$

Proof. Fix a vertex $v$. We condition on the colors of all vertices which are not neighbors of $v$; call this conditional information $\mathcal{F}$. Observe that, conditioned on $\mathcal{F}$, all neighbors of $v$ are colored independently: they form an independent set, by the triangle-free assumption. Let $X_{c,u}$ be the indicator random variable of the event that the neighbor $u$ of $v$ is colored with $c$. Then
$$ |A(X, v)| = \sum_c \prod_{u \sim v} (1 - X_{c,u}). $$
Therefore,
$$ \mathbb{E}\left[ |A(X, v)| \mid \mathcal{F} \right] = \sum_c \prod_{u \sim v} \left( 1 - \mathbb{E}[X_{c,u} \mid \mathcal{F}] \right) = \sum_c \prod_{u \sim v :\, c \in A(X,u)} \left( 1 - \frac{1}{|A(X,u)|} \right) \geq q \left( \prod_c \prod_{u \sim v :\, c \in A(X,u)} \left( 1 - \frac{1}{|A(X,u)|} \right) \right)^{1/q} = q \left( \prod_{u \sim v} \left( 1 - \frac{1}{|A(X,u)|} \right)^{|A(X,u)|} \right)^{1/q} \geq q \prod_{u \sim v} e^{-1/q} \geq q e^{-\Delta/q}, $$
where the first inequality follows by the AM-GM inequality (the average of the $q$ terms is at least their geometric mean), the second uses $(1 - 1/k)^k \approx e^{-1}$, and the last uses $\deg(v) \leq \Delta$.

Note that the $X_{c,u}$'s are determined by the colors of the neighbors of $v$, which are independent under $\mathcal{F}$, and $|A(X, v)|$ changes by at most $1$ if the color of a single neighbor changes. So $|A(X, v)|$ is strongly concentrated around the above expected value, and the claim follows by McDiarmid's inequality and a union bound over the $n$ vertices. Recall that McDiarmid's inequality says that for any function $f(X_1, \dots, X_n)$ of independent random variables $X_1, \dots, X_n$ which is Lipschitz with constant $1$ in each coordinate, we have
$$ \mathbb{P}\left[ |f(X_1, \dots, X_n) - \mathbb{E} f(X_1, \dots, X_n)| \geq \epsilon \right] \leq e^{-2\epsilon^2 / n}. $$
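To make this concrete, here is a minimal simulation sketch of the heat-bath (Glauber) dynamics together with the availability statistic $|A(X, v)|$. The code is an illustration we add here, not part of the original notes; the cycle instance and all parameters are arbitrary choices.

```python
import math
import random

def heat_bath_step(adj, colors, q):
    """One step of the heat-bath chain: pick a uniformly random vertex
    and recolor it with a uniformly random available color."""
    v = random.randrange(len(adj))
    forbidden = {colors[u] for u in adj[v]}
    colors[v] = random.choice([c for c in range(q) if c not in forbidden])

def avg_available(adj, colors, q):
    """Average over vertices v of |A(X, v)|, the number of available colors."""
    n = len(adj)
    return sum(
        sum(1 for c in range(q) if all(colors[u] != c for u in adj[v]))
        for v in range(n)
    ) / n

# Toy instance: the n-cycle, which is triangle-free with Delta = 2, q > Delta + 2.
n, q, delta = 200, 6, 2
adj = [[(v - 1) % n, (v + 1) % n] for v in range(n)]
colors = [v % 3 for v in range(n)]   # a proper starting coloring
for _ in range(100 * n):             # run well past the O(n log n) mixing time
    heat_bath_step(adj, colors, q)

# Lemma 7.2 is meaningful in the regime q = Omega(log n / eps^2); on this toy
# instance we just observe availability in the ballpark of q * exp(-Delta/q).
print(avg_available(adj, colors, q), q * math.exp(-delta / q))
```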

Equipped with the above lemma, let $\alpha$ be such that $\alpha = e^{1/\alpha}$. Then, for $q = \alpha \Delta$ and $\Delta = \Omega(\log n)$, we have
$$ q e^{-\Delta/q} = q e^{-1/\alpha} = \frac{q}{\alpha} = \Delta. $$
Therefore, if we take a slightly larger $q$, say $q = (\alpha + \epsilon)\Delta$ with $\Delta \geq \epsilon^{-2} \log n$, then with probability at least $1 - 1/n$ a uniformly random coloring $X$ satisfies the following: for all $v$,
$$ |A(X, v)| > (1 + \Omega(\epsilon))\, \Delta. $$

7.1 Improved bound on Coupling

In this section we show that if in a coloring $X$ every vertex has strictly more than $\Delta$ available colors, then we can couple $X$ with any proper coloring $Y$ such that the distance decreases in expectation. Note that as a special case this also gives a proof of the $q \geq 2\Delta + 1$ bound from last lecture using ordinary coupling. We work with a slightly different Markov chain: we choose a random vertex $v$ and assign it a color that does not appear on any of its neighbors, chosen uniformly at random, i.e., we work with the heat-bath chain as opposed to the Metropolis rule.

Lemma 7.3. Let $X$ be a proper coloring of $G$ such that for some $0 < \beta < 1$ and all vertices $v$, $|A(X, v)| \geq \frac{\Delta}{1 - \beta}$. Then, for every $Y \in \Omega$, there exists a coupling $(X, Y) \to (X', Y')$ such that
$$ \mathbb{E}\left[ d(X', Y') \mid X, Y \right] \leq \left( 1 - \frac{\beta}{n} \right) d(X, Y). $$

Note that in particular, if $q \geq 2\Delta + 1$, then $|A(X, v)| \geq q - \Delta \geq \Delta + 1$, so we can take $\beta = \Omega(1/\Delta)$, and the above lemma alone implies mixing.

Proof. We use a coupling strategy similar to the one we considered last time. First of all, both chains choose the same vertex $v$. For every color $c \in A(X, v) \cap A(Y, v)$, both chains choose $c$ with probability $\min\left\{ \frac{1}{|A(X,v)|}, \frac{1}{|A(Y,v)|} \right\}$. In all other cases $v$ ends up with different colors in $X'$ and $Y'$, so we do not need to specify how to match the remaining events. Now, observe that
$$ \mathbb{P}\left[ X'(v) \neq Y'(v) \mid v \text{ chosen} \right] = 1 - \frac{|A(X,v) \cap A(Y,v)|}{\max\{|A(X,v)|, |A(Y,v)|\}} = \frac{\max\{|A(X,v)|, |A(Y,v)|\} - |A(X,v) \cap A(Y,v)|}{\max\{|A(X,v)|, |A(Y,v)|\}} \leq \frac{|\{u \sim v : X(u) \neq Y(u)\}|}{\max\{|A(X,v)|, |A(Y,v)|\}} \leq \frac{(1 - \beta)\, |\{u \sim v : X(u) \neq Y(u)\}|}{\Delta}. $$
The second-to-last inequality can be justified as follows: say $|A(X,v)| \geq |A(Y,v)|$. Then for every color $c \in A(X,v) \setminus (A(X,v) \cap A(Y,v))$ there must be a neighbor $u$ of $v$ such that $Y(u) = c \neq X(u)$. The last inequality follows by the assumption of the lemma. Therefore, for every vertex $v$,
$$ \mathbb{P}\left[ X'(v) \neq Y'(v) \right] \leq \frac{1 - \beta}{n \Delta}\, |\{u \sim v : X(u) \neq Y(u)\}| + \left( 1 - \frac{1}{n} \right) \mathbb{I}\left[ X(v) \neq Y(v) \right]. $$
Summing this over all $v$, and using that each vertex $u$ with $X(u) \neq Y(u)$ has at most $\Delta$ neighbors, so that $\sum_v |\{u \sim v : X(u) \neq Y(u)\}| \leq \Delta\, d(X, Y)$, we get
$$ \mathbb{E}\left[ d(X', Y') \mid X, Y \right] \leq \frac{1 - \beta}{n}\, d(X, Y) + \left( 1 - \frac{1}{n} \right) d(X, Y) = \left( 1 - \frac{\beta}{n} \right) d(X, Y), $$
as desired.
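The coupling used in this proof is easy to implement. The following sketch, which we add for illustration (the helper names are ours), performs one coupled heat-bath update so that every shared available color is chosen by both chains with the probability specified in the proof, while each chain's marginal stays a uniformly random available color.

```python
import random

def available(adj, X, v, q):
    """A(X, v): the colors not appearing on any neighbor of v."""
    seen = {X[u] for u in adj[v]}
    return [c for c in range(q) if c not in seen]

def pick(weights):
    """Sample a key of the dict `weights` with probability proportional to its value."""
    keys = list(weights)
    return random.choices(keys, weights=[weights[k] for k in keys])[0]

def coupled_step(adj, X, Y, q):
    """One step of the coupling in Lemma 7.3: both chains recolor the same
    random vertex v, and every color c in A(X,v) intersected with A(Y,v) is
    chosen by BOTH chains with probability min(1/|A(X,v)|, 1/|A(Y,v)|)."""
    v = random.randrange(len(adj))
    AX, AY = available(adj, X, v, q), available(adj, Y, v, q)
    shared = set(AX) & set(AY)
    m = min(1 / len(AX), 1 / len(AY))
    if random.random() < m * len(shared):
        X[v] = Y[v] = random.choice(sorted(shared))   # the chains agree at v
    else:
        # Residual masses; each marginal remains uniform on its available set.
        resX = {c: 1 / len(AX) - (m if c in shared else 0) for c in AX}
        resY = {c: 1 / len(AY) - (m if c in shared else 0) for c in AY}
        X[v], Y[v] = pick(resX), pick(resY)
```

Conditioned on updating $v$, the two copies disagree at $v$ with probability exactly $1 - |A(X,v) \cap A(Y,v)| \cdot \min\{\frac{1}{|A(X,v)|}, \frac{1}{|A(Y,v)|}\}$, which is the quantity bounded in the proof.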

7.2 Coupling with stationarity

Using Lemmas 7.2 and 7.3, to prove Theorem 7.1 it remains to show that in a coupling proof it is enough that the distance decreases for most states $Y$:

Theorem 7.4. Suppose that the distance function $d(\cdot, \cdot)$ is integer-valued. For $\epsilon > 0$ and $S \subseteq \Omega$, suppose that for all $X \in \Omega$ and $Y \in S$ there is a coupling $(X, Y) \to (X', Y')$ such that
$$ \mathbb{E}\left[ d(X', Y') \mid X, Y \right] \leq (1 - \epsilon)\, d(X, Y). $$
If $\pi(\Omega \setminus S) \leq \frac{\epsilon}{2\, \mathrm{diam}(\Omega)}$, then
$$ \tau_{\mathrm{mix}} \leq O\left( \frac{\log(\mathrm{diam}(\Omega))}{\epsilon} \right). $$

Proof. The main insight of the proof is that in a coupling argument we can always assume $X_0 = x$ for some state $x \in \Omega$ and $Y_0 \sim \pi$. Then $Y_t \sim \pi$ for all $t$, so $\mathbb{P}[Y_t \notin S] = \pi(\Omega \setminus S)$. As usual, if $X_t = Y_t$ then we let $X_{t+1} = Y_{t+1}$. Otherwise, if $Y_t \notin S$ we just follow an independent coupling of $X_t, Y_t$, and if $Y_t \in S$ we use the coupling promised in the statement of the theorem. Now, observe that on the event $Y_t \notin S$ we have no control over $d(X_{t+1}, Y_{t+1})$, but in the worst case it is at most $\mathrm{diam}(\Omega)$; otherwise we know the distance decreases. Therefore,
$$ \mathbb{E}\left[ d(X', Y') \right] = \mathbb{E}\left[ d(X', Y') \mid Y \in S \right] \mathbb{P}[Y \in S] + \mathbb{E}\left[ d(X', Y') \mid Y \notin S \right] \mathbb{P}[Y \notin S] \leq (1 - \epsilon)\, d(X, Y) + \mathrm{diam}(\Omega) \cdot \frac{\epsilon}{2\, \mathrm{diam}(\Omega)} \leq \left( 1 - \frac{\epsilon}{2} \right) d(X, Y), $$
where we used that $d(X, Y) \geq 1$, because $X \neq Y$ and $d(\cdot, \cdot)$ is integer-valued. Iterating, and using Markov's inequality together with integrality, $\mathbb{P}[X_t \neq Y_t] \leq \mathbb{E}[d(X_t, Y_t)] \leq (1 - \epsilon/2)^t\, \mathrm{diam}(\Omega)$, so the chain mixes in time $O(\frac{1}{\epsilon} \log(\mathrm{diam}(\Omega)))$.

Taking $S$ to be the set of colorings with the local uniformity property of Lemma 7.2 and using the coupling of Lemma 7.3 for $Y \in S$, this completes the proof of Theorem 7.1.

7.3 Mixing Time Using Eigenvalues

In this section we discuss the spectral theory of reversible Markov chains. Based on this theory, in the next lecture we introduce the path technology, which is one of the strongest tools for analyzing Markov chains. We focus on reversible chains in particular because every chain that we will use in our counting tasks is reversible.

In this section we treat the Markov kernel $K$ as an operator. It turns out that although $K$ is not a symmetric matrix (operator), it has real eigenvalues. There are several ways to see this; perhaps one of the cleanest is to study $K$ in a different inner product space, in which $K$ is self-adjoint, i.e., symmetric. The inner product space $L^2(\pi)$ is defined as follows:
$$ \langle f, g \rangle = \sum_x f(x)\, g(x)\, \pi(x). $$
In particular, $\|f\|^2 = \sum_x f(x)^2\, \pi(x)$. Recall that for any vector/function $f$,
$$ (Kf)(x) = \sum_y K(x, y)\, f(y). $$
Although $K$ is not a symmetric matrix, it is self-adjoint with respect to the above inner product, i.e., for any two functions $f, g$, $\langle Kf, g \rangle = \langle f, Kg \rangle$. This is because, using reversibility $\pi(x) K(x, y) = \pi(y) K(y, x)$ in the fourth equality,
$$ \langle Kf, g \rangle = \sum_x (Kf)(x)\, g(x)\, \pi(x) = \sum_x \Big( \sum_y K(x, y) f(y) \Big) g(x)\, \pi(x) = \sum_{x, y} K(x, y)\, f(y)\, g(x)\, \pi(x) = \sum_{x, y} K(y, x)\, f(y)\, g(x)\, \pi(y) = \langle f, Kg \rangle. $$
Now we can apply the spectral theorem: $K$ has real eigenvalues $\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_{|\Omega|}$ with a corresponding orthonormal set of eigenfunctions $\psi_1, \dots, \psi_{|\Omega|}$, such that for all $i$, and all $i \neq j$,
$$ K \psi_i = \lambda_i \psi_i, \qquad \langle \psi_i, \psi_j \rangle = 0. $$
It is not hard to see that we always have $\lambda_1 = 1$ with $\psi_1 = \mathbf{1}$, the all-ones function, because $K$ is stochastic. Furthermore, also by stochasticity, $|\lambda_i| \leq 1$ for all $i$. We can translate the ergodicity properties of a Markov chain into this language: a (reversible) Markov chain is irreducible iff $\lambda_2 < 1$, and a (reversible) Markov chain is aperiodic, i.e., it is not bipartite, iff $\lambda_{|\Omega|} > -1$. Let
$$ \lambda^* = \max_{i > 1} |\lambda_i| = \max\{ \lambda_2, |\lambda_{|\Omega|}| \}. $$
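In matrix form, reversibility means that $S = D^{1/2} K D^{-1/2}$ is symmetric, where $D = \mathrm{diag}(\pi)$, and $S$ has the same eigenvalues as $K$. This gives a convenient way to compute the spectrum numerically. The following sketch is our illustration (the lazy walk on a path is an arbitrary example, not from the notes):

```python
import numpy as np

def spectrum(K, pi):
    """Eigenvalues and L2(pi)-orthonormal eigenfunctions of a reversible
    kernel K, via the symmetrization S = D^{1/2} K D^{-1/2}."""
    d = np.sqrt(pi)
    S = (K * d[:, None]) / d[None, :]   # S[x, y] = sqrt(pi[x]/pi[y]) * K[x, y]
    lam, U = np.linalg.eigh(S)          # S is symmetric, so the spectrum is real
    lam, U = lam[::-1], U[:, ::-1]      # sort so that lambda_1 >= lambda_2 >= ...
    psi = U / d[:, None]                # psi_i = D^{-1/2} u_i is orthonormal in L2(pi)
    return lam, psi

# Lazy simple random walk on the 5-vertex path (reversible w.r.t. degrees).
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # adjacency of the path
deg = A.sum(axis=1)
K = 0.5 * np.eye(5) + 0.5 * A / deg[:, None]           # K = (I + D^{-1} A) / 2
pi = deg / deg.sum()                                   # stationary distribution
lam, psi = spectrum(K, pi)
print(lam)   # lambda_1 = 1, all eigenvalues real; laziness makes them nonnegative
```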

Next, we show that if $\lambda^* < 1$, then we can bound the mixing time by roughly $\log(n) / (1 - \lambda^*)$. Firstly, it turns out that it is more convenient to measure the mixing time in the $L^2$ distance as opposed to the total variation distance; this is because eigenvalues are $L^2$ properties of the chain.

Definition 7.5 ($L^p$ distance). For a kernel $K$, the $L^p$ distance of $K^t(x, \cdot)$ to stationarity is defined as
$$ \left\| \frac{K^t(x, \cdot)}{\pi(\cdot)} - 1 \right\|_p = \left( \sum_y \pi(y) \left| \frac{K^t(x, y)}{\pi(y)} - 1 \right|^p \right)^{1/p}. $$
Here we are thinking of $K^t(x, \cdot)/\pi(\cdot)$ as a function of $y$. Observe that
$$ 2\, \| K^t(x, \cdot) - \pi \|_{TV} = \sum_y \left| K^t(x, y) - \pi(y) \right| = \sum_y \pi(y) \left| \frac{K^t(x, y)}{\pi(y)} - 1 \right| = \left\| \frac{K^t(x, \cdot)}{\pi} - 1 \right\|_1 \leq \left\| \frac{K^t(x, \cdot)}{\pi} - 1 \right\|_2, \tag{7.1} $$
where the last step is the Cauchy-Schwarz inequality. So, if we prove that the $L^2$ distance is small, it automatically implies that the total variation distance is small. Note that if $p = \infty$, then the distance of $K^t/\pi$ from $1$ is $\max_y |K^t(x, y)/\pi(y) - 1|$.

Theorem 7.6. Let $\lambda^* = \max_{i > 1} |\lambda_i| = \max\{ \lambda_2, |\lambda_{|\Omega|}| \}$. Then,
$$ \left\| \frac{K^t(x, \cdot)}{\pi} - 1 \right\|_2 \leq \frac{(\lambda^*)^t}{\sqrt{\pi(x)}}. $$
Therefore, by (7.1),
$$ \tau_x(\epsilon) \leq \frac{\log\left( \frac{1}{\epsilon \sqrt{\pi(x)}} \right)}{1 - \lambda^*}. $$

Note that this bound is weak in applications where $\pi$ is very far from the uniform distribution, since $1/\sqrt{\pi(x)}$ can be huge. In these scenarios we want to start the chain from a state $x$ with large $\pi(x)$; this idea is usually called a warm start.

Proof. First, observe that for any eigenfunction $\psi_i$ with $i > 1$ we have
$$ \left\langle \frac{K^t(x, \cdot)}{\pi} - 1,\ \psi_i \right\rangle = \sum_y \left( \frac{K^t(x, y)}{\pi(y)} - 1 \right) \psi_i(y)\, \pi(y) = \sum_y K^t(x, y)\, \psi_i(y) - \langle \mathbf{1}, \psi_i \rangle = (K^t \psi_i)(x) = \lambda_i^t\, \psi_i(x), $$
using $\langle \mathbf{1}, \psi_i \rangle = \langle \psi_1, \psi_i \rangle = 0$. The same calculation shows $\langle \frac{K^t(x, \cdot)}{\pi} - 1, \mathbf{1} \rangle = 1 - 1 = 0$. Therefore,
$$ \frac{K^t(x, \cdot)}{\pi} - 1 = \sum_{i > 1} \left\langle \frac{K^t(x, \cdot)}{\pi} - 1,\ \psi_i \right\rangle \psi_i = \sum_{i > 1} \lambda_i^t\, \psi_i(x)\, \psi_i, $$
and
$$ \left\| \frac{K^t(x, \cdot)}{\pi} - 1 \right\|^2 = \sum_{i > 1} \lambda_i^{2t}\, \psi_i(x)^2 \leq (\lambda^*)^{2t} \sum_{i > 1} \psi_i(x)^2 \leq \frac{(\lambda^*)^{2t}}{\pi(x)}. $$
The only nontrivial step is the last inequality, which holds because
$$ \sum_i \psi_i(x)^2 = \frac{1}{\pi(x)^2} \sum_i \langle \psi_i, \mathbb{1}_x \rangle^2 = \frac{\| \mathbb{1}_x \|^2}{\pi(x)^2} = \frac{\pi(x)}{\pi(x)^2} = \frac{1}{\pi(x)}, $$
where $\mathbb{1}_x$ is the indicator function of $x$; here we used $\langle \psi_i, \mathbb{1}_x \rangle = \psi_i(x)\, \pi(x)$ and Parseval's identity in the orthonormal basis $\psi_1, \dots, \psi_{|\Omega|}$.
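As a sanity check, the bound of Theorem 7.6 can be verified numerically on the small reversible chain from the previous sketch (again our illustration, with arbitrary parameters):

```python
import numpy as np

# Setup as in the previous sketch: lazy random walk on the 5-vertex path.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
deg = A.sum(axis=1)
K = 0.5 * np.eye(5) + 0.5 * A / deg[:, None]
pi = deg / deg.sum()
d = np.sqrt(pi)
lam = np.sort(np.linalg.eigvalsh(K * d[:, None] / d[None, :]))[::-1]
lam_star = np.abs(lam[1:]).max()          # lambda* = max_{i>1} |lambda_i|

def l2_distance(K, pi, x, t):
    """|| K^t(x, .) / pi - 1 ||_2, as in Definition 7.5 with p = 2."""
    row = np.linalg.matrix_power(K, t)[x]  # the distribution K^t(x, .)
    return np.sqrt(np.sum(pi * (row / pi - 1.0) ** 2))

x = 0
for t in [1, 5, 10, 20]:
    # Theorem 7.6: the L2 distance at time t is at most (lambda*)^t / sqrt(pi(x)).
    print(t, l2_distance(K, pi, x, t), lam_star ** t / np.sqrt(pi[x]))

# Mixing time bound from Theorem 7.6 with eps = 1/4, from the worst start;
# a warm start, i.e. a state x with larger pi(x), gives a smaller bound.
eps = 0.25
print(np.log(1 / (eps * np.sqrt(pi.min()))) / (1 - lam_star))
```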

References

[HV05] Tom Hayes and Eric Vigoda. Coupling with the stationary distribution and improved sampling for colorings and independent sets. In SODA, Society for Industrial and Applied Mathematics, 2005.
