Lecture 9 Sept 29, 2017


Sketching Algorithms for Big Data (Fall 2017)
Prof. Jelani Nelson
Lecture 9, Sept 29, 2017
Scribe: Mitali Bafna

1 Fast JL transform

Typically we have some high-dimensional computational geometry problem, and we use JL to speed up our algorithm in two steps: (1) apply a JL map Π to reduce the problem to low dimension m, then (2) solve the lower-dimensional problem. As m is made smaller, typically (2) becomes faster. However, ideally we would also like step (1) to be as fast as possible. In this section, we investigate two approaches to speed up the computation of Πx. One of the analyses will make use of the following Chernoff bound.

Theorem 1 (Chernoff bound). Let X_1, ..., X_n be independent random variables in [0, τ], and write μ := E Σ_i X_i. Then ∀ε > 0,

    P(|Σ_i X_i − μ| > εμ) < 2 · (e^ε / (1 + ε)^{1+ε})^{μ/τ}.

The approach we cover here was investigated by Ailon and Chazelle [AC09]. This approach gives a running time to compute Πx of roughly O(n log n). They called their transformation the Fast Johnson-Lindenstrauss Transform (FJLT). A construction similar to theirs, which we will analyze here, is the m × n matrix Π defined as

    Π = √(n/m) · SHD,                                                    (1)

where S is an m × n sampling matrix with replacement (each row has a 1 in a uniformly random location and zeroes elsewhere, and the rows are independent), H is a bounded orthonormal system, and D = diag(α) for a vector α of independent Rademachers. A bounded orthonormal system is a matrix H ∈ C^{n×n} such that H*H = I and max_{i,j} |H_{i,j}| ≤ 1/√n. For example, H can be the Fourier matrix or the Hadamard matrix.

The motivation for the construction (1) is speed: D can be applied in O(n) time, H in O(n log n) time (e.g. using the Fast Fourier Transform, or divide and conquer in the case of the Hadamard matrix), and S in O(m) time. Thus, overall, applying Π to any fixed vector x takes O(n log n) time. Compare this with using a dense matrix of Rademachers, which takes O(mn) time to apply.

We will now give some intuition behind why such a Π works. Consider the sampling matrix S, which samples a random coordinate of x. If the norm of x is spread out among its coordinates, then in expectation the norm of Sx is the norm of x. But what do we do in the case where x has mass only on a few coordinates? It is known that a Fourier matrix spreads out the mass of vectors with highly concentrated mass, and vice versa. So we multiply S with H, and to handle the case where H concentrates the mass of vectors with their mass spread out, we finally multiply x in the beginning by D_α.
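To make the O(n log n) claim concrete, here is a minimal sketch in Python of applying one draw of Π = √(n/m) · SHD via an in-place fast Walsh-Hadamard transform. This snippet is not part of the original notes; the helper names fwht and fjlt_apply and the restriction to n a power of two are our own assumptions.

```python
import numpy as np

def fwht(v):
    """Unnormalized fast Walsh-Hadamard transform, O(n log n) time; len(v) must be a power of 2."""
    v = np.asarray(v, dtype=float).copy()
    n, h = len(v), 1
    while h < n:
        for i in range(0, n, 2 * h):
            a, b = v[i:i + h].copy(), v[i + h:i + 2 * h].copy()
            v[i:i + h], v[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return v

def fjlt_apply(x, m, rng):
    """Apply one random draw of Pi = sqrt(n/m) * S H D to x, with H the normalized Hadamard matrix."""
    n = len(x)
    assert n > 0 and (n & (n - 1)) == 0, "n must be a power of two for the Hadamard transform"
    signs = rng.choice([-1.0, 1.0], size=n)      # D = diag of independent Rademachers, O(n)
    y = fwht(signs * x) / np.sqrt(n)             # H D x, where H has entries +-1/sqrt(n), O(n log n)
    rows = rng.integers(0, n, size=m)            # S: sample m coordinates with replacement, O(m)
    return np.sqrt(n / m) * y[rows]

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
x /= np.linalg.norm(x)
print(np.linalg.norm(fjlt_apply(x, m=256, rng=rng)) ** 2)   # concentrates around ||x||_2^2 = 1
```

The three factors are applied right to left, so the total cost is dominated by the O(n log n) transform, matching the claim above.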

1.1 Analysis of [AC09]

We will show that for m ≳ ε^{-2} log(1/δ) log(n/δ), the random Π described in (1) provides DJL. We will consider the case of H being the normalized Hadamard matrix, so that every entry of H is in {−1/√n, +1/√n}.

Theorem 2. Let x ∈ R^n be an arbitrary unit norm vector, and suppose 0 < ε, δ < 1/2. Also let Π = √(n/m) · SHD as described above, with a number of rows m ≳ ε^{-2} log(1/δ) log(n/δ). Then

    P_Π(|‖Πx‖₂² − 1| > ε) < δ.

Proof. Define y = HDx. The goal is to first show that ‖HDx‖_∞ = O(√(log(n/δ)/n)) with probability 1 − δ/2, and then, conditioned on this event, that (1 − ε) ≤ (n/m)‖Sy‖₂² = ‖Πx‖₂² ≤ (1 + ε) with probability 1 − δ/2.

For the first event, note that

    y_i = (HDx)_i = Σ_{j=1}^n α_j · ((1/√n) γ_{i,j} x_j) = ⟨α, z_i⟩,

where γ_{i,j} = ±1 and z_i is the vector with (z_i)_j = (1/√n) γ_{i,j} x_j, so ‖z_i‖₂ = 1/√n. Thus by Khintchine's inequality,

    ∀i ∈ [n], P(|y_i| > √(2 log(4n/δ)/n)) < 2e^{−log(4n/δ)} = δ/(2n).

Thus by a union bound,

    P(‖y‖_∞ > √(2 log(4n/δ)/n)) = P(∃i : |y_i| > √(2 log(4n/δ)/n)) < δ/2.

Now let us condition on this event, so that y_i² ≤ 2 log(4n/δ)/n =: τ/n for every i. For i ∈ [m], define X_i := n · (Sy)_i². The X_i are independent, lie in [0, τ], satisfy Σ_{i=1}^m X_i = n‖Sy‖₂², and have E Σ_{i=1}^m X_i = m‖y‖₂² = m. By the Chernoff bound above,

    P(|Σ_{i=1}^m X_i − m| > εm) < 2 · (e^ε / (1 + ε)^{1+ε})^{m/τ},

which is at most δ/2 for m ≳ ε^{-2} log(1/δ) log(n/δ).

Remark 1. Note that the FJLT as analyzed above provides suboptimal m. If one desires optimal m, one can instead use the embedding matrix Π′Π, where Π is the FJLT and Π′ is, say, a dense matrix with Rademacher entries having the optimal m′ = O(ε^{-2} log(1/δ)) rows. The downside is that the runtime to apply our embedding worsens by an additive m · m′. [AC09] slightly improved this additive term (by an ε² multiplicative factor) by replacing the matrix S with a random sparse matrix P.

Can a better analysis be given? Unfortunately not by much: the quadratic dependence on log(1/δ) needs to be there, by an example of Eric Price. The bad case is when x has value 1/n^{1/4} on each of the first √n coordinates, and one imagines δ = 2^{−√n}.
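As a quick numerical illustration of the flattening step in this proof (our own snippet, not part of the notes): the Hadamard matrix alone can concentrate all of the mass of a spread-out vector onto a single coordinate, while after multiplying by the random sign matrix D the quantity ‖HDx‖_∞ typically stays below the √(2 log(4n/δ)/n) threshold used above. The dense H built here is purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, delta = 1024, 0.01
H = np.array([[1.0]])
while H.shape[0] < n:                                  # Sylvester construction of the Hadamard matrix
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
H /= np.sqrt(n)                                        # bounded orthonormal system: entries +-1/sqrt(n)

x = np.full(n, 1.0 / np.sqrt(n))                       # flat unit-norm vector
print(np.abs(H @ x).max())                             # without D, H concentrates it: ||Hx||_inf = 1.0

peaks = [np.abs(H @ (rng.choice([-1.0, 1.0], size=n) * x)).max() for _ in range(200)]
print(max(peaks), np.sqrt(2 * np.log(4 * n / delta) / n))   # ||HDx||_inf over 200 sign draws vs. the threshold
```

The point of the first half of the proof is precisely that this flattening fails with probability at most δ/2.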

1.2 Analysis based on RIP

Here we give a different analysis, based on combining the main results of [KW11] and [RV08], which use the method of chaining as seen in the last lecture. First we have to give a definition.

Definition 1. We say a matrix Π ∈ R^{m×n} satisfies the (ε, k)-restricted isometry property (or RIP for short) if for all k-sparse vectors x of unit Euclidean norm,

    1 − ε ≤ ‖Πx‖₂² ≤ 1 + ε.

Using the fact that the operator norm of a symmetric matrix M is equal to sup_{‖x‖₂=1} |x^T M x|, it follows that being (ε, k)-RIP is equivalent to

    sup_{T ⊆ [n], |T| ≤ k} ‖(Π_{(T)})* Π_{(T)} − I‖ < ε,

where Π_{(T)} is the m × |T| matrix obtained by restricting Π to the columns in T.

As we will see later in the course, this notion of RIP is useful for compressed sensing, which is closely related to the heavy hitters problem. For now, we will just use it to obtain fast JL by combining it with the following theorem of [KW11].

Theorem 3. There exists a universal constant C > 0 such that the following holds. Suppose A satisfies (ε/C, k)-RIP for k ≥ C log(1/δ), and let α ∈ {−1, 1}^n be chosen uniformly at random. Then for any x ∈ R^n of unit norm,

    P_α(|‖AD_α x‖₂² − 1| > ε) < δ.

In other words, the probability distribution Π = AD_α over matrices, induced by α, satisfies the distributional JL property.

We will not prove Theorem 3 here, but we will show that the matrix √(n/m) · SH satisfies RIP with positive probability for fairly small m. That is, there does exist some choice of few rows of a bounded orthonormal system that gives RIP (though unfortunately we do not know which explicit set works; see [BDF+11]). A number of bounds on the best m to achieve RIP when sampling Fourier/Hadamard rows have been given, starting with the work of Candès and Tao [CT06]; subsequent works gave better bounds [RV08, Bou14, HR16]. An analysis was also given for a related construction in [NPW14]. We will give the analysis of [RV08], since it is the most similar to what we saw in the last lecture.
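As a concrete, brute-force illustration of Definition 1 and its operator-norm reformulation, the sketch below computes the RIP constant of one random draw of √(n/m) · SH on a tiny example by enumerating all supports of size k. The construction details, the parameters, and the helper name rip_constant are our own choices and not code from the notes.

```python
import itertools
import numpy as np

def rip_constant(Pi, k):
    """Largest value of ||Pi_T^* Pi_T - I|| over all supports T of size k (operator-norm form of RIP)."""
    n = Pi.shape[1]
    worst = 0.0
    for T in itertools.combinations(range(n), k):
        cols = list(T)
        G = Pi[:, cols].T @ Pi[:, cols] - np.eye(k)
        worst = max(worst, np.linalg.norm(G, 2))      # spectral norm of the Gram deviation
    return worst

rng = np.random.default_rng(2)
n, m, k = 64, 48, 3
H = np.array([[1.0]])
while H.shape[0] < n:                                  # Sylvester construction of the Hadamard matrix
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
H /= np.sqrt(n)
kept = rng.choice(n, size=m, replace=False)            # keep a uniformly random subset of m rows
Pi = np.sqrt(n / m) * H[kept, :]
print(rip_constant(Pi, k))                             # smallest eps for which this draw is (eps, k)-RIP
```

Combined with Theorem 3, multiplying such a Π on the right by a random sign matrix D_α yields a distributional JL matrix.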

Recall that for T ⊆ R^n,

    r(T) := E_σ sup_{x ∈ T} |⟨σ, x⟩|.

Last lecture we did not include the absolute values, but this does not make much of a difference (the Khintchine tail bound only differs by a factor of two). Also recall that we showed, for T a set of vectors of at most unit norm,

    r(T) ≲ Σ_{k=1}^∞ (1/2^k) · lg^{1/2} N(T, ‖·‖₂, 1/2^k) ≃ ∫_0^∞ lg^{1/2} N(T, ‖·‖₂, u) du ≃ inf_{{T_r}} Σ_r 2^{r/2} · sup_{x ∈ T} d(x, T_r),

where the infimum is over sequences of sets {T_r}_{r ≥ 1} with |T_r| ≤ 2^{2^r}. This was the Dudley bound. Let us now show that m ≳ ε^{-2} k log⁴ n suffices for (ε, k)-RIP.

We will analyze a slightly different construction, just for ease of notation. Instead of sampling m rows from H, we will simply keep each row with probability m/n, independently. Let η_i be an indicator random variable for whether we keep row i. Also, let us define z_i to be √n times the i-th row of H, so z_i ∈ {−1, 1}^n; writing z_i^{(T)} for the restriction of z_i to the coordinates in T, note that E_η (1/m) Σ_i η_i z_i^{(T)} (z_i^{(T)})^T = (1/n) Σ_i z_i^{(T)} (z_i^{(T)})^T = I. We let

    β := E_η sup_{|T| ≤ k} ‖ (1/m) Σ_{i=1}^n η_i z_i^{(T)} (z_i^{(T)})^T − I ‖,

and we will now get an upper bound for β in terms of β itself. We have

    E_η sup_{|T| ≤ k} ‖ (1/m) Σ_i η_i z_i^{(T)} (z_i^{(T)})^T − I ‖
      = E_η sup_{|T| ≤ k} ‖ (1/m) Σ_i η_i z_i^{(T)} (z_i^{(T)})^T − E_{η′} (1/m) Σ_i η′_i z_i^{(T)} (z_i^{(T)})^T ‖
      ≤ E_{η,η′} sup_{|T| ≤ k} ‖ (1/m) Σ_i (η_i − η′_i) z_i^{(T)} (z_i^{(T)})^T ‖                        (Jensen's inequality)
      = (1/m) · E_{η,η′,σ} sup_{|T| ≤ k} ‖ Σ_i σ_i (η_i − η′_i) z_i^{(T)} (z_i^{(T)})^T ‖                (symmetrization over σ)
      ≤ (2/m) · E_η E_σ sup_{|T| ≤ k} ‖ Σ_i σ_i η_i z_i^{(T)} (z_i^{(T)})^T ‖                            (triangle inequality)
      = (2/m) · E_η E_σ sup_{|T| ≤ k} sup_{x ∈ R^{|T|}, ‖x‖₂ = 1} | Σ_i σ_i η_i ⟨x, z_i^{(T)}⟩² |        (definition of the operator norm of a symmetric matrix)
      = (2/m) · E_η E_σ sup_{x ∈ D_{k,2}} | Σ_i σ_i η_i ⟨x, z_i⟩² |,

where D_{k,2} is the set of all k-sparse unit vectors in R^n. Conditioning on η, the inner expectation is exactly r(T_η) for

    T_η := { (η_1 ⟨x, z_1⟩², ..., η_n ⟨x, z_n⟩²) : x ∈ D_{k,2} },    r(T_η) = E_σ sup_{z ∈ T_η} |⟨σ, z⟩|,

so β ≤ (2/m) E_η r(T_η). Dudley's inequality gives us that r(T_η) ≲ ∫_0^∞ lg^{1/2} N(T_η, ℓ₂, u) du. To control these covering numbers, let g(x) := (η_1 ⟨x, z_1⟩², ..., η_n ⟨x, z_n⟩²), with g(y) defined similarly, so that T_η = {g(x) : x ∈ D_{k,2}}. Since η_i² = η_i, ⟨x, z_i⟩² − ⟨y, z_i⟩² = ⟨x − y, z_i⟩ · ⟨x + y, z_i⟩, and Σ_i η_i ⟨x, z_i⟩² ≤ m(β + 1) for any x ∈ D_{k,2}, we have that

    ‖g(x) − g(y)‖₂ ≤ 2 max_{1 ≤ j ≤ n} |⟨z_j, x − y⟩| · √m · (β + 1)^{1/2}.

So, writing ‖v‖_X := max_{1 ≤ j ≤ n} |⟨z_j, v⟩| and R := (1/m) · (∫_0^∞ lg^{1/2} N(D_{k,2}, ‖·‖_X, u) du)², a change of variables in the entropy integral gives

    β ≲ (β + 1)^{1/2} · (1/√m) ∫_0^∞ lg^{1/2} N(D_{k,2}, ‖·‖_X, u) du = ((β + 1) · R)^{1/2},

which implies that β² − CRβ − CR ≤ 0 for an absolute constant C > 0, and hence β ≤ CR + √(CR); in particular, β ≤ ε once R ≲ ε².

References

[AC09] Nir Ailon and Bernard Chazelle. The fast Johnson-Lindenstrauss transform and approximate nearest neighbors. SIAM J. Comput., 39(1):302-322, 2009.

[BDF+11] Jean Bourgain, Stephen Dilworth, Kevin Ford, Sergei Konyagin, and Denka Kutzarova. Explicit constructions of RIP matrices and related problems. Duke Mathematical Journal, 159(1):145-185, 2011.

[Bou14] Jean Bourgain. An improved estimate in the restricted isometry problem. Geometric Aspects of Functional Analysis, 2116:65-70, 2014.

[CT06] Emmanuel J. Candès and Terence Tao. Near-optimal signal recovery from random projections: universal encoding strategies? IEEE Trans. Inform. Theory, 52(12):5406-5425, 2006.

[HR16] Ishay Haviv and Oded Regev. The restricted isometry property of subsampled Fourier matrices. In Proceedings of the Twenty-Seventh Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2016.

[KW11] Felix Krahmer and Rachel Ward. New and improved Johnson-Lindenstrauss embeddings via the Restricted Isometry Property. SIAM J. Math. Anal., 43(3):1269-1281, 2011.

[NPW14] Jelani Nelson, Eric Price, and Mary Wootters. New constructions of RIP matrices with fast multiplication and fewer rows. In Proceedings of the 25th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), January 2014.

[RV08] Mark Rudelson and Roman Vershynin. On sparse reconstruction from Fourier and Gaussian measurements. Comm. Pure Appl. Math., 61(8):1025-1045, 2008.
