Lecture 4: Constant Time SVD Approximation
Spectral Algorithms and Representations                          Feb. 17, Mar. 3 and 8, 2005

Lecture 4: Constant Time SVD Approximation

Lecturer: Santosh Vempala                                        Scribe: Jiangzhuo Chen

This topic consists of three lectures (02/17, 03/03, 03/08), based on [KV04]. We are interested in the following problem.

Problem: Given A ∈ R^{m×n}, find D ∈ R^{m×n} with rank(D) ≤ k to approximate A. Formally,

    min_{D : rank(D) ≤ k} ||A − D||_F                                          (1)

Notation. Let λ_t, u^{(t)}, v^{(t)} denote the t-th singular value, left singular vector, and right singular vector of A, respectively. Let A_i denote the i-th column of A and A_{(i)} the i-th row of A. Let

    A_k = Σ_{t=1}^{k} λ_t u^{(t)} v^{(t)T} = Σ_{t=1}^{k} A v^{(t)} v^{(t)T}.

By a theorem of Eckart and Young, A_k is the optimal solution to (1), and ||A − A_k||_F^2 = Σ_{t=k+1}^{r} λ_t^2, where r = rank(A). So one way to solve (1) is to find the top k right singular vectors of A, {v^{(t)}}_{t=1}^{k}.

1 Computing the SVD

Given an m×n matrix, it takes Θ(mn) time just to read the input. We want to find its top k right singular vectors. Notice that we may only get approximations, since the singular values can be irrational. So, for the top right singular vector, we want to find a unit vector ṽ such that ||Aṽ|| ≥ (1 − ε)λ_1 and ||ṽ − v^{(1)}|| ≤ ε, for some given accuracy parameter ε. The following power method can find the top right singular vector.

Power method:
1. Let v_0 be a random unit vector in R^n.
2. Repeat:  v_{t+1} = (A^T A) v_t / ||(A^T A) v_t||.

Remark 1.1. There are several questions concerning the power method.

How do we generate a random unit vector v_0 ∈ R^n? All we need is a uniform distribution on the surface of the unit sphere. We can use any spherical distribution to generate a random vector and scale it to unit length.
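As a concrete aside (not part of the original notes), the power method above can be sketched in a few lines. NumPy is assumed, and the function name, tolerance, and iteration cap are illustrative. The starting vector is generated exactly as Remark 1.1 suggests: a Gaussian vector scaled to unit length.

```python
import numpy as np

def power_method(A, eps=1e-6, max_iter=1000, seed=0):
    """Approximate the top right singular vector of A by iterating
    v <- (A^T A) v / ||(A^T A) v||."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # A Gaussian vector is spherically symmetric; scaling it to unit
    # length gives a uniform point on the unit sphere (Remark 1.1).
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(max_iter):
        w = A.T @ (A @ v)                  # one O(mn) round
        norm_w = np.linalg.norm(w)
        if norm_w == 0.0:                  # degenerate start: A v = 0
            return v
        w /= norm_w
        if np.linalg.norm(w - v) < eps:    # fixed point reached (up to eps)
            return w
        v = w
    return v
```

For A = diag(3, 1), for instance, the iterate converges to ±e_1, the right singular vector of the largest singular value.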
Does the iteration converge to the top right singular vector v^{(1)}? How fast? If v_t = v^{(1)}, then

    (A^T A) v_t = A^T (λ_1 u^{(1)}) = λ_1 (A^T u^{(1)}) = λ_1^2 v^{(1)}

and

    v_{t+1} = (A^T A) v_t / ||(A^T A) v_t|| = v^{(1)} = v_t,

so v^{(1)} is a fixed point of the iteration. It can be shown that v_t ≈ ṽ after O((1/ε) log n) iterations. Since each round takes O(mn) time, the time complexity of the power method is O((mn/ε) log n).

Exercise: Prove bounds on the convergence of the power method. (Hint: use the SVD. See Lemma 1 and Theorem 1 in [CKVW] for details.)

How do we find the top k right singular vectors? Find the top one, compute A − A v v^T, and repeat. This takes time O((k mn/ε) log n).

Can we have a smaller time complexity when we have a sparse matrix? Suppose A has only M nonzero entries. Each iteration of the power method takes O(M) time, so it takes O((M/ε) log n) time to find v^{(1)}. For a sparse matrix, can we achieve O((k M/ε) log n) time complexity for k singular vectors? Notice that in the search for the top singular vector, each iteration of the power method takes O(M) time. After computing A − A v v^T, however, the matrix is no longer sparse. Fortunately, we can extend the power method as follows.

1. Randomly choose an orthonormal matrix V_0 = (v_0^{(1)}, ..., v_0^{(k)}) ∈ R^{n×k}.
2. Repeat:
       V'_{t+1} = (A^T A) V_t
       V_{t+1}  = V'_{t+1} diag( ||(V'_{t+1})_1||^{-1}, ..., ||(V'_{t+1})_k||^{-1} )    (normalize each column)

Note that if V_t = V_k = (v^{(1)}, ..., v^{(k)}), then

    (A^T A) V_t = A^T (A v^{(1)}, ..., A v^{(k)}) = A^T (λ_1 u^{(1)}, ..., λ_k u^{(k)}) = (λ_1^2 v^{(1)}, ..., λ_k^2 v^{(k)})

and

    V_{t+1} = (λ_1^2 v^{(1)}, ..., λ_k^2 v^{(k)}) diag(λ_1^{-2}, ..., λ_k^{-2}) = (v^{(1)}, ..., v^{(k)}) = V_t.

2 Approximate SVD

Can we have a faster approximate solution to (1)? That is, we want to find D with rank(D) ≤ k such that ||A − D||_F is small.
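Before moving on, here is a sketch of the extended power method from Section 1, with one deliberate deviation (an assumption of this sketch, not the notes): instead of normalizing each column of (A^T A) V_t separately, it re-orthonormalizes the whole block with a QR factorization each round (classical subspace iteration), which keeps the columns from all drifting toward v^{(1)} in floating-point arithmetic. NumPy and the function name are assumptions.

```python
import numpy as np

def block_power_method(A, k, iters=200, seed=0):
    """Subspace iteration for the span of the top k right singular
    vectors of A. Each round costs O(k * nnz(A)): A is never deflated,
    so it stays sparse, which is the point of the extension."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # Random orthonormal V_0, via QR of a Gaussian matrix.
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        # V' = (A^T A) V, then re-orthonormalize the columns.
        V, _ = np.linalg.qr(A.T @ (A @ V))
    return V
```

For A = diag(5, 3, 1) and k = 2, the returned columns span the plane of the first two coordinate axes, the span of the top two right singular vectors.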
2.1 Existence of a Good Constant-Dimensional Subspace

It is shown in [KV04] that we only need to look at O(k/ε) rows of A.

Theorem 1 ([KV04], Theorem 2). Given A ∈ R^{m×n}, an integer k, and ε > 0, there exists a subset S of k/ε rows of A such that in its span lies a matrix à (i.e., every row of à is in span{S}) with the following property:

    ||A − Ã||_F^2 ≤ min_{D : rank(D) ≤ k} ||A − D||_F^2 + ε ||A||_F^2.          (2)

Proof: Pick k/ε rows from the following distribution (with multiplicity):

    P_i = Pr{row i is picked} = ||A_{(i)}||^2 / ||A||_F^2,    i = 1, ..., m.

We need to show there is nonzero probability that this subset has the desired property. It would be a bad idea to just pick the k rows with largest ||A_{(i)}||. (See Figure 1 for such an example.)

[Figure 1, not reproduced: the rows A_{(3)}, ..., A_{(m)} cluster around a single direction, while A_{(1)} and A_{(2)} point along other coordinate directions.]

Figure 1: Suppose ||A_{(1)}|| = ||A_{(2)}|| < ||A_{(3)}|| = ··· = ||A_{(m)}||. The optimal subspace is span{A_{(1)}, A_{(2)}, ..., A_{(k)}}, while the subspace spanned by the top k rows, span{A_{(3)}, ..., A_{(k+2)}}, is a bad subspace.

Let S be the chosen subset. We will identify vectors ŷ^{(1)}, ..., ŷ^{(k)} in span{S} such that ŷ^{(t)} is close to v^{(t)}, t = 1, ..., k. Notice that v^{(t)} is a linear combination of the rows of A:

    λ_t v^{(t)T} = u^{(t)T} A = Σ_{i=1}^{m} u_i^{(t)} A_{(i)}.

How do we approximate this linear combination? The random vector w^{(t)} defined as follows has mean λ_t v^{(t)T} with bounded variance:

    w^{(t)} = (1/|S|) Σ_{i ∈ S} (u_i^{(t)} / P_i) A_{(i)}.

Let s = |S|. Write w^{(t)} = (1/s) Σ_{j=1}^{s} X_j, where the X_j are i.i.d. as X = (u_i^{(t)} / P_i) A_{(i)} with probability P_i.
The mean of w^{(t)} is

    E[w^{(t)}] = E[X_j] = Σ_{i=1}^{m} P_i (u_i^{(t)} / P_i) A_{(i)} = Σ_{i=1}^{m} u_i^{(t)} A_{(i)} = λ_t v^{(t)T}.

The variance of w^{(t)} is

    E[ ||w^{(t)} − E[w^{(t)}]||^2 ] = (1/s) E[ (X − λ_t v^{(t)T}) (X − λ_t v^{(t)T})^T ]
                                    = (1/s) ( E[X X^T] − λ_t^2 )
                                    = (1/s) ( Σ_{i=1}^{m} P_i (u_i^{(t)})^2 ||A_{(i)}||^2 / P_i^2 − λ_t^2 )
                                    = (1/s) ( ||A||_F^2 Σ_{i=1}^{m} (u_i^{(t)})^2 − λ_t^2 )
                                    = (1/s) ( ||A||_F^2 − λ_t^2 )
                                    ≤ ||A||_F^2 / s,

using P_i = ||A_{(i)}||^2 / ||A||_F^2 and Σ_{i=1}^{m} (u_i^{(t)})^2 = 1. (Here w^{(t)} and X are row vectors, so X X^T = ||X||^2.)

Let ŷ^{(t)} = w^{(t)T} / λ_t and V_1 = span{ŷ^{(1)}, ..., ŷ^{(k)}}. We show that Proj_{V_1} A approximates A by showing an upper bound on E[ ||A − Proj_{V_1} A||_F^2 ]. Let  = Σ_{t=1}^{k} A v^{(t)} ŷ^{(t)T}. Every row of  lies in V_1, so, extending u^{(1)}, ..., u^{(r)} to an orthonormal basis {u^{(i)}} of R^m,

    ||A − Proj_{V_1} A||_F^2 ≤ ||A − Â||_F^2
        = Σ_i ||u^{(i)T} (A − Â)||^2
        = Σ_{i=1}^{k} ||u^{(i)T} (A − Â)||^2 + Σ_{i>k} ||u^{(i)T} (A − Â)||^2
        = Σ_{i=1}^{k} ||λ_i v^{(i)T} − λ_i ŷ^{(i)T}||^2 + Σ_{i>k} λ_i^2
        = Σ_{i=1}^{k} ||λ_i v^{(i)T} − w^{(i)}||^2 + Σ_{i>k} λ_i^2,

since u^{(i)T} Â = λ_i ŷ^{(i)T} for i ≤ k and u^{(i)T} Â = 0 for i > k; the general error splits into the error of the orthogonal projection onto the top k directions and the error Σ_{i>k} λ_i^2 of the optimal rank-k approximation. Therefore,

    E[ ||A − Proj_{V_1} A||_F^2 ] ≤ ||A − A_k||_F^2 + Σ_{i=1}^{k} E[ ||w^{(i)} − λ_i v^{(i)T}||^2 ]
        ≤ ||A − A_k||_F^2 + (k/s) ||A||_F^2
        ≤ ||A − A_k||_F^2 + ε ||A||_F^2      (when s = k/ε).

The existence of a subset S satisfying the properties in the theorem follows from the inequality on the expectation.

Remark 2.1. Theorem 1 is an existential theorem: we do not know the coefficients u_i^{(t)} in the definition of w^{(t)}. The corresponding algorithmic result is given in [DK+04], presented on 02/24 and 03/01.

2.2 Constant Time SVD Approximation Algorithm

Lemma 1 ([KV04], Lemma 2). Let M ∈ R^{a×b}. Let Q = (Q_1, Q_2, ..., Q_a) be a probability distribution on [a] such that

    Q_i ≥ α ||M_{(i)}||^2 / ||M||_F^2,    i = 1, ..., a,

for some α ∈ (0, 1] (so when α = 1, we have equalities). Let σ = (i_1, ..., i_p) be p independent samples from [a], each following distribution Q. Let N ∈ R^{p×b} with N_{(t)} = M_{(i_t)} / √(p Q_{i_t}), t = 1, ..., p. Then

    E[ ||M^T M − N^T N||_F^2 ] ≤ (1/(αp)) ||M||_F^4.                           (3)

Proof: We first show E[N^T N] = M^T M:

    E[ (N^T N)_{r,s} ] = Σ_{t=1}^{p} E[ N_{t,r} N_{t,s} ] = Σ_{t=1}^{p} Σ_{i=1}^{a} Q_i (M_{i,r} M_{i,s}) / (p Q_i) = (M^T M)_{r,s}.

Next we show a bound on E[ ( (N^T N)_{r,s} − (M^T M)_{r,s} )^2 ]:

    E[ ( (N^T N)_{r,s} − (M^T M)_{r,s} )^2 ]
        = Σ_{t=1}^{p} ( E[ (N_{t,r} N_{t,s})^2 ] − ( E[ N_{t,r} N_{t,s} ] )^2 )
        ≤ Σ_{t=1}^{p} Σ_{i=1}^{a} Q_i (M_{i,r} M_{i,s})^2 / (p Q_i)^2
        = (1/p) Σ_{i=1}^{a} M_{i,r}^2 M_{i,s}^2 / Q_i
        ≤ (1/(αp)) Σ_{i=1}^{a} M_{i,r}^2 M_{i,s}^2 ||M||_F^2 / ||M_{(i)}||^2.

Thus,

    E[ ||M^T M − N^T N||_F^2 ]
        = Σ_{r,s=1}^{b} E[ ( (N^T N)_{r,s} − (M^T M)_{r,s} )^2 ]
        ≤ (||M||_F^2 / (αp)) Σ_{i=1}^{a} (1 / ||M_{(i)}||^2) Σ_{r,s=1}^{b} M_{i,r}^2 M_{i,s}^2
        = (||M||_F^2 / (αp)) Σ_{i=1}^{a} ||M_{(i)}||^2
        = (1/(αp)) ||M||_F^4.

Remark 2.2. Lemma 1 suggests that we can approximate the eigenvectors of M^T M (equivalently, the right singular vectors of M, and the subspace spanned by them) by the eigenvectors of N^T N (the right singular vectors of N, and the subspace spanned by them). In our problem, if we sample the rows of A to get a p×n matrix S, and sample the columns of S to get a p×p matrix W, then we may use the subspace spanned by the left singular vectors of W to approximate the subspace spanned by the left singular vectors of S, and use the subspace spanned by the right singular vectors of S to approximate the subspace spanned by the right singular vectors of A. But can we use the subspace spanned by the left singular vectors of W to approximate the subspace spanned by the right singular vectors of A? They are not even of the same dimension. A key observation of [KV04] is that from the subspace spanned by the left singular vectors of S we can get an approximation of the subspace spanned by the right singular vectors of S.

Remark 2.3. With the Markov inequality, Lemma 1 implies that with probability at least 1 − 1/(θ^2 αp), we can assume ||M^T M − N^T N||_F ≤ θ ||M||_F^2.

Algorithm:
1. Input: A ∈ R^{m×n}, k, ε.
2. Set p = f(k, ε) = max(k^4/ε^3, k^3/ε^4).
3. (Row sampling) Let P = (P_1, P_2, ..., P_m) be a probability distribution on [m] such that P_i ≥ c ||A_{(i)}||^2 / ||A||_F^2, i = 1, ..., m, for some c ∈ (0, 1]. Let i_1, ..., i_p be p independent samples from [m], each following distribution P. Let S ∈ R^{p×n} with rows S_{(t)} = A_{(i_t)} / √(p P_{i_t}), t = 1, ..., p.
4. (Column sampling) Let P' = (P'_1, P'_2, ..., P'_n) be a probability distribution on [n] such that P'_j ≥ c ||S_j||^2 / ||S||_F^2, j = 1, ..., n. Let j_1, ..., j_p be p independent samples from [n], each following distribution P'. Let W ∈ R^{p×p} with columns W_t = S_{j_t} / √(p P'_{j_t}), t = 1, ..., p.
5. Compute the top k left singular vectors of W: u^{(1)}(W), ..., u^{(k)}(W).
6. (Filter) Let T = { t : λ_t(W)^2 ≥ γ ||W||_F^2 }, where γ = cε/(8k). For t ∈ T, let

       v̂^{(t)} = S^T u^{(t)}(W) / ||S^T u^{(t)}(W)||.

7. Output v̂^{(t)} for t ∈ T. (The rank-k approximation to A can be reconstructed as à = Σ_{t∈T} A v̂^{(t)} v̂^{(t)T}.)

Remark 2.4. Some comments about the algorithm.

An important observation that the algorithm is based on is that there exists a submatrix W of A whose size is only p×p, p = f(k, ε), such that W contains an implicit approximation to A satisfying (2). The algorithm enables us to answer in constant time the question: does there exist a good rank-k approximation to A?

Sampling: For the algorithm we make the following two assumptions on sampling.
1. We can pick row i of A with probability Q_i ≥ c ||A_{(i)}||^2 / ||A||_F^2, c ∈ (0, 1].
2. For any row i, we can pick the j-th entry with probability Q_{i,j} ≥ c A_{i,j}^2 / ||A_{(i)}||^2.

Note that if no entry of A is much larger than the average, then sampling according to the uniform distribution would be enough. To implement the column sampling step in the algorithm, we can pick a row uniformly from S and apply the second sampling assumption.

Suppose the entries of A come as a stream and we only have p×p memory. How do we achieve the sampling assumptions? Consider a simpler question: how do we pick one number from a stream a_1, a_2, ..., so that Pr[a = a_i] = a_i / Σ_j a_j, while keeping only one number at any time? The answer: upon seeing a_i, replace the currently stored number by a_i with probability a_i / (a_1 + ··· + a_i).

Proof Sketch: Define the difference between M and the projection of M onto the subspace span{ x^{(i)} : i ∈ I } as

    F(M; x^{(i)}, I) = ||M − Σ_{i∈I} M x^{(i)} x^{(i)T}||_F^2.                  (4)

If { x^{(i)} : i ∈ I } is an orthonormal basis, then

    F(M; x^{(i)}, I) = ||M||_F^2 − Σ_{i∈I} x^{(i)T} M^T M x^{(i)}.

Next we state the following lemmas, whose proofs can be found in [KV04].
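Putting steps 1 through 7 together, here is a sketch in NumPy (an illustration, not [KV04]'s implementation): it takes c = 1, i.e., exact length-squared sampling probabilities, and takes p and γ as parameters rather than computing them from k and ε. The function name approx_svd is an assumption.

```python
import numpy as np

def approx_svd(A, k, p, gamma, rng):
    """Sampled SVD approximation (steps 1-7 above, with c = 1).
    Returns the filtered approximate right singular vectors v_hat^(t)."""
    # Step 3 (row sampling): P_i = |A_(i)|^2 / |A|_F^2,
    # S_(t) = A_(i_t) / sqrt(p P_{i_t}).
    row_sq = np.einsum('ij,ij->i', A, A)
    P = row_sq / row_sq.sum()
    i = rng.choice(A.shape[0], size=p, p=P)
    S = A[i] / np.sqrt(p * P[i])[:, None]

    # Step 4 (column sampling): P'_j = |S_j|^2 / |S|_F^2,
    # W_t = S_{j_t} / sqrt(p P'_{j_t}).
    col_sq = np.einsum('ij,ij->j', S, S)
    Pp = col_sq / col_sq.sum()
    j = rng.choice(S.shape[1], size=p, p=Pp)
    W = S[:, j] / np.sqrt(p * Pp[j])[None, :]

    # Step 5: top k left singular vectors of the small p x p matrix W.
    U, sigma, _ = np.linalg.svd(W, full_matrices=False)

    # Step 6 (filter): keep t with lambda_t(W)^2 >= gamma |W|_F^2,
    # then v_hat^(t) = S^T u^(t)(W) / |S^T u^(t)(W)|.
    W_fro_sq = (W * W).sum()
    v_hats = []
    for t in range(k):
        if sigma[t] ** 2 >= gamma * W_fro_sq:
            y = S.T @ U[:, t]
            v_hats.append(y / np.linalg.norm(y))
    return v_hats
```

The rank-k approximation is then reconstructed as à = Σ_t A v̂^{(t)} v̂^{(t)T}; note that the only SVD ever computed is that of the p×p matrix W, whose size is independent of m and n.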
8 Lemma. [KV04] Lemma 3) Suppose A T A S T S θ A. Then a) or any par of unt vectors z, z n the row space of A, z T A T Az z T S T Sz θ A. b) or any set of vectors z 1),..., z l), l n the row space of A, A; z ), [l]) S; z ), [l]) θ A. ollowng from Lemma 1, the samplng n the algorthm enables us to mae use of Lemma for several tmes. Lemma 3. [KV04] Clam 1) S; ˆv t), t T ) S T ; W ), t T ) ε 8 A Now we are ready to show that Lemma 1, b), and 3. Ã n the algorthm does satsfy ). The proof uses Theorem 1, 1. rom Lemma 1, wth some probablty, we can assume for some θ 40 cp. A T A S T S θ A and SS T W W T θ S. rom Theorem 1, there exst vectors x 1),..., x ) such that A; x t), t []) A A A ε 8 A A ε 8 A. 3. rom Lemma b), by pcng θ ε 8, S; x t), t []) A; x t), t []) θ A A ε 4 A. 4. Snce S and S T have the same sngular values, there exst vectors y t), t [] n the column space of S such that S T ; y t), t []) A ε 4 A. 5. rom Theorem 1, there exst vectors z t), t [] such that W T ; z t), t []) S T ; z t), t []) θ S A ε A, and specfcally, W ), t [] wll have ths property: W T ; W ), t []) A ε A. 8
6. From Lemma 2(b) again, restricting to t ∈ T,

       F(S^T; u^{(t)}(W), t ∈ T) ≤ F(W^T; u^{(t)}(W), t ∈ T) + kθ ||S||_F^2 ≤ ||A − A_k||_F^2 + (3ε/4) ||A||_F^2.

   (Passing from [k] to T costs only the filtered-out terms, each of which has λ_t(W)^2 < γ ||W||_F^2.)

7. Applying Lemma 3 and then Lemma 2(b),

       F(A; v̂^{(t)}, t ∈ T) ≤ F(S; v̂^{(t)}, t ∈ T) + kθ ||A||_F^2 ≤ ||A − A_k||_F^2 + ε ||A||_F^2.

This implies (2).

References

[CKVW] Cheng, D., Kannan, R., Vempala, S., and Wang, G. A divide-and-merge methodology for clustering. To appear in Proceedings of the ACM Symposium on Principles of Database Systems, 2005.

[DK+04] Drineas, P., Frieze, A., Kannan, R., Vempala, S., and Vinay, V. Clustering large graphs via the singular value decomposition. Machine Learning, 56:9-33, 2004. Preliminary version in Proceedings of the 10th ACM-SIAM Symposium on Discrete Algorithms (SODA), Baltimore, 1999.

[KV04] Frieze, A., Kannan, R., and Vempala, S. Fast Monte-Carlo algorithms for finding low-rank approximations. Journal of the ACM, 51(6):1025-1041, 2004. Preliminary version in Proceedings of the 39th Symposium on Foundations of Computer Science (FOCS), Palo Alto, 1998.
More informationprinceton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg
prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there
More informationParametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010
Parametrc fractonal mputaton for mssng data analyss Jae Kwang Km Survey Workng Group Semnar March 29, 2010 1 Outlne Introducton Proposed method Fractonal mputaton Approxmaton Varance estmaton Multple mputaton
More informationSalmon: Lectures on partial differential equations. Consider the general linear, second-order PDE in the form. ,x 2
Salmon: Lectures on partal dfferental equatons 5. Classfcaton of second-order equatons There are general methods for classfyng hgher-order partal dfferental equatons. One s very general (applyng even to
More informationDynamic Programming. Preview. Dynamic Programming. Dynamic Programming. Dynamic Programming (Example: Fibonacci Sequence)
/24/27 Prevew Fbonacc Sequence Longest Common Subsequence Dynamc programmng s a method for solvng complex problems by breakng them down nto smpler sub-problems. It s applcable to problems exhbtng the propertes
More informationLecture 4: Universal Hash Functions/Streaming Cont d
CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected
More informationAdditional Codes using Finite Difference Method. 1 HJB Equation for Consumption-Saving Problem Without Uncertainty
Addtonal Codes usng Fnte Dfference Method Benamn Moll 1 HJB Equaton for Consumpton-Savng Problem Wthout Uncertanty Before consderng the case wth stochastc ncome n http://www.prnceton.edu/~moll/ HACTproect/HACT_Numercal_Appendx.pdf,
More informationGenericity of Critical Types
Genercty of Crtcal Types Y-Chun Chen Alfredo D Tllo Eduardo Fangold Syang Xong September 2008 Abstract Ely and Pesk 2008 offers an nsghtful characterzaton of crtcal types: a type s crtcal f and only f
More informationANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)
Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of
More informationThe Order Relation and Trace Inequalities for. Hermitian Operators
Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence
More informationNorms, Condition Numbers, Eigenvalues and Eigenvectors
Norms, Condton Numbers, Egenvalues and Egenvectors 1 Norms A norm s a measure of the sze of a matrx or a vector For vectors the common norms are: N a 2 = ( x 2 1/2 the Eucldean Norm (1a b 1 = =1 N x (1b
More informationMMA and GCMMA two methods for nonlinear optimization
MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons
More informationCOS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture # 15 Scribe: Jieming Mao April 1, 2013
COS 511: heoretcal Machne Learnng Lecturer: Rob Schapre Lecture # 15 Scrbe: Jemng Mao Aprl 1, 013 1 Bref revew 1.1 Learnng wth expert advce Last tme, we started to talk about learnng wth expert advce.
More informationU.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 6 Luca Trevisan September 12, 2017
U.C. Berkeley CS94: Beyond Worst-Case Analyss Handout 6 Luca Trevsan September, 07 Scrbed by Theo McKenze Lecture 6 In whch we study the spectrum of random graphs. Overvew When attemptng to fnd n polynomal
More informationC/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1
C/CS/Phy9 Problem Set 3 Solutons Out: Oct, 8 Suppose you have two qubts n some arbtrary entangled state ψ You apply the teleportaton protocol to each of the qubts separately What s the resultng state obtaned
More informationCOS 521: Advanced Algorithms Game Theory and Linear Programming
COS 521: Advanced Algorthms Game Theory and Lnear Programmng Moses Charkar February 27, 2013 In these notes, we ntroduce some basc concepts n game theory and lnear programmng (LP). We show a connecton
More informationPh 219a/CS 219a. Exercises Due: Wednesday 12 November 2008
1 Ph 19a/CS 19a Exercses Due: Wednesday 1 November 008.1 Whch state dd Alce make? Consder a game n whch Alce prepares one of two possble states: ether ρ 1 wth a pror probablty p 1, or ρ wth a pror probablty
More informationSTAT 309: MATHEMATICAL COMPUTATIONS I FALL 2018 LECTURE 16
STAT 39: MATHEMATICAL COMPUTATIONS I FALL 218 LECTURE 16 1 why teratve methods f we have a lnear system Ax = b where A s very, very large but s ether sparse or structured (eg, banded, Toepltz, banded plus
More informationxp(x µ) = 0 p(x = 0 µ) + 1 p(x = 1 µ) = µ
CSE 455/555 Sprng 2013 Homework 7: Parametrc Technques Jason J. Corso Computer Scence and Engneerng SUY at Buffalo jcorso@buffalo.edu Solutons by Yngbo Zhou Ths assgnment does not need to be submtted and
More informationDUE: WEDS FEB 21ST 2018
HOMEWORK # 1: FINITE DIFFERENCES IN ONE DIMENSION DUE: WEDS FEB 21ST 2018 1. Theory Beam bendng s a classcal engneerng analyss. The tradtonal soluton technque makes smplfyng assumptons such as a constant
More informationNotes prepared by Prof Mrs) M.J. Gholba Class M.Sc Part(I) Information Technology
Inverse transformatons Generaton of random observatons from gven dstrbutons Assume that random numbers,,, are readly avalable, where each tself s a random varable whch s unformly dstrbuted over the range(,).
More information