Dimensionality Reduction Notes 1
Jelani Nelson
minilek@seas.harvard.edu

August 10, 2015

1 Preliminaries

Here we collect some notation and basic lemmas used throughout this note. Throughout, for a random variable $X$, $\|X\|_p$ denotes $(\mathbb{E}|X|^p)^{1/p}$. It is known that $\|\cdot\|_p$ is a norm for any $p \ge 1$ (Minkowski's inequality). It is also known that $\|X\|_p \le \|X\|_q$ whenever $p \le q$. Henceforth, whenever we discuss $\|\cdot\|_p$, we will assume $p \ge 1$.

Lemma 1 (Khintchine inequality). For any $p \ge 1$, $x \in \mathbb{R}^n$, and $(\sigma_i)$ independent Rademachers,
$$\Big\|\sum_i \sigma_i x_i\Big\|_p \lesssim \sqrt{p}\,\|x\|_2 .$$

Proof. Without loss of generality we can assume $p$ is an even integer (otherwise round $p$ up to the next even integer $p'$ and use $\|\cdot\|_p \le \|\cdot\|_{p'}$). Consider $(g_i)$ independent gaussians of mean zero and variance $1$. Expand $\mathbb{E}(\sum_i \sigma_i x_i)^p$ into a sum of monomials. Any monomial with odd exponents vanishes, as in the gaussian case. Meanwhile the other monomials are nonnegative with all Rademacher moments being $1$, while in the gaussian case the corresponding moments are at least $1$. Thus the Rademacher case is term-by-term dominated by the gaussian case, and $\|\sum_i \sigma_i x_i\|_p \le \|\sum_i g_i x_i\|_p$. But $\sum_i g_i x_i$ is a gaussian with mean zero and variance $\|x\|_2^2$, and hence its $p$-norm is $\|x\|_2 \cdot (p!/(2^{p/2}(p/2)!))^{1/p} \lesssim \sqrt{p}\,\|x\|_2$. $\square$

We often use Jensen's inequality below, especially for $F(x) = |x|^p$ ($p \ge 1$).

Lemma 2 (Jensen's inequality). For $F$ convex, $F(\mathbb{E} X) \le \mathbb{E} F(X)$.
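Khintchine's inequality is easy to sanity-check numerically. The following sketch is not from the note; the vector, sample sizes, and seed are arbitrary choices of mine. It estimates $\|\sum_i \sigma_i x_i\|_p$ by Monte Carlo and compares it against $\sqrt{p}\,\|x\|_2$:

```python
import numpy as np

rng = np.random.default_rng(0)

def rademacher_pnorm(x, p, trials=20000):
    """Monte Carlo estimate of || sum_i sigma_i x_i ||_p for Rademacher sigma_i."""
    sigma = rng.choice([-1.0, 1.0], size=(trials, len(x)))
    s = sigma @ x
    return (np.abs(s) ** p).mean() ** (1.0 / p)

x = np.array([0.5, -1.0, 2.0, 0.25])
for p in (2, 4, 6):
    lhs = rademacher_pnorm(x, p)
    rhs = np.sqrt(p) * np.linalg.norm(x)
    print(p, round(lhs, 3), "<=", round(rhs, 3))
```

For $p = 2$ the two sides agree up to the $\sqrt{p}$ factor exactly, since $\mathbb{E}(\sum_i \sigma_i x_i)^2 = \|x\|_2^2$; for larger even $p$ the gaussian-domination argument predicts the estimate stays below $\sqrt{p}\,\|x\|_2$, which is what one observes.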
Before proving a couple of concentration inequalities, we prove a lemma which lets us freely obtain tail bounds from moment bounds and vice versa (often we prove a moment bound and later invoke a tail bound, or vice versa, without even mentioning any justification).

Lemma 3. Let $Z$ be a scalar random variable. Consider the following statements:

(1a) There exists $\sigma > 0$ s.t. for all $p \ge 1$, $\|Z\|_p \le C_1 \sigma \sqrt{p}$.
(1b) There exists $\sigma > 0$ s.t. for all $\lambda > 0$, $\mathbb{P}(|Z| > \lambda) \le C_2 e^{-C_2' \lambda^2/\sigma^2}$.
(2a) There exists $K > 0$ s.t. for all $p \ge 1$, $\|Z\|_p \le C_3 K p$.
(2b) There exists $K > 0$ s.t. for all $\lambda > 0$, $\mathbb{P}(|Z| > \lambda) \le C_4 e^{-C_4' \lambda/K}$.
(3a) There exist $\sigma, K > 0$ s.t. for all $p \ge 1$, $\|Z\|_p \le C_5(\sigma\sqrt{p} + K p)$.
(3b) There exist $\sigma, K > 0$ s.t. for all $\lambda > 0$, $\mathbb{P}(|Z| > \lambda) \le C_6(e^{-C_6' \lambda^2/\sigma^2} + e^{-C_6' \lambda/K})$.

Then 1a is equivalent to 1b, 2a is equivalent to 2b, and 3a is equivalent to 3b, where the constants $C_i, C_i'$ in each case change by at most some absolute constant factor.

Proof. We will show only that 1a is equivalent to 1b; the other cases are argued identically.

To show that 1a implies 1b, by Markov's inequality
$$\mathbb{P}(|Z| > \lambda) \le \lambda^{-p}\,\mathbb{E}|Z|^p \le \left(\frac{C_1^2 \sigma^2 p}{\lambda^2}\right)^{p/2} .$$
Statement 1b follows by choosing $p = \max\{1, \lambda^2/(2C_1^2\sigma^2)\}$.

To show that 1b implies 1a, by integration by parts we have
$$\mathbb{E}|Z|^p = \int_0^\infty p\lambda^{p-1}\,\mathbb{P}(|Z| > \lambda)\,d\lambda \le C_2 \int_0^\infty p\lambda^{p-1} e^{-C_2'\lambda^2/\sigma^2}\,d\lambda .$$
The integral on the right-hand side is, up to normalization, the $p$th moment of a gaussian random variable with mean zero and variance $\tilde\sigma^2 = \sigma^2/(2C_2')$. Statement 1a then follows since such a gaussian has $p$-norm $\Theta(\tilde\sigma\sqrt{p})$. $\square$
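The 1a-implies-1b direction can be made concrete. This small sketch (my own illustration; the constant choice $C_1 = 1$ and the values of $\lambda$ are arbitrary) evaluates the Markov bound at the near-optimal $p$ from the proof and shows it indeed has the subgaussian shape of statement 1b:

```python
import math

def markov_tail_bound(lam, sigma, C1=1.0):
    """Tail bound implied by the moment bound ||Z||_p <= C1*sigma*sqrt(p):
    Markov gives P(|Z| > lam) <= (C1^2 sigma^2 p / lam^2)^(p/2); we evaluate
    it at the near-optimal choice p = max(1, lam^2 / (2 C1^2 sigma^2))."""
    p = max(1.0, lam ** 2 / (2 * C1 ** 2 * sigma ** 2))
    return (C1 ** 2 * sigma ** 2 * p / lam ** 2) ** (p / 2)

# For lam^2 >= 2 sigma^2 (C1 = 1) the bound equals exp(-(ln 2) lam^2 / (4 sigma^2)),
# i.e. a subgaussian tail as in statement 1b:
for lam in (2.0, 4.0, 8.0):
    print(lam, markov_tail_bound(lam, 1.0), math.exp(-math.log(2) * lam ** 2 / 4))
```

With this choice of $p$ the base of the exponent becomes $1/2$, which is exactly where the $e^{-c\lambda^2/\sigma^2}$ decay comes from.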
Now, the following is a bread-and-butter trick for bounding $p$th moments of sums of independent random variables. A more general version of this lemma can be found as Lemma 6.3 in [LT91].

Lemma 4 (Symmetrization / Desymmetrization). Let $Z_1, \ldots, Z_n$ be independent random variables. Let $r_1, \ldots, r_n$ be independent Rademachers. Then
$$\Big\|\sum_i Z_i - \mathbb{E}\sum_i Z_i\Big\|_p \le 2\,\Big\|\sum_i r_i Z_i\Big\|_p \quad \text{(symmetrization inequality)}$$
and
$$\frac12\,\Big\|\sum_i r_i(Z_i - \mathbb{E} Z_i)\Big\|_p \le \Big\|\sum_i Z_i\Big\|_p \quad \text{(desymmetrization inequality)} .$$

Proof. For the first inequality, let $Y_1, \ldots, Y_n$ be independent of the $Z_i$ but identically distributed to them. Then
$$\Big\|\sum_i Z_i - \mathbb{E}\sum_i Z_i\Big\|_p = \Big\|\,\mathbb{E}_Y\sum_i (Z_i - Y_i)\Big\|_p \le \Big\|\sum_i (Z_i - Y_i)\Big\|_p \quad \text{(Jensen)}$$
$$= \Big\|\sum_i r_i(Z_i - Y_i)\Big\|_p \quad (1)$$
$$\le 2\,\Big\|\sum_i r_i Z_i\Big\|_p \quad \text{(triangle inequality)}$$
where (1) follows since the $Z_i - Y_i$ are independent across $i$ and symmetric.

For the second inequality, let the $Y_i$ be as before. Then
$$\Big\|\sum_i r_i(Z_i - \mathbb{E} Z_i)\Big\|_p = \Big\|\,\mathbb{E}_Y\sum_i r_i(Z_i - Y_i)\Big\|_p \le \Big\|\sum_i r_i(Z_i - Y_i)\Big\|_p \quad \text{(Jensen)}$$
$$= \Big\|\sum_i (Z_i - Y_i)\Big\|_p \le 2\,\Big\|\sum_i Z_i\Big\|_p \quad \text{(triangle inequality)} . \qquad \square$$
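The symmetrization inequality can be checked empirically. The sketch below is my own illustration (exponential $Z_i$, the moment $p = 4$, and the seed are arbitrary choices): it estimates both sides by Monte Carlo and confirms the factor-2 bound holds with room to spare:

```python
import numpy as np

rng = np.random.default_rng(1)

def pnorm(samples, p):
    """Monte Carlo estimate of ||W||_p from samples of W."""
    return (np.abs(samples) ** p).mean() ** (1.0 / p)

n, trials, p = 10, 50000, 4
Z = rng.exponential(scale=1.0, size=(trials, n))   # independent Z_i, each with E Z_i = 1
r = rng.choice([-1.0, 1.0], size=(trials, n))      # independent Rademachers

lhs = pnorm((Z - 1.0).sum(axis=1), p)              # || sum_i Z_i - E sum_i Z_i ||_p
rhs = 2.0 * pnorm((r * Z).sum(axis=1), p)          # 2 || sum_i r_i Z_i ||_p
print(lhs, "<=", rhs)
```

The gap is expected: randomly signing the uncentered $Z_i$ inflates the variance relative to merely centering them, which is exactly why the inequality goes in this direction.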
Lemma 5 (Decoupling [dlPG99]). Let $x_1, \ldots, x_n$ be independent and mean zero, and $\tilde x_1, \ldots, \tilde x_n$ identically distributed as the $x_i$ and independent of them. Then for any $(a_{i,j})$ and for all $p \ge 1$,
$$\Big\|\sum_{i \ne j} a_{i,j}\, x_i x_j\Big\|_p \le 4\,\Big\|\sum_{i,j} a_{i,j}\, x_i \tilde x_j\Big\|_p .$$

Proof. Let $\eta_1, \ldots, \eta_n$ be independent Bernoulli random variables each of expectation $1/2$. Since $\mathbb{E}\,\eta_i(1-\eta_j) = 1/4$ for $i \ne j$,
$$\Big\|\sum_{i \ne j} a_{i,j}\, x_i x_j\Big\|_p = 4\,\Big\|\,\mathbb{E}_\eta \sum_{i \ne j} a_{i,j}\, x_i x_j\, \eta_i (1 - \eta_j)\Big\|_p \le 4\,\Big\|\sum_{i \ne j} a_{i,j}\, x_i x_j\, \eta_i (1 - \eta_j)\Big\|_p \quad \text{(Jensen)} \quad (2)$$
Hence there must be some fixed vector $\eta \in \{0,1\}^n$ which achieves
$$\Big\|\sum_{i \in S}\sum_{j \notin S} a_{i,j}\, x_i x_j\Big\|_p \ge \frac14\,\Big\|\sum_{i \ne j} a_{i,j}\, x_i x_j\Big\|_p ,$$
where $S = \{i : \eta_i = 1\}$ (note $\eta_i(1-\eta_j)$ is the indicator that $i \in S$ and $j \notin S$). Let $x_S$ denote the $|S|$-dimensional vector corresponding to the $x_i$ for $i \in S$. Then
$$\Big\|\sum_{i \in S}\sum_{j \notin S} a_{i,j}\, x_i x_j\Big\|_p = \Big\|\sum_{i \in S}\sum_{j \notin S} a_{i,j}\, x_i \tilde x_j\Big\|_p = \Big\|\,\mathbb{E}\Big[\sum_{i,j} a_{i,j}\, x_i \tilde x_j \;\Big|\; x_S,\ (\tilde x_j)_{j \notin S}\Big]\Big\|_p \quad (\text{using } \mathbb{E}\, x_i = \mathbb{E}\, \tilde x_j = 0)$$
$$\le \Big\|\sum_{i,j} a_{i,j}\, x_i \tilde x_j\Big\|_p \quad \text{(Jensen)} . \qquad \square$$

The following proof of the Hanson-Wright inequality was shared with me by Sjoerd Dirksen (personal communication).

Theorem 1 (Hanson-Wright inequality [HW71]). For $\sigma_1, \ldots, \sigma_n$ independent Rademachers and $A \in \mathbb{R}^{n \times n}$ real and symmetric, for all $p \ge 1$,
$$\big\|\sigma^T A \sigma - \mathbb{E}\,\sigma^T A \sigma\big\|_p \lesssim \sqrt{p}\,\|A\|_F + p\,\|A\| .$$
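A quick numerical check of the decoupling inequality (my own illustration, not from the note; the matrix size, Rademacher choice for the $x_i$, and seed are arbitrary). Zeroing the diagonal of $A$ makes both sides range over $i \ne j$ only, so this is a valid instance of the lemma:

```python
import numpy as np

rng = np.random.default_rng(2)

def pnorm(samples, p):
    """Monte Carlo estimate of ||W||_p from samples of W."""
    return (np.abs(samples) ** p).mean() ** (1.0 / p)

n, trials, p = 6, 40000, 4
A = rng.standard_normal((n, n))
np.fill_diagonal(A, 0.0)                         # a_{i,i} = 0: both sides sum over i != j

x  = rng.choice([-1.0, 1.0], size=(trials, n))   # mean-zero x_i (Rademacher)
xt = rng.choice([-1.0, 1.0], size=(trials, n))   # independent copies x~_i

lhs = pnorm(np.einsum('ij,ti,tj->t', A, x, x), p)         # || sum_{i!=j} a_ij x_i x_j ||_p
rhs = 4.0 * pnorm(np.einsum('ij,ti,tj->t', A, x, xt), p)  # 4 || sum_{i,j} a_ij x_i x~_j ||_p
print(lhs, "<=", rhs)
```

In practice the coupled and decoupled chaoses have comparable moments, so the factor 4 leaves a comfortable margin.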
Proof. Without loss of generality we assume in this proof that $p \ge 2$ (so that $p/2 \ge 1$). Let $\sigma', \sigma''$ be independent copies of $\sigma$. Since $\sigma_i^2 = 1$, the diagonal terms of $\sigma^T A \sigma$ are constant, so only the off-diagonal terms contribute to $\sigma^T A \sigma - \mathbb{E}\,\sigma^T A \sigma$ and we may decouple. Then
$$\big\|\sigma^T A \sigma - \mathbb{E}\,\sigma^T A \sigma\big\|_p \lesssim \big\|\sigma^T A \sigma'\big\|_p \quad \text{(Lemma 5)} \quad (3)$$
$$\lesssim \sqrt{p}\,\big\|\,\|A\sigma'\|_2\big\|_p \quad \text{(Khintchine, conditionally on } \sigma') \quad (4)$$
$$= \sqrt{p}\,\big\|\,\|A\sigma'\|_2^2\big\|_{p/2}^{1/2} \quad (5)$$
$$\le \sqrt{p}\,\big\|\,\|A\sigma'\|_2^2\big\|_p^{1/2}$$
$$\le \sqrt{p}\,\big(\|A\|_F^2 + \big\|\,\|A\sigma'\|_2^2 - \mathbb{E}\|A\sigma'\|_2^2\big\|_p\big)^{1/2} \quad \text{(triangle inequality, using } \mathbb{E}\|A\sigma'\|_2^2 = \|A\|_F^2)$$
$$\le \sqrt{p}\,\|A\|_F + \sqrt{p}\,\big\|\,\|A\sigma'\|_2^2 - \mathbb{E}\|A\sigma'\|_2^2\big\|_p^{1/2}$$
$$\lesssim \sqrt{p}\,\|A\|_F + \sqrt{p}\,\big\|\sigma'^T A^T A\, \sigma''\big\|_p^{1/2} \quad \text{(Lemma 5)}$$
$$\lesssim \sqrt{p}\,\|A\|_F + p^{3/4}\,\big\|\,\|A^T A \sigma''\|_2\big\|_p^{1/2} \quad \text{(Khintchine)}$$
$$\le \sqrt{p}\,\|A\|_F + p^{3/4}\,\|A\|^{1/2}\,\big\|\,\|A\sigma''\|_2\big\|_p^{1/2} \quad (6)$$
where the last step used $\|A^T A x\|_2 \le \|A\|\cdot\|Ax\|_2$. Writing $E = \big\|\,\|A\sigma'\|_2\big\|_p^{1/2}$ (so that $\big\|\,\|A\sigma''\|_2\big\|_p^{1/2} = E$ as well) and comparing (4) and (6), we see that for some constant $C > 0$,
$$E^2 - C p^{1/4}\,\|A\|^{1/2}\, E - C\,\|A\|_F \le 0 .$$
Thus $E$ must be smaller than the larger root of the above quadratic equation, implying our desired upper bound on $E^2$: namely $E^2 \lesssim \|A\|_F + \sqrt{p}\,\|A\|$, which combined with (4) gives the theorem. $\square$

Remark 1. The square root trick in the proof of the Hanson-Wright inequality above is quite handy and can be used to prove several moment inequalities (for example, you will see how to prove the Bernstein inequality with it in tomorrow's lecture). As far as I am aware, the trick was first used in a work of Rudelson [Rud99].

Remark 2. We could have upper bounded Eq. (5) by $\sqrt{p}\,\big(\|A\|_F + \big\|\,\|A\sigma'\|_2^2 - \mathbb{E}\|A\sigma'\|_2^2\big\|_{p/2}^{1/2}\big)$ by the triangle inequality. Notice that we have then bounded the $p$th central moment of a symmetric quadratic form (3) by the $(p/2)$th central moment also of a symmetric quadratic form. Writing $p = 2^k$, this observation leads to a proof by induction on $k$, which was the approach used in [DKN10].
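The Hanson-Wright moment bound can also be eyeballed numerically. The sketch below is my own illustration (matrix size, $p$, and seed are arbitrary, and the hidden absolute constant of the theorem is taken to be 1, which happens to suffice comfortably for this instance):

```python
import numpy as np

rng = np.random.default_rng(3)

n, trials, p = 8, 60000, 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2.0                                  # real symmetric

sigma = rng.choice([-1.0, 1.0], size=(trials, n))
quad = np.einsum('ij,ti,tj->t', A, sigma, sigma)     # sigma^T A sigma, one value per trial
central = quad - np.trace(A)                         # E sigma^T A sigma = tr(A) since sigma_i^2 = 1

lhs = (np.abs(central) ** p).mean() ** (1.0 / p)     # || sigma^T A sigma - E ... ||_p
rhs = np.sqrt(p) * np.linalg.norm(A, 'fro') + p * np.linalg.norm(A, 2)
print(lhs, "<=", rhs)
```

For moderate $p$ the Frobenius term dominates, matching the intuition that the quadratic form is approximately gaussian with standard deviation of order $\|A\|_F$; the operator-norm term only matters for large $p$ (equivalently, far tails).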
2 Johnson-Lindenstrauss (JL) lemma

First we prove the Distributional JL lemma (DJL).

Lemma 6 (DJL lemma). For any integer $n > 1$ and $\varepsilon, \delta \in (0, 1/2)$, there exists a distribution $D_{\varepsilon,\delta}$ over $\mathbb{R}^{m \times n}$ for $m \lesssim \varepsilon^{-2}\log(1/\delta)$ such that for any $x \in \mathbb{R}^n$ of unit Euclidean norm,
$$\mathbb{P}_{\Pi \sim D_{\varepsilon,\delta}}\big(\big|\,\|\Pi x\|_2^2 - 1\,\big| > \varepsilon\big) < \delta .$$

Proof. Write $\Pi_{i,j} = \sigma_{i,j}/\sqrt{m}$, where the $\sigma_{i,j}$ are independent Rademachers. Also overload $\sigma$ to mean these Rademachers arranged as a vector of length $mn$, by concatenating rows of $\Pi$. Note then $\|\Pi x\|_2^2 = \|A_x \sigma\|_2^2$, where
$$A_x = \frac{1}{\sqrt m}\begin{pmatrix} x^T & & \\ & \ddots & \\ & & x^T \end{pmatrix} \in \mathbb{R}^{m \times mn} . \quad (7)$$
Thus
$$\mathbb{P}\big(\big|\,\|\Pi x\|_2^2 - 1\,\big| > \varepsilon\big) = \mathbb{P}\big(\big|\,\|A_x\sigma\|_2^2 - \mathbb{E}\,\|A_x\sigma\|_2^2\,\big| > \varepsilon\big),$$
where we see that the right-hand side is readily handled by the Hanson-Wright inequality (Theorem 1) with $A = A_x^T A_x$. Now observe that $A$ is a block-diagonal matrix with each block equaling $(1/m)\, x x^T$, and thus $\|A\| = \|x\|_2^2/m = 1/m$. We also have $\|A\|_F^2 = 1/m$. Thus Hanson-Wright yields
$$\mathbb{P}\big(\big|\,\|\Pi x\|_2^2 - 1\,\big| > \varepsilon\big) \lesssim e^{-C\varepsilon^2 m} + e^{-C\varepsilon m},$$
which for $\varepsilon < 1$ is at most $\delta$ for $m \gtrsim \varepsilon^{-2}\log(1/\delta)$. $\square$

The following is what is usually referred to as the Johnson-Lindenstrauss (JL) lemma [JL84]. In this note we typically refer to it as the Metric JL lemma (or MJL) to distinguish it from DJL above. At some points we simply say JL instead of DJL or MJL, but the version meant will be understood from context.
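The Rademacher construction in the proof is simple to simulate. This sketch (my own; the particular $n$, $m$, $\varepsilon$, trial count, and seed are illustrative only and not tuned to match the lemma's constants) draws fresh $\Pi$'s and measures how often $\|\Pi x\|_2^2$ deviates from 1 by more than $\varepsilon$:

```python
import numpy as np

rng = np.random.default_rng(4)

n, m, eps = 200, 150, 0.4       # m chosen generously for this small demo
x = rng.standard_normal(n)
x /= np.linalg.norm(x)          # unit Euclidean norm, as in the lemma

trials, failures = 500, 0
for _ in range(trials):
    Pi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # a draw from D_{eps,delta}
    failures += abs(np.linalg.norm(Pi @ x) ** 2 - 1.0) > eps
print(failures / trials)        # empirical P( | ||Pi x||^2 - 1 | > eps ); should be tiny
```

Note $\mathrm{Var}(\|\Pi x\|_2^2) \approx 2/m$ here, so with $m = 150$ a deviation of $0.4$ is several standard deviations out, consistent with the $e^{-C\varepsilon^2 m}$ tail.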
Corollary 1 (Metric JL lemma (MJL)). For any $X = \{x_1, \ldots, x_N\} \subset \mathbb{R}^n$ and $0 < \varepsilon < 1/2$, there exists $f : X \to \mathbb{R}^m$ for $m = O(\varepsilon^{-2}\log N)$ such that for all $1 \le i < j \le N$,
$$(1 - \varepsilon)\,\|x_i - x_j\|_2 \le \|f(x_i) - f(x_j)\|_2 \le (1 + \varepsilon)\,\|x_i - x_j\|_2 . \quad (8)$$

Proof. Let $D_{\varepsilon,\delta}$ be as in DJL with $\delta < 1/\binom{N}{2}$. Consider a random $f$, where $f(x) = \Pi x$ for $\Pi$ drawn from $D_{\varepsilon,\delta}$. By DJL, each vector of the form $(x_i - x_j)/\|x_i - x_j\|_2$ has its norm preserved up to $1 \pm \varepsilon$ with probability strictly larger than $1 - 1/\binom{N}{2}$. Thus by a union bound over all $i, j$, all such vectors are preserved simultaneously with positive probability, showing existence of the desired $f$. $\square$

2.1 Example application: k-means clustering

In the k-means clustering problem the input consists of $x_1, \ldots, x_N \in \mathbb{R}^n$ and a positive integer $k$, and the goal is to output some partition $\mathcal{P}$ of $[N]$ into $k$ disjoint subsets $P_1, \ldots, P_k$ as well as some $y = (y_1, \ldots, y_k) \in (\mathbb{R}^n)^k$ (the $y_j$ need not be equal to any of the $x_i$ and can be chosen arbitrarily) so as to minimize the cost function
$$\mathrm{cost}_{\mathcal{P},y}(x_1, \ldots, x_N) = \sum_{j=1}^k \sum_{i \in P_j} \|x_i - y_j\|_2^2 .$$
That is, the $x_i$ are clustered into $k$ clusters according to $\mathcal{P}$, and the cost of a given clustering is the sum of squared Euclidean distances of the points to their cluster centers (the $y_j$'s). Unfortunately, finding the optimal clustering for k-means is NP-hard; however, efficient approximation algorithms do exist which find clusterings that are close to optimal.

It is easy to show, e.g. by taking the gradient of the cost function, that for a fixed partition $\mathcal{P}$ of $[N]$, the optimal choice of cluster centers $y$ for that given $\mathcal{P}$ is the one where, for the $P_j$ of positive size,
$$y_j = \frac{1}{|P_j|}\sum_{i \in P_j} x_i .$$
Thus we can restrict our attention to just optimizing over $\mathcal{P}$. For a set of input points $X$, we let $\mathrm{cost}_{\mathcal{P}}(X)$ denote $\inf_y \mathrm{cost}_{\mathcal{P},y}(X)$.

Lemma 7. Let the input points to k-means be $X = \{x_1, \ldots, x_N\}$. Then for any $0 < \varepsilon < 1/2$, if $f : X \to \mathbb{R}^m$ is such that for all $i, j$,
$$(1 - \varepsilon)\,\|x_i - x_j\|_2^2 \le \|f(x_i) - f(x_j)\|_2^2 \le (1 + \varepsilon)\,\|x_i - x_j\|_2^2,$$
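The MJL construction via a union bound is easy to run end to end. The following sketch is my own illustration (the leading constant 6 in the choice of $m$, the point set, and the seed are guesses for demonstration, not the lemma's constants): it sketches a random point set and reports the worst relative distortion over all pairs:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)

N, n, eps = 30, 500, 0.4
X = rng.standard_normal((N, n))                     # the point set x_1, ..., x_N

m = int(np.ceil(6 * np.log(N) / eps ** 2))          # m = O(eps^-2 log N); 6 is a guessed constant
Pi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
Y = X @ Pi.T                                        # f(x_i) = Pi x_i

worst = 0.0
for i, j in combinations(range(N), 2):
    ratio = np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
    worst = max(worst, abs(ratio - 1.0))
print(m, worst)                                     # max relative distortion over all pairs
```

Here 500-dimensional points get mapped into roughly 130 dimensions while every one of the $\binom{30}{2}$ pairwise distances is preserved to within the target $\varepsilon$, which is the union-bound argument playing out numerically.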
then for $\hat{\mathcal{P}}$ a $\gamma$-approximate optimal clustering for $f(X)$ and $\mathcal{P}^*$ an optimal clustering for $X$, it holds that
$$\mathrm{cost}_{\hat{\mathcal{P}}}(X) \le \gamma\left(\frac{1+\varepsilon}{1-\varepsilon}\right)\mathrm{cost}_{\mathcal{P}^*}(X) .$$

Proof. Fix a partition $\mathcal{P}$ of $[N]$ and write $\mathcal{P} = (P_1, \ldots, P_k)$. Then, using the optimal (centroid) centers,
$$\mathrm{cost}_{\mathcal{P}}(X) = \sum_{j \in [k]} \sum_{i \in P_j} \Big\|x_i - \frac{1}{|P_j|}\sum_{i' \in P_j} x_{i'}\Big\|_2^2 = \sum_{j \in [k]} \Big(\sum_{i \in P_j} \|x_i\|_2^2 - \frac{1}{|P_j|}\Big\|\sum_{i \in P_j} x_i\Big\|_2^2\Big)$$
$$= \sum_{j \in [k]} \frac{1}{2|P_j|} \sum_{i, i' \in P_j} \big(\|x_i\|_2^2 + \|x_{i'}\|_2^2 - 2\langle x_i, x_{i'}\rangle\big) = \sum_{j \in [k]} \frac{1}{2|P_j|} \sum_{i, i' \in P_j} \|x_i - x_{i'}\|_2^2 .$$
Thus the cost of any fixed partition depends only on the pairwise squared distances, so if $f$ satisfies the condition of the lemma, then
$$(1 - \varepsilon)\,\mathrm{cost}_{\mathcal{P}}(X) \le \mathrm{cost}_{\mathcal{P}}(f(X)) \le (1 + \varepsilon)\,\mathrm{cost}_{\mathcal{P}}(X)$$
for all partitions $\mathcal{P}$ simultaneously. Thus we have
$$(1 - \varepsilon)\,\mathrm{cost}_{\hat{\mathcal{P}}}(X) \le \mathrm{cost}_{\hat{\mathcal{P}}}(f(X)) \le \gamma\,\mathrm{cost}_{\mathcal{P}^*}(f(X)) \le \gamma(1 + \varepsilon)\,\mathrm{cost}_{\mathcal{P}^*}(X) .$$
The lemma follows by comparing the right-hand side with the left. $\square$

References

[DKN10] Ilias Diakonikolas, Daniel M. Kane, and Jelani Nelson. Bounded independence fools degree-2 threshold functions. In 51st Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 11-20, 2010.

[dlPG99] Victor de la Peña and Evarist Giné. Decoupling: From Dependence to Independence. Probability and its Applications. Springer-Verlag, New York, 1999.
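The key computation in the proof of Lemma 7 is the identity rewriting the centroid cost as an average of pairwise squared distances. This sketch (my own; the data and partition are arbitrary) verifies the identity numerically by computing the cost both ways:

```python
import numpy as np

rng = np.random.default_rng(6)

def cost_centroid(X, parts):
    """k-means cost of a fixed partition with optimal (centroid) centers."""
    return sum(((X[P] - X[P].mean(axis=0)) ** 2).sum() for P in parts)

def cost_pairwise(X, parts):
    """The same cost via the pairwise identity from the proof of Lemma 7."""
    total = 0.0
    for P in parts:
        D = X[P][:, None, :] - X[P][None, :, :]      # all pairwise differences within P_j
        total += (D ** 2).sum() / (2 * len(P))       # (1/(2|P_j|)) sum_{i,i'} ||x_i - x_i'||^2
    return total

X = rng.standard_normal((12, 3))
parts = [np.array([0, 1, 2, 3]), np.array([4, 5, 6, 7, 8]), np.array([9, 10, 11])]
print(abs(cost_centroid(X, parts) - cost_pairwise(X, parts)))
```

Since the pairwise form involves only distances, any map $f$ preserving squared distances up to $1 \pm \varepsilon$ preserves the cost of every partition up to the same factor, which is exactly how the lemma is used.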
[HW71] David Lee Hanson and Farroll Tim Wright. A bound on tail probabilities for quadratic forms in independent random variables. Ann. Math. Statist., 42:1079-1083, 1971.

[JL84] William B. Johnson and Joram Lindenstrauss. Extensions of Lipschitz mappings into a Hilbert space. Contemporary Mathematics, 26:189-206, 1984.

[LT91] Michel Ledoux and Michel Talagrand. Probability in Banach Spaces. Springer-Verlag, Berlin, 1991.

[Rud99] Mark Rudelson. Random vectors in the isotropic position. J. Functional Analysis, 164(1):60-72, 1999.
More informationPROCEEDINGS OF THE AMERICAN MATHEMATICAL SOCIETY Volume 125, Number 7, July 1997, Pages 2119{2125 S (97) THE STRONG OPEN SET CONDITION
PROCDINGS OF TH AMRICAN MATHMATICAL SOCITY Volume 125, Number 7, July 1997, Pages 2119{2125 S 0002-9939(97)03816-1 TH STRONG OPN ST CONDITION IN TH RANDOM CAS NORBRT PATZSCHK (Communcated by Palle. T.
More informationDr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur
Analyss of Varance and Desgn of Exerments-I MODULE III LECTURE - 2 EXPERIMENTAL DESIGN MODELS Dr. Shalabh Deartment of Mathematcs and Statstcs Indan Insttute of Technology Kanur 2 We consder the models
More informationDISCRIMINANTS AND RAMIFIED PRIMES. 1. Introduction A prime number p is said to be ramified in a number field K if the prime ideal factorization
DISCRIMINANTS AND RAMIFIED PRIMES KEITH CONRAD 1. Introducton A prme number p s sad to be ramfed n a number feld K f the prme deal factorzaton (1.1) (p) = po K = p e 1 1 peg g has some e greater than 1.
More informationCS286r Assign One. Answer Key
CS286r Assgn One Answer Key 1 Game theory 1.1 1.1.1 Let off-equlbrum strateges also be that people contnue to play n Nash equlbrum. Devatng from any Nash equlbrum s a weakly domnated strategy. That s,
More informationMaximizing the number of nonnegative subsets
Maxmzng the number of nonnegatve subsets Noga Alon Hao Huang December 1, 213 Abstract Gven a set of n real numbers, f the sum of elements of every subset of sze larger than k s negatve, what s the maxmum
More informationn α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0
MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector
More informationDifference Equations
Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1
More informationNP-Completeness : Proofs
NP-Completeness : Proofs Proof Methods A method to show a decson problem Π NP-complete s as follows. (1) Show Π NP. (2) Choose an NP-complete problem Π. (3) Show Π Π. A method to show an optmzaton problem
More informationNotes on Frequency Estimation in Data Streams
Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to
More informationThe Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD
he Gaussan classfer Nuno Vasconcelos ECE Department, UCSD Bayesan decson theory recall that we have state of the world X observatons g decson functon L[g,y] loss of predctng y wth g Bayes decson rule s
More informationAPPROXIMATE PRICES OF BASKET AND ASIAN OPTIONS DUPONT OLIVIER. Premia 14
APPROXIMAE PRICES OF BASKE AND ASIAN OPIONS DUPON OLIVIER Prema 14 Contents Introducton 1 1. Framewor 1 1.1. Baset optons 1.. Asan optons. Computng the prce 3. Lower bound 3.1. Closed formula for the prce
More informationBOUNDEDNESS OF THE RIESZ TRANSFORM WITH MATRIX A 2 WEIGHTS
BOUNDEDNESS OF THE IESZ TANSFOM WITH MATIX A WEIGHTS Introducton Let L = L ( n, be the functon space wth norm (ˆ f L = f(x C dx d < For a d d matrx valued functon W : wth W (x postve sem-defnte for all
More informationANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)
Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of
More informationLecture 20: Lift and Project, SDP Duality. Today we will study the Lift and Project method. Then we will prove the SDP duality theorem.
prnceton u. sp 02 cos 598B: algorthms and complexty Lecture 20: Lft and Project, SDP Dualty Lecturer: Sanjeev Arora Scrbe:Yury Makarychev Today we wll study the Lft and Project method. Then we wll prove
More information