Supplement to: Subsampling Methods for Persistent Homology
A. Technical results

In this section, we present some technical results that will be used to prove the main theorems. First, we expand the notation introduced in the body of the paper (Section 3). For any positive integer $m$, let $\Phi \colon \mathbb{X}^m \to \mathcal{D}_T$ be the diagram map and $\Psi \colon \mathcal{D}_T \to \mathcal{L}_T$ the landscape map, i.e. $\Phi(X) = D_X$ for any $X = \{x_1, \dots, x_m\} \in \mathbb{X}^m$, and $\Psi(D_X) = \lambda_X$ for any $D_X \in \mathcal{D}_T$. Given $\mu \in \mathcal{P}(\mathbb{X}^m)$, we denote by $\mu_\Phi$ the push-forward measure of $\mu$ by $\Phi$, that is $\mu_\Phi = (\Phi)_* \mu$. Similarly, we denote by $\mu_\Psi$ the push-forward (induced) measure of $\mu_\Phi$ by $\Psi$, that is $\mu_\Psi = (\Psi)_* \mu_\Phi$.

For a fixed integer $m > 0$, consider the metric space $(\mathbb{X}, \rho)$ and the space $\mathbb{X}^m$ endowed with a metric $\rho_m$. We impose two conditions on $\rho_m$:

(C1) Given a real number $p \ge 1$, for any $X = \{x_1, \dots, x_m\} \in \mathbb{X}^m$ and $Y = \{y_1, \dots, y_m\} \in \mathbb{X}^m$,
$$\rho_m(X, Y) \le \Big( \sum_{i=1}^m \rho(x_i, y_i)^p \Big)^{1/p}, \qquad (9)$$

(C2) For any $X = \{x_1, \dots, x_m\} \in \mathbb{X}^m$ and $Y = \{y_1, \dots, y_m\} \in \mathbb{X}^m$,
$$H(X, Y) \le \rho_m(X, Y). \qquad (10)$$

Two examples of distances that satisfy conditions (C1) and (C2) are the Hausdorff distance and the $L_p$-distance $\rho_m(X, Y) = \big( \sum_{i=1}^m \rho(x_i, y_i)^p \big)^{1/p}$.

Lemma 5. For any probability measures $\mu, \nu \in \mathcal{P}(\mathbb{X})$ and any metric $\rho_m \colon \mathbb{X}^m \times \mathbb{X}^m \to \mathbb{R}$ that satisfies (C1), we have
$$W_{\rho_m, p}\big( \mu^{\otimes m}, \nu^{\otimes m} \big) \le m^{1/p}\, W_{\rho, p}(\mu, \nu).$$

Remark: the bound of the above lemma is tight: it is an equality when $\mu$ is a Dirac measure and $\nu$ any other measure.

Proof. Let $\pi \in \mathcal{P}(\mathbb{X} \times \mathbb{X})$ be a transport plan between $\mu$ and $\nu$. Up to reordering the components of $\mathbb{X}^{2m}$, $\pi^{\otimes m}$ is a transport plan between $\mu^{\otimes m}$ and $\nu^{\otimes m}$ whose $p$-cost is given by
$$\int_{\mathbb{X}^m \times \mathbb{X}^m} \rho_m(X, Y)^p \, d\pi^{\otimes m}(X, Y) \le \int_{\mathbb{X}^m \times \mathbb{X}^m} \sum_{i=1}^m \rho(x_i, y_i)^p \, d\pi(x_1, y_1) \cdots d\pi(x_m, y_m) = m \int_{\mathbb{X} \times \mathbb{X}} \rho(x, y)^p \, d\pi(x, y).$$
The lemma follows by taking the minimum over all transport plans on both sides of this inequality.

Lemma 6. For any probability measures $\mu, \nu \in \mathcal{P}(\mathbb{X})$ and any metric $\rho_m \colon \mathbb{X}^m \times \mathbb{X}^m \to \mathbb{R}$ that satisfies (C2), we have
$$W_{d_b, p}\big( \mu^{\otimes m}_\Phi, \nu^{\otimes m}_\Phi \big) \le 2\, W_{\rho_m, p}\big( \mu^{\otimes m}, \nu^{\otimes m} \big).$$

Proof. This is a consequence of the stability theorem for persistence diagrams. Given $X, Y \in \mathbb{X}^m$, define $(\Phi \times \Phi)(X, Y) = (D_X, D_Y)$. If $\pi \in \mathcal{P}(\mathbb{X}^m \times \mathbb{X}^m)$ is a transport plan between $\mu^{\otimes m}$ and $\nu^{\otimes m}$, then $(\Phi \times \Phi)_* \pi$ is a transport plan between $\mu^{\otimes m}_\Phi$ and $\nu^{\otimes m}_\Phi$.
Its $p$-cost is given by
$$\int_{\mathcal{D}_T \times \mathcal{D}_T} d_b(D_X, D_Y)^p \, d(\Phi \times \Phi)_* \pi(D_X, D_Y) = \int_{\mathbb{X}^m \times \mathbb{X}^m} d_b\big( \Phi(X), \Phi(Y) \big)^p \, d\pi(X, Y) \le \int_{\mathbb{X}^m \times \mathbb{X}^m} 2^p H(X, Y)^p \, d\pi(X, Y) \quad \text{(stability theorem)} \le \int_{\mathbb{X}^m \times \mathbb{X}^m} 2^p \rho_m(X, Y)^p \, d\pi(X, Y).$$
The lemma follows by taking the minimum over all transport plans on both sides of this inequality.

Lemma 7. Let $\mu$ and $\nu$ be two probability measures on $\mathbb{X}^m$. Let $X \sim \mu$ and $Y \sim \nu$. Then
$$\big\| \mathbb{E}_\mu[\lambda_X] - \mathbb{E}_\nu[\lambda_Y] \big\|_\infty \le W_{d_b, p}(\mu_\Phi, \nu_\Phi).$$

Proof. Let $\pi$ be a transport plan between $\mu_\Phi$ and $\nu_\Phi$. For any $t \in \mathbb{R}$ we have
$$\big| \mathbb{E}_\mu[\lambda_X](t) - \mathbb{E}_\nu[\lambda_Y](t) \big|^p = \big| \mathbb{E}_\pi[\lambda_X(t) - \lambda_Y(t)] \big|^p \le \mathbb{E}_\pi\big[ |\lambda_X(t) - \lambda_Y(t)|^p \big] \quad \text{(Jensen's inequality)} \le \mathbb{E}_\pi\big[ d_b(D_X, D_Y)^p \big] \quad \text{(stability of landscapes)} = \int_{\mathcal{D}_T \times \mathcal{D}_T} d_b(D_X, D_Y)^p \, d\pi(D_X, D_Y) = C_p(\pi)^p.$$
The lemma follows by taking the minimum over all transport plans.
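To make the objects in Lemma 7 concrete, here is a minimal sketch (ours, not the paper's code) of how a persistence landscape is evaluated from a diagram: the $k$-th landscape $\lambda(k, t)$ is the $k$-th largest value among the tent functions $\Lambda_{(b, d)}(t) = \max(0, \min(t - b, d - t))$, one tent per diagram point $(b, d)$.

```python
# Minimal sketch (not the paper's code) of evaluating a persistence
# landscape: lambda_k(t) is the k-th largest of the tent functions
# max(0, min(t - b, d - t)) raised over the diagram points (b, d).

def tent(b, d, t):
    """Triangle function supported on (b, d), peaking at t = (b + d) / 2."""
    return max(0.0, min(t - b, d - t))

def landscape(diagram, k, t):
    """k-th persistence landscape (k = 1 is the outermost) at point t."""
    heights = sorted((tent(b, d, t) for b, d in diagram), reverse=True)
    return heights[k - 1] if k <= len(heights) else 0.0

# One feature born at 0, dying at 2: a tent of height 1, peaking at t = 1.
print(landscape([(0.0, 2.0)], 1, 1.0))               # -> 1.0
# With a second, smaller feature, the 2nd landscape picks the lower tent.
print(landscape([(0.0, 2.0), (0.5, 1.0)], 2, 0.75))  # -> 0.25
```

The sup-norm stability used in the proof, $\|\lambda_X - \lambda_Y\|_\infty \le d_b(D_X, D_Y)$, is plausible from this construction: each tent is 1-Lipschitz in its diagram point.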
The following lemma is similar to Theorem 2 in Chazal et al. (2014c).

Lemma 8. Let $X$ be a sample of size $m$ from a measure $\mu \in \mathcal{P}(\mathbb{X})$ that satisfies the $(a, b, r_0)$-standard assumption. Let $r_m = 2 \big( \frac{\log m}{a m} \big)^{1/b}$. Then
$$\mathbb{E}\big[ H(X, \mathbb{X}_\mu) \big] \le r_0 + 2 \Big( \frac{\log m}{a m} \Big)^{1/b} \mathbb{1}_{(r_0, \infty)}(r_m) + 2\, C_1(a, b) \Big( \frac{\log m}{a m} \Big)^{1/b} (\log m)^{-2},$$
where $C_1(a, b)$ is a constant depending on $a$ and $b$.

Proof. Let $r > r_0$. It can be proven that $q := \mathrm{Cv}(\mathbb{X}_\mu, r/2) \le \frac{2^b}{a r^b} \vee 1$, where $\mathrm{Cv}(\mathbb{X}_\mu, r/2)$ denotes the number of balls of radius $r/2$ that are necessary to cover $\mathbb{X}_\mu$. Let $C = \{x_1, \dots, x_q\}$ be a set of centers such that $B(x_1, r/2), \dots, B(x_q, r/2)$ is a covering of $\mathbb{X}_\mu$. Then,
$$\mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \le \mathbb{P}\big( H(X, C) + H(C, \mathbb{X}_\mu) > r \big) \le \mathbb{P}\big( H(X, C) > r/2 \big) \le \mathbb{P}\big( \exists\, i \in \{1, \dots, q\} \text{ such that } X \cap B(x_i, r/2) = \emptyset \big) \le \sum_{i=1}^{q} \mathbb{P}\big( X \cap B(x_i, r/2) = \emptyset \big) \le q \exp\Big( -m \inf_{i = 1, \dots, q} \mathbb{P}\big( B(x_i, r/2) \big) \Big) \le \frac{2^b}{a r^b} \exp\Big( -\frac{m a r^b}{2^b} \Big).$$
Then,
$$\mathbb{E}\big[ H(X, \mathbb{X}_\mu) \big] = \int_0^{\infty} \mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \, dr \le r_0 + \int_{r > r_0} \mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \, dr. \qquad (11)$$
If $r_m \le r_0$ then the last quantity in (11) is bounded by
$$r_0 + \int_{r > r_m} \mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \, dr,$$
otherwise (11) is bounded by
$$r_0 + r_m + \int_{r > r_m} \mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \, dr.$$
In either case, we follow the strategy in Chazal et al. (2014c) to obtain the following bound:
$$\int_{r > r_m} \mathbb{P}\big( H(X, \mathbb{X}_\mu) > r \big) \, dr \le 2\, C_1(a, b) \Big( \frac{\log m}{a m} \Big)^{1/b} (\log m)^{-2},$$
which implies that
$$\mathbb{E}\big[ H(X, \mathbb{X}_\mu) \big] \le r_0 + r_m\, \mathbb{1}_{(r_0, \infty)}(r_m) + 2\, C_1(a, b) \Big( \frac{\log m}{a m} \Big)^{1/b} (\log m)^{-2}.$$

B. Main Proofs

Proof of Theorem 5. It immediately follows from the three following inequalities of Lemmas 5, 6 and 7:
- upper bound on the Wasserstein distance between the tensor products of measures: $W_{\rho_m, p}(\mu^{\otimes m}, \nu^{\otimes m}) \le m^{1/p} W_{\rho, p}(\mu, \nu)$;
- from measures on $\mathbb{X}^m$ to measures on $\mathcal{D}_T$: $W_{d_b, p}(\mu^{\otimes m}_\Phi, \nu^{\otimes m}_\Phi) \le 2\, W_{\rho_m, p}(\mu^{\otimes m}, \nu^{\otimes m})$;
- from measures on $\mathcal{D}_T$ to the difference of the expected landscapes: $\| \mathbb{E}_{\mu^{\otimes m}}[\lambda_X] - \mathbb{E}_{\nu^{\otimes m}}[\lambda_Y] \|_\infty \le W_{d_b, p}(\mu^{\otimes m}_\Phi, \nu^{\otimes m}_\Phi)$.

Proof of Theorem 7.
$$\big\| \mathbb{E}_\mu(\lambda_X) - \mathbb{E}_\nu(\lambda_Y) \big\|_\infty \le \int_{\varepsilon > 0} \mathbb{P}_{\mu, \nu}\big( \| \lambda_X - \lambda_Y \|_\infty > \varepsilon \big) \, d\varepsilon \le \varepsilon_0 + \int_{\varepsilon > \varepsilon_0} \mathbb{P}_{\mu, \nu}\big( \| \lambda_X - \lambda_Y \|_\infty > \varepsilon \big) \, d\varepsilon. \qquad (12)$$
The event $\{ \| \lambda_X - \lambda_Y \|_\infty > \varepsilon \}$ inside the integral implies that
$$\frac{\varepsilon}{2} < H(X, Y) \le H(X, \mathbb{X}_\mu) + H(\mathbb{X}_\mu, \mathbb{X}_\nu) + H(Y, \mathbb{X}_\nu), \qquad (13)$$
where $X$ and $Y$ are two samples of $m$ points from $\mu$ and $\nu$, respectively. Let $\varepsilon_0 = 2 H(\mathbb{X}_\mu, \mathbb{X}_\nu)$. By (13), it follows that for $\varepsilon > \varepsilon_0$ at least one of the following conditions holds:
$$H(X, \mathbb{X}_\mu) \ge \frac{\varepsilon - \varepsilon_0}{4}, \qquad H(Y, \mathbb{X}_\nu) \ge \frac{\varepsilon - \varepsilon_0}{4}.$$
We assume that the first condition holds (the other case follows similarly). Then the last quantity in equation (12) can be bounded by
$$\varepsilon_0 + \int_{\varepsilon > \varepsilon_0} \mathbb{P}\Big( H(X, \mathbb{X}_\mu) \ge \frac{\varepsilon - \varepsilon_0}{4} \Big) \, d\varepsilon = 2 H(\mathbb{X}_\mu, \mathbb{X}_\nu) + 4 \int_{u > 0} \mathbb{P}\big( H(X, \mathbb{X}_\mu) \ge u \big) \, du = 2 H(\mathbb{X}_\mu, \mathbb{X}_\nu) + 4\, \mathbb{E}\big[ H(X, \mathbb{X}_\mu) \big] \le 2 H(\mathbb{X}_\mu, \mathbb{X}_\nu) + 4 r_0 + 8 \Big( \frac{\log m}{a m} \Big)^{1/b} \mathbb{1}_{(r_0, \infty)}(r_m) + 8\, C_1(a, b) \Big( \frac{\log m}{a m} \Big)^{1/b} (\log m)^{-2},$$
where the last inequality follows from Lemma 8.

Proof of Theorem 9. It follows directly from (8) and Lemma 8.

Proof of Theorem 10.
$$\mathbb{E}\Big[ \big\| \lambda_{\mathbb{X}_\mu} - \lambda^{c}_{n} \big\|_\infty \Big] \le 2\, \mathbb{E}\Big[ H\big( \mathbb{X}_\mu, X^{c}_{n} \big) \Big] = 2 \int_0^{\infty} \mathbb{P}\Big( H\big( \mathbb{X}_\mu, X^{c}_{n} \big) > r \Big) \, dr \le 2 r_0 + 2 \int_{r > r_0} \Big[ \mathbb{P}\big( H(\mathbb{X}_\mu, S_m) > r \big) \Big]^{n} \, dr \le 2 r_0 + 2 \int_{r > r_0} \Big[ \frac{2^b}{a r^b} \exp\Big( -\frac{m a r^b}{2^b} \Big) \Big]^{n} \, dr,$$
where the last inequality follows from the tail bound established in the proof of Lemma 8. If $r_m \le r_0$ then the last term is upper bounded by
$$2 r_0 + 2 \int_{r > r_m} \Big[ \frac{2^b}{a r^b} \exp\Big( -\frac{m a r^b}{2^b} \Big) \Big]^{n} \, dr,$$
otherwise it is bounded by
$$2 r_0 + 2 r_m + 2 \int_{r > r_m} \Big[ \frac{2^b}{a r^b} \exp\Big( -\frac{m a r^b}{2^b} \Big) \Big]^{n} \, dr.$$
In either case, with the change of variables $u = a m (r/2)^b$,
$$\int_{r > r_m} \Big[ \frac{2^b}{a r^b} \exp\Big( -\frac{m a r^b}{2^b} \Big) \Big]^{n} \, dr = \frac{2\, m^{n}}{b\, (a m)^{1/b}} \int_{u > \log m} u^{1/b - n - 1} \exp(-n u) \, du \le \frac{2\, C_2(a, b)}{(a m)^{1/b}\, n}\, (\log m)^{1/b - n - 1},$$
where in the last inequality we applied the same strategy used to prove Theorem 2 in Chazal et al. (2014c).

C. About the (a, b, r_0)-standard assumption

The aim of this section is to explain why the $(a, b, r_0)$-standard assumption is relevant, in particular when $\mu$ is a discrete measure. Our argument is related to weighted empirical processes, which have been studied in detail by Alexander; see Alexander (1985; 1987a;b). A new look at this problem has been proposed more recently in Giné & Koltchinskii (2006); Giné et al. (2003) by using Talagrand concentration inequalities. The following result from Alexander (1985) will be sufficient here. Let $(\mathbb{X}, \rho, \kappa)$ be a measure metric space and let $\kappa_N$ be the empirical counterpart of $\kappa$.

Proposition 19. Let $\mathcal{C}$ be a VC class of measurable sets of index $v$ of $\mathbb{X}$. Then for every $\gamma, \varepsilon > 0$ there exists $K$ such that
$$\mathbb{P}\Bigg[ \sup\Big\{ \frac{ \big| \kappa_N(C) - \kappa(C) \big| }{ \kappa(C) } \, : \, \kappa(C) \ge \frac{K v \log N}{N}, \ C \in \mathcal{C} \Big\} > \varepsilon \Bigg] = O\big( N^{-(1 + \gamma) v} \big). \qquad (14)$$

Assume that $\mu$ is the discrete uniform measure on a point cloud $\mathbb{X}_N = \{x_1, \dots, x_N\}$ which has been sampled from $\kappa$, thus $\mu = \kappa_N$. Assume moreover that $\kappa$ satisfies an $(a_0, b)$-standard assumption ($r_0 = 0$).
Let $r_0$ be a positive function of $N$, to be chosen further on. For any $y \in \mathbb{X}_\mu$:
$$\inf_{r \ge r_0(N)} \frac{ \mu\big( B(y, r) \big) }{ 1 \wedge a_0 r^b } = \inf_{r \ge r_0(N)} \frac{ \kappa_N\big( B(y, r) \big) }{ 1 \wedge a_0 r^b } \ge \inf_{r \ge r_0(N)} \frac{ \kappa_N\big( B(y, r) \big) }{ \kappa\big( B(y, r) \big) } \ge 1 - \sup_{x \in \mathbb{X},\, r \ge r_0(N)} \frac{ \big| \kappa_N\big( B(x, r) \big) - \kappa\big( B(x, r) \big) \big| }{ \kappa\big( B(x, r) \big) }. \qquad (15)$$
Assume that the set of balls in $(\mathbb{X}, \rho)$ has a finite VC-dimension $v$. For instance, in $\mathbb{R}^d$, the VC-dimension of balls is $d + 1$. Under this assumption we apply Alexander's Proposition with (for instance) $\gamma = 1$ and $\varepsilon = 1/2$. Let $K > 0$ be such that (14) is satisfied. Then, by setting
$$r_0(N) := \Big( \frac{K v}{a_0} \, \frac{\log N}{N} \Big)^{1/b},$$
we finally obtain using (14) and (15) that
$$\mathbb{P}\Bigg[ \inf_{y \in \mathbb{X}_\mu,\, r \ge r_0(N)} \frac{ \mu\big( B(y, r) \big) }{ 1 \wedge \frac{a_0}{2} r^b } < 1 \Bigg] = O\big( N^{-2v} \big).$$
In this quite general context, we see that by taking $r_0$ of the order of $\big( \frac{\log N}{N} \big)^{1/b}$, for large values of $N$ the $(a, b, r_0)$-standard assumption is satisfied with high probability (in $N$).

D. Robustness to Outliers

The average landscape method is insensitive to outliers, as can be seen from the stability result of Theorem 5: the probability mass of an outlier gives a minimal contribution to the Wasserstein distance on the right-hand side of the inequality. For example, suppose that $X_N = \{x_1, \dots, x_N\}$ is a random sample from the unit circle $S^1 \subset \mathbb{R}^2$, and let $Y_N = X_N \setminus \{x_1\} \cup \{(0, 0)\}$. See Figure 7. The landscapes $\lambda_{X_N}$ and $\lambda_{Y_N}$ are very different because of the presence of the outlier $(0, 0)$. On the other hand, the average landscapes constructed from multiple subsamples of $m < N$ points from $X_N$ and $Y_N$ are close to each other. Formally, let $\mu$ be the discrete uniform measure that puts mass $1/N$ on each point of $X_N$ and similarly let $\nu$ be the discrete uniform measure on $Y_N$. The 1st Wasserstein distance between the two measures is $1/N$ and, according to Theorem 5, the difference between the average landscapes is
$$\big\| \mathbb{E}_\mu[\lambda_X] - \mathbb{E}_\nu[\lambda_Y] \big\|_\infty \le \frac{2 m}{N}.$$

More formally, we can show that the average landscape $\overline{\lambda}_n$ can be more accurate than the closest-subsample method when there are outliers. In fact, $\overline{\lambda}_n$ can even be more accurate than the landscape corresponding to a large sample of $N$ points. Suppose that the large, given point cloud $X_N = \{x_1, \dots, x_N\}$ has a small fraction of outliers. Specifically, $X_N = G \cup B$, where $G = \{x_1, \dots, x_{N_G}\}$ are the good observations and $B = \{y_1, \dots, y_{N_B}\}$ are the outliers (bad observations). Let $N_G = |G|$ and $N_B = |B|$, so that $N = N_G + N_B$, and let $\gamma = N_B / N$, which we assume is small but positive. Our target is the landscape based on the non-outliers, namely $\lambda_G$. The presence of the outliers means that $\lambda_{X_N} \ne \lambda_G$. Let
$$\tau = \inf_{S} \big\| \lambda_S - \lambda_G \big\|_\infty > \delta, \quad \text{for some } \delta > 0,$$
where the infimum is over all subsets $S$ that contain at least one outlier. Thus, $\tau$ denotes the minimal bias due to the outliers.
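The $W_1 = 1/N$ value in the circle example above can be checked with an explicit coupling. The sketch below (an illustration of ours with the same $N = 500$ circle-plus-origin configuration, not the paper's experiment) evaluates the cost of the plan that keeps the $N - 1$ shared points in place and moves the mass $1/N$ sitting on $x_1$ to $(0, 0)$; this cost upper-bounds $W_1(\mu, \nu)$.

```python
# Illustrative sketch: mu is uniform on N points of the unit circle, nu
# replaces x_1 by the origin. The coupling that fixes the other N - 1
# points and transports the mass 1/N from x_1 to (0, 0) has cost
# dist(x_1, (0, 0)) / N = 1/N, the value used in the text.
import math

N = 500
points = [(math.cos(2 * math.pi * i / N), math.sin(2 * math.pi * i / N))
          for i in range(N)]

def coupling_cost(moved_from, moved_to, mass):
    """Cost of a plan moving `mass` from one point to another, rest fixed."""
    return mass * math.dist(moved_from, moved_to)

bound = coupling_cost(points[0], (0.0, 0.0), 1.0 / N)
print(bound)  # 1/N = 0.002, since every circle point is at distance 1
```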
We consider three estimators:
- $\lambda_{X_N}$: the landscape from the full given sample $X_N$;
- $\overline{\lambda}_n$: the average landscape from $n$ subsamples of size $m$;
- $\lambda^{c}_{n}$: the landscape of the closest subsample, from $n$ subsamples of size $m$.

The last two estimators are defined in Section 3, and are constructed using $n$ independent samples of size $m$ from the discrete uniform measure that puts mass $1/N$ on each point of $X_N$.

Proposition 20. If $\gamma m = o(1/n)$, then, for large enough $n$ and $m$,
$$\mathbb{E}\big\| \overline{\lambda}_n - \lambda_G \big\|_\infty < \mathbb{E}\big\| \lambda_{X_N} - \lambda_G \big\|_\infty. \qquad (16)$$
In addition, if $n \to \infty$ then
$$\mathbb{P}\Big[ \mathbb{E}\big\| \overline{\lambda}_n - \lambda_G \big\|_\infty < \big\| \lambda^{c}_{n} - \lambda_G \big\|_\infty \Big] \to 1. \qquad (17)$$

Proof. We say that a subsample is clean if it contains no outliers and that it is infected if it contains at least one outlier. Let $S_1, \dots, S_n$ be the subsamples of $X_N$ of size $m$. Let $I = \{i : S_i \text{ is infected}\}$ and $C = \{i : S_i \text{ is clean}\}$. Write
$$\overline{\lambda}_n = \frac{n_0}{n} \overline{\lambda}_0 + \frac{n_1}{n} \overline{\lambda}_1,$$
where $n_0$ is the number of clean subsamples, $n_1 = n - n_0$ is the number of infected subsamples, $\overline{\lambda}_0 = (1/n_0) \sum_{i \in C} \lambda_{S_i}$ and $\overline{\lambda}_1 = (1/n_1) \sum_{i \in I} \lambda_{S_i}$. Hence,
$$\big\| \overline{\lambda}_n - \lambda_G \big\|_\infty \le \frac{n_0}{n} \big\| \overline{\lambda}_0 - \lambda_G \big\|_\infty + \frac{n_1}{n} \big\| \overline{\lambda}_1 - \lambda_G \big\|_\infty \le \frac{n_0}{n} \big\| \overline{\lambda}_0 - \lambda_G \big\|_\infty + \frac{T n_1}{2 n}.$$
A subsample is clean with probability $(1 - \gamma)^m$. Thus, $n_0 \sim \mathrm{Binomial}\big( n, (1 - \gamma)^m \big)$ and $n_1 \sim \mathrm{Binomial}\big( n, 1 - (1 - \gamma)^m \big)$. Let $\beta = 1 - (1 - \gamma)^m$. By Hoeffding's inequality,
$$\mathbb{P}\Big( \frac{T n_1}{2 n} > \frac{\tau}{2} \Big) = \mathbb{P}\Big( \frac{n_1}{n} - \beta > \frac{\tau}{T} - \beta \Big) \le 2 \exp\Big( -2 n \Big( \frac{\tau}{T} - \beta \Big)^2 \Big).$$
Since $\gamma m = o(1/n)$, we eventually have that
$$\beta = 1 - (1 - \gamma)^m < \frac{\tau}{T} - \sqrt{\frac{\log n}{2 n}},$$
which implies that $\mathbb{P}\big( \frac{T n_1}{2 n} > \frac{\tau}{2} \big) < \frac{2}{n}$. So, except on a set of probability tending to 0,
$$\big\| \overline{\lambda}_n - \lambda_G \big\|_\infty \le \frac{n_0}{n} \big\| \overline{\lambda}_0 - \lambda_G \big\|_\infty + \frac{\tau}{2} \le \big\| \overline{\lambda}_0 - \lambda_G \big\|_\infty + \frac{\tau}{2},$$
and thus, as soon as $n$, $m$ and $N$ are large enough, $\mathbb{E}\| \overline{\lambda}_n - \lambda_G \|_\infty < \tau \le \mathbb{E}\| \lambda_{X_N} - \lambda_G \|_\infty$. This proves the first claim. To prove the second claim, note that the probability that at least one subsample is infected is $1 - (1 - \gamma)^{m n} \ge 1 - e^{-\gamma m n} \to 1$. So with probability tending to one, there will be an infected subsample. This subsample will minimize $H(X_N, S_j)$ and the landscape based on this selected subsample will have a bias of order $\tau$.
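The counting in the proof above is easy to reproduce numerically. The sketch below (with illustrative values of $\gamma$, $m$, $n$ chosen by us, not taken from the paper) computes the probability that a subsample is clean, the infection probability $\beta$, the chance that at least one of the $n$ subsamples is infected, and the Hoeffding bound $\exp(-2 n t^2) = 1/n$ at the deviation $t = \sqrt{\log n / (2n)}$ used in the proof.

```python
# Numeric sketch of the bookkeeping in the proof of Proposition 20
# (gamma, m, n are illustrative values): a size-m subsample misses the
# outliers with probability (1 - gamma)^m, the number of infected
# subsamples n_1 is Binomial(n, beta) with beta = 1 - (1 - gamma)^m, and
# Hoeffding's inequality controls the chance that n_1/n exceeds beta by t.
import math

gamma = 0.001          # outlier fraction
m, n = 100, 30         # subsample size, number of subsamples

p_clean = (1 - gamma) ** m           # one subsample avoids all outliers
beta = 1 - p_clean                   # P(a given subsample is infected)
p_some_infected = 1 - p_clean ** n   # at least one of the n is infected

t = math.sqrt(math.log(n) / (2 * n))   # deviation used in the proof
hoeffding = math.exp(-2 * n * t ** 2)  # equals exactly 1/n

print(p_clean, beta, p_some_infected, hoeffding)
```

Even with a 0.1% outlier fraction, roughly 10% of size-100 subsamples are infected, and with 30 subsamples at least one is infected about 95% of the time, which is why the closest-subsample estimator (which favors infected subsamples) is eventually biased.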
Figure 7. Left: $X_N$ is the set of $N = 500$ points on the unit circle; $Y_N = X_N \setminus \{x_1\} \cup \{(0, 0)\}$. Middle: persistence diagrams (dim 1) of the VR filtrations on $X_N$ and $Y_N$, in the same plot, with different symbols. Right: landscapes of $X_N$ and $Y_N$ and the corresponding average landscapes $\mathbb{E}\Psi_\mu(\lambda)$ and $\mathbb{E}\Psi_\nu(\lambda)$, constructed by subsampling $m = 100$ points from the two sets, for $n = 30$ times.

In practice, we can increase the robustness further by using filtered subsampling. This can be done using the distance to the $k$-th nearest neighbor or using a kernel density estimator. For example, let
$$\hat{p}_h(x) = \frac{1}{N h^d} \sum_{j=1}^{N} K\Big( \frac{x - X_j}{h} \Big)$$
be a kernel density estimator with bandwidth $h$ and kernel $K$. Suppose that all subsamples are chosen from the filtered set $F = \{ X_i : \hat{p}_h(X_i) > t \}$. Suppose that the good observations $G$ are sampled from a distribution on a set $A \subset [0, 1]^d$ satisfying the $(a, b)$-standard condition with $b < d$ and $a > 0$, and that $B$ consists of $N_B$ observations sampled from a uniform distribution on $[0, 1]^d$. For any $x \in A$,
$$\mathbb{E}\big[ \hat{p}_h(x) \big] \ge \frac{a h^b}{h^d},$$
and for any outlier $X_i$ we have (for $h$ small enough) that $\hat{p}_h(X_i) = 1/(N h^d)$. Hence, if we choose $t$ such that
$$\frac{1}{N h^d} < t < \frac{a h^b}{h^d},$$
then $F = G$ with high probability.
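The filtering step can be sketched in a few lines. In the sketch below (synthetic data, with a bandwidth and threshold chosen by us for illustration, not the paper's experiment), a boxcar kernel makes the bound above exact: an isolated outlier scores exactly $1/(N h^d)$, so any threshold above that level removes it while keeping the densely sampled circle.

```python
# Minimal sketch of filtered subsampling with a boxcar kernel in d = 2:
# p_hat_h(x) = #{j : |x - X_j| <= h} / (N h^d), and we keep only the set
# F = {X_i : p_hat_h(X_i) > t}. An isolated outlier contributes only its
# own point, so it scores exactly 1/(N h^d) and falls below the threshold.
import math, random

def p_hat(x, data, h):
    """Boxcar kernel density estimate in d = 2 (constants absorbed)."""
    count = sum(1 for y in data if math.dist(x, y) <= h)
    return count / (len(data) * h ** 2)

random.seed(1)
good = [(math.cos(a), math.sin(a))
        for a in (random.uniform(0, 2 * math.pi) for _ in range(198))]
outliers = [(5.0, 5.0), (-4.0, 6.0)]          # far from the circle
cloud = good + outliers
N, h = len(cloud), 0.5

level = 1.0 / (N * h ** 2)                    # density of an isolated point
filtered = [p for p in cloud if p_hat(p, cloud, h) > 1.5 * level]
print(len(cloud), len(filtered))              # the two outliers are dropped
```

Any threshold strictly between the isolated-point level and the typical density of the good points works here; the text's condition $1/(N h^d) < t < a h^{b-d}$ is the population version of the same window.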
More informationRANDOM WALKS WITH WmOM INDICES AND NEGATIVE DRIm COmmONED TO STAY ~QTIVE
PROBABILITY AND MATHEMATICAL STATISTICS VOI. 4 FIISC. i (198.q, p 117-zw RANDOM WALKS WITH WOM INDICES AND NEGATIVE DRI COONED TO STAY ~QTIVE A. SZUBARGA AND P). SZYNAL (LUBJJN) t Abstract. Let {X,, k
More informationRobustness and Regularization of Support Vector Machines
Robustness and Regularization of Support Vector Machines Huan Xu ECE, McGill University Montreal, QC, Canada xuhuan@ci.cgill.ca Constantine Caraanis ECE, The University of Texas at Austin Austin, TX, USA
More informationLecture October 23. Scribes: Ruixin Qiang and Alana Shine
CSCI699: Topics in Learning and Gae Theory Lecture October 23 Lecturer: Ilias Scribes: Ruixin Qiang and Alana Shine Today s topic is auction with saples. 1 Introduction to auctions Definition 1. In a single
More informationPREPRINT 2006:17. Inequalities of the Brunn-Minkowski Type for Gaussian Measures CHRISTER BORELL
PREPRINT 2006:7 Inequalities of the Brunn-Minkowski Type for Gaussian Measures CHRISTER BORELL Departent of Matheatical Sciences Division of Matheatics CHALMERS UNIVERSITY OF TECHNOLOGY GÖTEBORG UNIVERSITY
More informationarxiv: v1 [stat.ot] 7 Jul 2010
Hotelling s test for highly correlated data P. Bubeliny e-ail: bubeliny@karlin.ff.cuni.cz Charles University, Faculty of Matheatics and Physics, KPMS, Sokolovska 83, Prague, Czech Republic, 8675. arxiv:007.094v
More informationSome Proofs: This section provides proofs of some theoretical results in section 3.
Testing Jups via False Discovery Rate Control Yu-Min Yen. Institute of Econoics, Acadeia Sinica, Taipei, Taiwan. E-ail: YMYEN@econ.sinica.edu.tw. SUPPLEMENTARY MATERIALS Suppleentary Materials contain
More informationRevealed Preference and Stochastic Demand Correspondence: A Unified Theory
Revealed Preference and Stochastic Deand Correspondence: A Unified Theory Indraneel Dasgupta School of Econoics, University of Nottingha, Nottingha NG7 2RD, UK. E-ail: indraneel.dasgupta@nottingha.ac.uk
More informationCS Lecture 13. More Maximum Likelihood
CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood
More informationOn Certain C-Test Words for Free Groups
Journal of Algebra 247, 509 540 2002 doi:10.1006 jabr.2001.9001, available online at http: www.idealibrary.co on On Certain C-Test Words for Free Groups Donghi Lee Departent of Matheatics, Uni ersity of
More information1 Proving the Fundamental Theorem of Statistical Learning
THEORETICAL MACHINE LEARNING COS 5 LECTURE #7 APRIL 5, 6 LECTURER: ELAD HAZAN NAME: FERMI MA ANDDANIEL SUO oving te Fundaental Teore of Statistical Learning In tis section, we prove te following: Teore.
More informationCSE525: Randomized Algorithms and Probabilistic Analysis May 16, Lecture 13
CSE55: Randoied Algoriths and obabilistic Analysis May 6, Lecture Lecturer: Anna Karlin Scribe: Noah Siegel, Jonathan Shi Rando walks and Markov chains This lecture discusses Markov chains, which capture
More informationEstimating Parameters for a Gaussian pdf
Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3
More informationConstrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008
LIDS Report 2779 1 Constrained Consensus and Optiization in Multi-Agent Networks arxiv:0802.3922v2 [ath.oc] 17 Dec 2008 Angelia Nedić, Asuan Ozdaglar, and Pablo A. Parrilo February 15, 2013 Abstract We
More informationAsymptotics of weighted random sums
Asyptotics of weighted rando sus José Manuel Corcuera, David Nualart, Mark Podolskij arxiv:402.44v [ath.pr] 6 Feb 204 February 7, 204 Abstract In this paper we study the asyptotic behaviour of weighted
More informationImproved Guarantees for Agnostic Learning of Disjunctions
Iproved Guarantees for Agnostic Learning of Disjunctions Pranjal Awasthi Carnegie Mellon University pawasthi@cs.cu.edu Avri Blu Carnegie Mellon University avri@cs.cu.edu Or Sheffet Carnegie Mellon University
More informationRademacher Complexity Margin Bounds for Learning with a Large Number of Classes
Radeacher Coplexity Margin Bounds for Learning with a Large Nuber of Classes Vitaly Kuznetsov Courant Institute of Matheatical Sciences, 25 Mercer street, New York, NY, 002 Mehryar Mohri Courant Institute
More informationCompressive Distilled Sensing: Sparse Recovery Using Adaptivity in Compressive Measurements
1 Copressive Distilled Sensing: Sparse Recovery Using Adaptivity in Copressive Measureents Jarvis D. Haupt 1 Richard G. Baraniuk 1 Rui M. Castro 2 and Robert D. Nowak 3 1 Dept. of Electrical and Coputer
More informationQuestion 1. Question 3. Question 4. Graduate Analysis I Exercise 4
Graduate Analysis I Exercise 4 Question 1 If f is easurable and λ is any real nuber, f + λ and λf are easurable. Proof. Since {f > a λ} is easurable, {f + λ > a} = {f > a λ} is easurable, then f + λ is
More informationOn the Inapproximability of Vertex Cover on k-partite k-uniform Hypergraphs
On the Inapproxiability of Vertex Cover on k-partite k-unifor Hypergraphs Venkatesan Guruswai and Rishi Saket Coputer Science Departent Carnegie Mellon University Pittsburgh, PA 1513. Abstract. Coputing
More informationPAC-Bayes Analysis Of Maximum Entropy Learning
PAC-Bayes Analysis Of Maxiu Entropy Learning John Shawe-Taylor and David R. Hardoon Centre for Coputational Statistics and Machine Learning Departent of Coputer Science University College London, UK, WC1E
More informationLORENTZ SPACES AND REAL INTERPOLATION THE KEEL-TAO APPROACH
LORENTZ SPACES AND REAL INTERPOLATION THE KEEL-TAO APPROACH GUILLERMO REY. Introduction If an operator T is bounded on two Lebesgue spaces, the theory of coplex interpolation allows us to deduce the boundedness
More informatione-companion ONLY AVAILABLE IN ELECTRONIC FORM
OPERATIONS RESEARCH doi 10.1287/opre.1070.0427ec pp. ec1 ec5 e-copanion ONLY AVAILABLE IN ELECTRONIC FORM infors 07 INFORMS Electronic Copanion A Learning Approach for Interactive Marketing to a Custoer
More information