Appendix to: On the Relation Between Low Density Separation, Spectral Clustering and Graph Cuts


Hariharan Narayanan, Department of Computer Science, University of Chicago, Chicago, IL 60637
Mikhail Belkin, Department of Computer Science and Engineering, The Ohio State University, Columbus, OH 43210
Partha Niyogi, Department of Computer Science, University of Chicago, Chicago, IL 60637

A Regularity conditions on p and S

We make the following assumptions about $p$:

1. $p$ can be extended to a function $\tilde p$ that is $L$-Lipschitz and which is bounded above by $p_{\max}$.
2. For $0 < t < t_0$, $\min\big(p(x), \int_\Omega K_t(x,y)p(y)\,dy\big) \ge p_{\min}$. Note that this is a property of both the boundary $\partial\Omega$ and $p$.

We note that since $\tilde p$ is $L$-Lipschitz over $\mathbb{R}^d$, so is $\int K_t(x,z)\tilde p(z)\,dz$. We assume that $S$ has condition number $1/\tau$. We also make the following assumption about $S$: the volume of the set of points whose distance to both $S$ and $\partial\Omega$ is at most $R$ is $O(R^2)$ as $R \to 0$. This is reasonable, and is true, for instance, if $S \cap \partial\Omega$ is a manifold of codimension 2.

B Proof of Theorem 1

This follows from Theorem 4 (which is proved in a later section) by setting $\mu$ equal to $\epsilon/(d+2)$.
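For concreteness while reading the proofs below, the following NumPy sketch spells out the heat kernel $K_t$ and the degree-normalized empirical cut $\beta$. This is our own illustrative reconstruction from the proof of Theorem 2, not code from the paper; the function names and the exact normalization of $\beta$ are our reading.

```python
import numpy as np

def heat_kernel(X, Y, t):
    """K_t(x, y) = (4*pi*t)^(-d/2) * exp(-||x - y||^2 / (4t))."""
    d = X.shape[1]
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (4.0 * t)) / (4.0 * np.pi * t) ** (d / 2)

def empirical_cut_beta(X1, X2, t):
    """Degree-normalized cut between samples X1 (from Omega_1) and X2
    (from Omega_2), as we read the statistic analyzed in Theorem 2:
    beta = (1/N) * sqrt(pi/t) * sum over cross pairs of
           K_t(x, y) / sqrt(deg(x) * deg(y)),
    where deg(x) = sum_{z != x} K_t(x, z) over the pooled sample."""
    X = np.vstack([X1, X2])
    N = len(X)
    K = heat_kernel(X, X, t)
    np.fill_diagonal(K, 0.0)            # the degree sums exclude z = x
    deg = K.sum(axis=1)
    n1 = len(X1)
    cross = K[:n1, n1:]                 # K_t(x, y) for x in X1, y in X2
    w = cross / np.sqrt(np.outer(deg[:n1], deg[n1:]))
    return np.sqrt(np.pi / t) * w.sum() / N
```

Under this reading, Theorems 2 and 3 together say that for small $t$ and large $N$ this statistic concentrates around $\int_S p(s)\,ds$.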

C Proof of Theorem 2

In the proof we will use a generalization of McDiarmid's inequality from [7, 8]. We start with the following definition.

Definition 1 Let $\Omega_1, \ldots, \Omega_m$ be probability spaces, let $\Omega = \prod_{k=1}^m \Omega_k$, and let $Y$ be a random variable on $\Omega$. We say that $Y$ is strongly difference-bounded by $(b, c, \delta)$ if the following holds: there is a "bad" subset $B \subset \Omega$ with $\delta = \Pr(\omega \in B)$, such that if $\omega, \omega'$ differ only in the $k$th coordinate and $\omega \notin B$, then $|Y(\omega) - Y(\omega')| \le c$. Furthermore, for any $\omega$ and $\omega'$ differing only in the $k$th coordinate, $|Y(\omega) - Y(\omega')| \le b$.

Theorem ([7, 8]) Let $\Omega_1, \ldots, \Omega_m$ be probability spaces, let $\Omega = \prod_{k=1}^m \Omega_k$, and let $Y$ be a random variable on $\Omega$ which is strongly difference-bounded by $(b, c, \delta)$. Assume $b \ge c > 0$ and let $\mu = \mathbb{E}(Y)$. Then for any $r > 0$,
$$\Pr\big(|Y - \mu| \ge r\big) \le 2\left(\exp\left(-\frac{r^2}{8mc^2}\right) + \frac{mb\delta}{c}\right).$$

By Hoeffding's inequality,
$$\Pr\left[\left|\frac{\sum_{z \ne x} K_t(x,z)}{N-1} - \mathbb{E}[K_t(x,z)]\right| > \epsilon_2\,\mathbb{E}[K_t(x,z)]\right] < 2\,e^{-2(N-1)\mathbb{E}[K_t(x,z)]^2 \epsilon_2^2 (4\pi t)^d} \le 2\,e^{-2(N-1)p_{\min}^2 \epsilon_2^2 (4\pi t)^d}.$$
We set $\epsilon_2$ to be $t/N^\mu$, and let $2\,e^{-2(N-1)p_{\min}^2\epsilon_2^2(4\pi t)^d}$ be $\delta/N$. By the union bound, the probability that the above event happens for some $x \in X$ is at most $\delta$. The set of all $\omega \in \Omega$ for which this occurs shall be denoted by $B$. Also, for any $X$, the largest possible value that
$$\frac{1}{N}\sqrt{\frac{\pi}{t}} \sum_{x \in X_1}\sum_{y \in X_2} \frac{K_t(x,y)}{\left\{\left(\sum_{z \ne x} K_t(x,z)\right)\left(\sum_{z \ne y} K_t(y,z)\right)\right\}^{1/2}}$$
could take is $\sqrt{\pi/t}\,(N-1)$. Then
$$\big|\mathbb{E}[\beta] - \alpha\big| < O(\epsilon_2)\,\alpha + \delta\,\sqrt{\pi/t}\,(N-1). \tag{1}$$

Let $q = p_{\min}\,t^{d/2}$. Then $\beta$ is strongly difference-bounded by $(b, c, \delta)$, where $c = O\big((qN\sqrt{t})^{-1}\big)$ and $b = O(N/\sqrt{t})$. We now apply the generalization of McDiarmid's inequality above, with $m = N$. In its notation,
$$\Pr\big[|\beta - \mathbb{E}[\beta]| > r\big] \le 2\left(\exp\big(-O(N r^2 q^2 t)\big) + O\Big(N^3 q\,\exp\big(-O(N q^2 \epsilon_2^2)\big)\Big)\right). \tag{2}$$
Putting this together with the relation between $\mathbb{E}[\beta]$ and $\alpha$ in (1), the theorem is proved. We note that in (1) the rate of convergence of $\mathbb{E}[\beta]$ to $\alpha$ is controlled by $\epsilon_2$, which is $t/N^\mu$, and that in (2) the rate of convergence of $\beta$ to $\mathbb{E}[\beta]$ depends on $r$, which we set to be $\sqrt{t}\,u$; in (2) the dependence of the probability on $r$ is exponential. Since we have assumed that $u = t^{-d/2}/N^\mu = o(1)$, we have $t/N^\mu = t^{(d+2)/2}\,u$. Thus the result follows.
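The Hoeffding step above controls the relative deviation of the empirical degree $\frac{1}{N-1}\sum_{z\ne x}K_t(x,z)$ from $\mathbb{E}\,K_t(x,z)$. A small Monte Carlo illustration of that concentration, in a toy setting entirely of our own choosing (uniform density on $[0,1]^2$):

```python
import numpy as np

rng = np.random.default_rng(0)
d, t, N, trials = 2, 0.01, 2000, 200

def K_t(x, Z):
    sq = ((Z - x) ** 2).sum(-1)
    return np.exp(-sq / (4 * t)) / (4 * np.pi * t) ** (d / 2)

x = np.full(d, 0.5)                     # a fixed interior point
# Reference value E[K_t(x, z)] for z uniform on [0,1]^2,
# estimated once from a very large sample.
EK = K_t(x, rng.random((1_000_000, d))).mean()

devs = np.empty(trials)
for i in range(trials):
    Z = rng.random((N - 1, d))          # the other N - 1 sample points
    devs[i] = abs(K_t(x, Z).mean() / EK - 1)
print(f"E[K_t] ~ {EK:.3f}; median relative deviation {np.median(devs):.3f}, "
      f"max over {trials} trials {devs.max():.3f}")
```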

D Proof of Theorem 3

We shall prove Theorem 3 through a sequence of lemmas. Without loss of generality we can assume that $\tau = 1$, by rescaling if necessary. Let $R = \sqrt{2dt\ln(1/t)}$ and $\epsilon_1 = \int_{\|z\|>R} K_t(0,z)\,dz$. Using the inequality
$$\int_{\|z\|>R} K_t(0,z)\,dz \le \left(\frac{R^2}{2td}\right)^{d/2} e^{-\frac{R^2}{4t} + \frac{d}{2}} = \big(e\,t\ln(1/t)\big)^{d/2}, \tag{3}$$
we know that $\epsilon_1 \le (e\,t\ln(1/t))^{d/2}$. For any positive real $t$, $\ln(1/t) \le t^{-1/e}$. Therefore the assumption that
$$t \le \min\left(\tau^2,\ (2d)^{-\frac{e}{e-1}}\right)$$
implies that $R \le \sqrt{2d}\,t^{(e-1)/(2e)} < 1$.

Figure 1: A sphere of radius 1 outside $\Omega_1$ that is tangent to $S$.
Figure 2: A sphere of radius 1 inside $\Omega_1$ that is tangent to $S$.
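Since $K_t(0,\cdot)$ is the $N(0, 2tI_d)$ Gaussian density, $\|z\|^2/(2t)$ is $\chi^2_d$-distributed and inequality (3) is the standard chi-square Chernoff bound. A quick numerical verification of $\epsilon_1 \le (et\ln(1/t))^{d/2}$ with $R = \sqrt{2dt\ln(1/t)}$ (our own sanity check, not from the paper):

```python
import numpy as np
from scipy.stats import chi2

for d in (2, 5, 10):
    for t in (1e-2, 1e-3, 1e-4):
        R2 = 2 * d * t * np.log(1 / t)          # R^2 = 2 d t ln(1/t)
        eps1 = chi2.sf(R2 / (2 * t), df=d)      # eps_1 = P(chi^2_d > R^2/(2t))
        bound = (np.e * t * np.log(1 / t)) ** (d / 2)
        assert eps1 <= bound
        print(f"d={d:2d}  t={t:.0e}  eps1={eps1:.3e}  bound={bound:.3e}")
```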

Let the point $y$ (represented as $A$ in Figures 1 and 2) be at a distance $r < R$ from $S$. Let us choose a coordinate system in which $y = (r, 0, \ldots, 0)$ and the point of $S$ nearest to $y$ is the origin. There is a unique such point since $r < R < 1$. Let this point be $C$. Let $D_1$ lie on the segment $AC$ at a distance $R^2/2$ from $C$, and let $D_2$ lie on the extended segment $AC$ at a distance $R^2/2$ from $C$. Thus $C$ is the midpoint of $D_1 D_2$.

Definition 2
1. Denote by $B_1$ the ball of radius 1 tangent to $S$ at $C$ that lies outside $\Omega_1$.
2. Denote by $B_2$ the ball of radius 1 tangent to $S$ at $C$ that lies inside $\Omega_1$.
3. Let $H_1$ be the halfspace containing $C$ bounded by the hyperplane perpendicular to $AC$ and passing through $D_1$.
4. Let $H_2$ be the halfspace not containing $C$ bounded by the hyperplane perpendicular to $AC$ and passing through $D_2$.
5. Let $H_3$ be the halfspace not containing $A$, bounded by the hyperplane tangent to $S$ at $C$.
6. Let $B_3$ be the ball with center $y = A$ whose boundary contains the intersection of $H_1$ and $B_1$.
7. Let $B_4$ be the ball with center $y = A$ whose boundary contains the intersection of $H_2$ and $B_2$.

Definition 3
1. $h(r) := \int_{H_3} K_t(x,y)\,dx$.
2. $f(r) := \int_{H_1 \setminus B_1} K_t(x,y)\,dx$.
3. $g(r) := \int_{H_2 \cup B_2} K_t(x,y)\,dx$.

It follows that
$$\int_{H_1} K_t(x,y)\,dx = h(r - R^2/2) \quad\text{and}\quad \int_{H_2} K_t(x,y)\,dx = h(r + R^2/2).$$

Observation 1 Although $h(r)$ is defined by a $d$-dimensional integral, it simplifies to
$$h(r) = \int_{x_1 < 0} \frac{e^{-(r - x_1)^2/4t}}{\sqrt{4\pi t}}\,dx_1,$$
by integrating out the coordinates $x_2, \ldots, x_d$.

Lemma 1 If $r > R^2$, the radius of $B_3$ is at least $R$.

Proof: By the similarity of the triangles $CFD_1$ and $CEF$ in Figure 1, $CF/CE = CD_1/CF$. Since $CE = 2$ and $CD_1 = R^2/2$, we get $CF = R$. Since the triangle $CD_1F$ is right-angled at $D_1$ and $CD_1 = R^2/2$, we have $D_1F^2 = R^2 - R^4/4$, so the radius of $B_3$ satisfies $AF^2 = (r - R^2/2)^2 + D_1F^2 = r^2 - rR^2 + R^2 \ge R^2$ whenever $r \ge R^2$. This proves the claim. □

Lemma 2 The radius of $B_4$ is at least $R$.

Proof: By the similarity of the triangles $CFE$ and $CD_2F$ in Figure 2, $CF = R$. However, the distance of the point $y := A$ from $F$ is at least $CF$. Therefore the radius of $B_4$ is at least $R$. □

Definition 4 Let the set of points $x$ for which $B(x,R) \subseteq \Omega$ be denoted by $\Omega^R$, and let $\Omega_1^R = \Omega_1 \cap \Omega^R$ and $\Omega_2^R = \Omega_2 \cap \Omega^R$. We shall denote $(1 + L/p_{\min})R$ by $l$.
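Since the integrand in Observation 1 is the density of a one-dimensional $N(r, 2t)$ variable, $h(r)$ has the closed form $\tfrac12\,\mathrm{erfc}\big(r/(2\sqrt{t})\big)$. A numerical cross-check (ours, for the reader's convenience):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erfc

t = 0.01

def h_quad(r):
    """h(r) = int_{x < 0} exp(-(r-x)^2 / (4t)) / sqrt(4 pi t) dx  (Observation 1)."""
    val, _ = quad(lambda x: np.exp(-(r - x) ** 2 / (4 * t)) / np.sqrt(4 * np.pi * t),
                  -np.inf, 0.0)
    return val

for r in (0.0, 0.05, 0.2, 0.5):
    closed = 0.5 * erfc(r / (2 * np.sqrt(t)))   # = Phi(-r / sqrt(2t))
    print(f"r={r:.2f}  quad={h_quad(r):.6f}  closed form={closed:.6f}")
```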

5 Consider a point x, Then, K t (x, y)p(y)dy y x <R K t (x, y)p(y)dy ( ɛ)(p(x) LR) ( ɛ)p(x)( LR/p min ) = p(x)( O(l)) On the other hand, K t (x, y)p(y)dy K t (x, y)p(y)dy + K t (x, y)p(y)dy y x R y x >R p(x)( + l) + K t (, R) = p(x)( + O(( + L/p min )R)) Therefore, ψ t (x) = p(x)( ± O(( + L p min )R)). d Lemma 3 B(x, 5R) implies that dx Kt (x, z)p(z)dy = O(L). Proof: Consider the function p, which is equal to p on, but which has a larger support and is L Lipshitz as a function on R d. K t (x, z)p (z)dy is L Lipshitz and on points x where B(x, 5R), the contribution of points z outside, is o(). Therefore d dx Kt (x, z)p(z)dy = O(L). This implies that on the set of points x such that B(x, 5R), ψ t (X) is O(L) Lipshitz. We now estimate K t (y, z)p(z)dz for y S. Definition 5 For a point y S, such that d(y, ) < R < τ = let π(y) be the nearest point to y in S. Note that by the assumption that the condition number of S is, since R is smaller than, there is a unique candidate for π(y). Let y be as in Definition 5. Lemma 4 h(r + R /) ɛ < f(r) K t (y, z)dz. Proof: K t (x, y)dx > H B K t (x, y)dx(since H B ) K t (x, y)dx H > h(r + R /) ɛ B c K t (x, y)dx The last inequality follows from lemma. Lemma 5 > p(π(y))( O(l))(h(r + R /) ɛ). 5

Figure 3: The correspondence between points on $S$ and on $[S]_r$.

Proof:
$$\int_{\Omega_1} K_t(y,z)p(z)\,dz \ge \int_{H_1 \setminus B_1} K_t(y,z)p(z)\,dz \quad (\text{since } H_1 \setminus B_1 \subseteq \Omega_1)$$
$$\ge p(\pi(y))\,(1 - O(l))\,f(r) > p(\pi(y))\,(1 - O(l))\,\big(h(r + R^2/2) - \epsilon_1\big). \qquad \square$$

Lemma 6 Let $r > R^2$. Then
$$\int_{\Omega_1} K_t(y,z)p(z)\,dz < (1 + O(l))\,\big(h(r - R^2/2)\,p(\pi(y)) + \epsilon_1 p_{\max}\big).$$

Proof:
$$\int_{\Omega_1} K_t(y,z)p(z)\,dz \le \int_{(H_2 \cup B_2)\cap B_4} K_t(y,z)p(z)\,dz + \int_{B_4^c} K_t(y,z)p(z)\,dz$$
$$< h(r - R^2/2)\,p(\pi(y))\,(1 + O(l)) + \epsilon_1 p_{\max}(1 + O(l)) < (1 + O(l))\,\big(h(r - R^2/2)\,p(\pi(y)) + \epsilon_1 p_{\max}\big).$$
The bound on the second integral follows from Lemma 2. □

Definition 6 Let $[S]_r$ denote the set of points at a distance of $r$ to $S$. Let $\pi_r$ be the map from $[S]_r$ to $S$ that takes a point $P$ on $[S]_r$ to the foot of the perpendicular from $P$ to $S$. (This map is well defined since $r < \tau = 1$.)
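Lemmas 4 and 6 sandwich $\int_{\Omega_1} K_t(y,z)\,dz$ between the halfspace integrals $h(r \pm R^2/2)$, up to $\epsilon_1$ terms. The following Monte Carlo sketch illustrates the sandwich in the extremal curvature case, taking $\Omega_1$ to be the unit disk in $\mathbb{R}^2$ (condition number 1) and $p$ uniform; the setup is a toy of our own, not from the paper:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(1)
d, t = 2, 1e-3
R = np.sqrt(2 * d * t * np.log(1 / t))

def h(r):
    # Closed form of Observation 1: h(r) = (1/2) erfc(r / (2 sqrt(t))).
    return 0.5 * erfc(r / (2 * np.sqrt(t)))

# Uniform Monte Carlo sample of the unit disk.
M = 2_000_000
theta = 2 * np.pi * rng.random(M)
rad = np.sqrt(rng.random(M))
Z = np.column_stack([rad * np.cos(theta), rad * np.sin(theta)])

for r in (0.02, 0.05, 0.10):
    y = np.array([1.0 + r, 0.0])                 # point at distance r from S
    vals = np.exp(-((Z - y) ** 2).sum(1) / (4 * t)) / (4 * np.pi * t)
    integral = np.pi * vals.mean()               # int_{disk} K_t(y, z) dz
    print(f"r={r:.2f}  h(r+R^2/2)={h(r + R**2 / 2):.4f}  "
          f"integral={integral:.4f}  h(r-R^2/2)={h(r - R**2 / 2):.4f}")
```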

Lemma 7 Let $y \in [S]_r$, and let the Jacobian of a map $f$ be denoted by $Df$. Then
$$(1+r)^{-(d-1)} \le |D\pi_r(y)| \le (1-r)^{-(d-1)}.$$

Proof: Let $PQ$ be a geodesic arc of infinitesimal length $ds$ on $S$ joining $P$ and $Q$. Let $\pi_r^{-1}(P) = P'$ and $\pi_r^{-1}(Q) = Q'$ (see Figure 3). The radius of curvature of $PQ$ is at least 1. Therefore the distance between $P'$ and $Q'$ lies in the interval $[ds\,(1-r),\ ds\,(1+r)]$. This implies that the Jacobian of the map $\pi_r$ has a magnitude that always lies in the interval $[(1+r)^{-(d-1)}, (1-r)^{-(d-1)}]$. □

Lemma 8
$$\int_{\Omega_2} \int_{\{y \in \Omega_1 \,:\, d(y,S) > R\}} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dy\,dx \;\le\; \epsilon_1\,\mathrm{vol}(\Omega)\,p_{\max}\,(1 + O(l)).$$

Proof:
$$\int_{\Omega_2} \int_{\{y \in \Omega_1 :\, d(y,S) > R\}} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dy\,dx \le \int_{\Omega_2}\int_{\|z\| > R} K_t(0,z)\,p_{\max}(1 + O(l))\,dz\,dx < \epsilon_1\,\mathrm{vol}(\Omega)\,p_{\max}\,(1 + O(l)).$$
The first inequality holds because the distance between $x$ and $y$ in the double integral is always at least $R$, and $\psi_t(x)\psi_t(y) \le p_{\max}(1 + O(l))$. □

Lemma 9
$$\big(1 - e^{-\alpha^2/4t}\big)\sqrt{t/\pi} \;\le\; \int_0^\alpha h(r)\,dr \;\le\; \sqrt{t/\pi}.$$

Proof: Using Observation 1,
$$\int_\alpha^\infty h(r)\,dr = \int_\alpha^\infty \int_{x_1 < 0} \frac{e^{-(r - x_1)^2/4t}}{\sqrt{4\pi t}}\,dx_1\,dr = \int_\alpha^\infty \frac{e^{-r^2/4t}}{\sqrt{4\pi t}}\,(r - \alpha)\,dr.$$
Making the substitution $r - \alpha =: z$, we have
$$\int_0^\infty \frac{e^{-(z+\alpha)^2/4t}}{\sqrt{4\pi t}}\,z\,dz \le e^{-\alpha^2/4t} \int_0^\infty \frac{e^{-z^2/4t}}{\sqrt{4\pi t}}\,z\,dz = e^{-\alpha^2/4t}\,\sqrt{t/\pi}.$$
Since $\int_0^\infty h(r)\,dr = \sqrt{t/\pi}$, both inequalities follow. Equality holds in the above calculation if and only if $\alpha = 0$. Hence the proof is complete. □

Definition 7 Let $[S_1]_r$ denote $[S]_r \cap \Omega_1$ and $[S_2]_r$ denote $[S]_r \cap \Omega_2$. Let $\Gamma_R$ denote the set of points whose distance to both $S$ and $\partial\Omega$ is at most $R$. We assume that $\mathrm{vol}(\Gamma_R) < C_1 R^2$ for some absolute constant $C_1$. Since the thickness of $\Gamma_R$ is $O(R)$ in two independent directions, this is a reasonable assumption to make. The assumption that $\partial\Omega$ has finite $(d-1)$-dimensional volume implies that $\mathrm{vol}(\{x : d(x, \partial\Omega) \le R\}) = O(R)$.

We now put these together to prove Theorem 3.
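A direct numerical check of Lemma 9, using the closed form of $h$ noted after Observation 1 (our own check):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erfc

t = 0.01
h = lambda r: 0.5 * erfc(r / (2 * np.sqrt(t)))

for alpha in (0.05, 0.1, 0.3):
    integral, _ = quad(h, 0.0, alpha)
    lo = (1 - np.exp(-alpha**2 / (4 * t))) * np.sqrt(t / np.pi)
    hi = np.sqrt(t / np.pi)
    assert lo <= integral <= hi            # the two bounds of Lemma 9
    print(f"alpha={alpha:.2f}  {lo:.5f} <= {integral:.5f} <= {hi:.5f}")
```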

First, the contribution of the pairs $(x, y)$ for which $y$ lies within distance $R$ of both $S$ and $\partial\Omega$ is negligible:
$$\int_0^R \int_{[S_2]_r \cap \Gamma_R} \Big(\int_{\Omega_1} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\Big)\,dy\,dr \le O(p_{\max}/p_{\min}) \int_0^R \int_{[S_2]_r \cap \Gamma_R} \int_{\Omega_1} K_t(x,y)\,dx\,dy\,dr$$
$$\le O(p_{\max}/p_{\min})\,\big(\mathrm{vol}(\Gamma_R) + \epsilon_1\,\mathrm{vol}(\Omega)\big) \le O(p_{\max}/p_{\min})\,\big(C_1 R^2 + \epsilon_1\,\mathrm{vol}(\Omega)\big) = o\big(t^{1/2+\mu}\big).$$

Next,
$$\int_{\Omega_2}\int_{\Omega_1} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\,dy = \int_0^\infty \int_{[S_2]_r} \Big(\int_{\Omega_1} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\Big)\,dy\,dr$$
$$= \int_0^R \int_{[S_2]_r} (\cdots)\,dy\,dr + \int_R^\infty \int_{[S_2]_r} (\cdots)\,dy\,dr \le \int_0^R \int_{[S_2]_r} (\cdots)\,dy\,dr + \underbrace{\epsilon_1\,\mathrm{vol}(\Omega)\,p_{\max}(1 + O(l))}_{E} \quad (\text{from Lemma 8})$$
$$\le \int_0^{R^2}\int_{[S_2]_r} p_{\max}(1 + O(l))\,dy\,dr + \int_{R^2}^{R}\int_{[S_2]_r} (1 + O(l))\big(h(r - R^2/2)\,p(\pi(y)) + \epsilon_1 p_{\max}\big)\,dy\,dr + E,$$
where the last line follows from Lemma 6. By Lemma 7, this is at most
$$E + R^2\,(1+R)^{d-1}\,p_{\max}(1 + O(l))\,\mathrm{vol}(S) + (1+R)^{d-1}(1 + O(l))\Big(\int_0^\infty h(r)\,dr \int_S p(y)\,dy + \epsilon_1\,p_{\max}\,R\,\mathrm{vol}(S)\Big)$$
$$\le (1 + O(l))\Big(\sqrt{t/\pi}\int_S p(y)\,dy + p_{\max}\big((R^2 + \epsilon_1 R)\,\mathrm{vol}(S) + \epsilon_1\,\mathrm{vol}(\Omega)\big)\Big)$$
$$\le (1 + O(l))\Big(\int_S p(y)\,dy\,\Big(\sqrt{t/\pi} + \frac{p_{\max}}{p_{\min}}\,o\big(t^{1/2+\mu}\big)\Big) + p_{\max}\,\epsilon_1\,\mathrm{vol}(\Omega)\Big),$$
using Lemma 9 to bound $\int_0^\infty h(r)\,dr$ by $\sqrt{t/\pi}$.

Similarly, we see that
$$\int_{\Omega_2}\int_{\Omega_1} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\,dy = \int_0^\infty \int_{[S_2]_r} \Big(\int_{\Omega_1} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\Big)\,dy\,dr$$
$$> \int_{R^2}^{R}\int_{[S_2]_r} p(\pi(y))\,(1 - O(l))\,f(r)\,dy\,dr \quad (\text{from the proof of Lemma 5})$$
$$> \int_{R^2}^{R} (1-R)^{d-1}\,(1 - O(l))\,\Big(\int_S p(y)\,dy\Big)\big(h(r + R^2/2) - \epsilon_1\big)\,dr \quad (\text{from Lemmas 4 and 7})$$
$$> (1-R)^{d-1}\,\big(1 - O((1 + L/p_{\min})R)\big)\,\Big(\int_S p(y)\,dy\Big)\Big(\int_0^R h(r)\,dr - \epsilon_1 R - R^2\Big)$$
$$\ge (1 - O(l))\,\Big(\int_S p(y)\,dy\Big)\Big(\big(1 - e^{-R^2/4t}\big)\sqrt{t/\pi} - \epsilon_1 R - R^2\Big) \quad (\text{from Lemma 9})$$
$$\ge (1 - O(l))\,\Big(\int_S p(y)\,dy\Big)\Big(\sqrt{t/\pi} - o\big(t^{1/2+\mu}\big)\Big).$$

Noting only the dependence of the rate on $t$, and introducing the condition number $\tau$,
$$\sqrt{\frac{\pi}{t}} \int_{\Omega_1}\int_{\Omega_2} K_t(x,y)\,\psi_t(x)\psi_t(y)\,dx\,dy = \Big(1 + o\big((t/\tau^2)^{\mu}\big)\Big)\int_S p(s)\,ds.$$

Proof of Theorem 4: This follows directly from Theorem 2 and Theorem 3. The only change made was that the $t^{(d+2)/2}$ term was eliminated, since it is dominated by $\sqrt{t}\,\epsilon_2$ when $t$ is small.

References

[1] M. Belkin and P. Niyogi. Semi-supervised Learning on Riemannian Manifolds. Machine Learning 56 (Special Issue on Clustering), 209-239, 2004.
[2] M. Belkin and P. Niyogi. Towards a Theoretical Foundation for Laplacian-based Manifold Methods. COLT 2005.
[3] A. Blum and S. Chawla. Learning from Labeled and Unlabeled Data Using Graph Mincuts. ICML 2001.
[4] O. Chapelle, J. Weston and B. Schölkopf. Cluster Kernels for Semi-supervised Learning. NIPS 2002.
[5] O. Chapelle and A. Zien. Semi-supervised Classification by Low Density Separation. AISTATS 2005.
[6] C. Ding. Spectral Clustering. ICML 2004 Tutorial.
[7] S. Kutin and P. Niyogi. Almost-everywhere Algorithmic Stability and Generalization Error. UAI 2002, 275-282.
[8] S. Kutin. Extensions to McDiarmid's Inequality when Differences are Bounded with High Probability. Technical Report TR-2002-04, Department of Computer Science, University of Chicago, 2002.
[9] S. Lafon. Diffusion Maps and Geodesic Harmonics. Ph.D. Thesis, Yale University, 2004.
[10] M. Hein, J.-Y. Audibert and U. von Luxburg. From Graphs to Manifolds: Weak and Strong Pointwise Consistency of Graph Laplacians. COLT 2005.
[11] J. Shi and J. Malik. Normalized Cuts and Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000.
[12] U. von Luxburg, M. Belkin and O. Bousquet. Consistency of Spectral Clustering. Max Planck Institute for Biological Cybernetics, Technical Report TR-134, 2004.
[13] X. Zhu, J. Lafferty and Z. Ghahramani. Semi-supervised Learning Using Gaussian Fields and Harmonic Functions. ICML 2003.
