Learning from persistence diagrams
Ulrich Bauer (TUM)
July 7, 2015, ACAT final meeting, IST Austria
Joint work with Jan Reininghaus, Stefan Huber, Roland Kwitt
Homology inference

Problem (Homological reconstruction). Given a finite sample $P \subseteq \Omega$, construct a shape $X$ that is geometrically close to $\Omega$ and satisfies $H_*(X) \cong H_*(\Omega)$.

Problem (Homology inference). Determine the homology $H_*(\Omega)$ of a shape $\Omega \subseteq \mathbb{R}^d$ from a finite sample $P \subseteq \Omega$.
Homology reconstruction using union of balls

$B_\delta(P)$: $\delta$-neighborhood of $P$.

Theorem (Niyogi, Smale, Weinberger 2006). Let $\Omega$ be a submanifold of $\mathbb{R}^d$. Let $P \subseteq \Omega$ be such that $\Omega \subseteq B_\delta(P)$ and $\delta$ satisfies a certain (strong) sampling condition $\delta < C(\Omega)$. Then $H_*(\Omega) \cong H_*(B_\delta(P))$.
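The degree-0 part of this reconstruction is easy to sketch in code: two $\delta$-balls intersect exactly when their centers are at distance at most $2\delta$, so the number of connected components of $B_\delta(P)$ equals the number of components of the $2\delta$-neighborhood graph. A minimal pure-Python illustration (the function name is ours, not from the talk):

```python
from itertools import combinations

def betti0_union_of_balls(points, delta):
    """Number of connected components (Betti-0) of the union of
    delta-balls around the sample points: two balls overlap exactly
    when their centers are at distance <= 2*delta."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    for i, j in combinations(range(len(points)), 2):
        dist2 = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
        if dist2 <= (2 * delta) ** 2:
            union(i, j)

    return len({find(i) for i in range(len(points))})

# Two well-separated clusters, as sampled from a space with two components
sample = [(0.0, 0.0), (0.3, 0.1), (0.1, 0.4), (5.0, 5.0), (5.2, 4.9)]
print(betti0_union_of_balls(sample, 0.3))  # -> 2
```

With a large $\delta$ the two clusters merge and the count drops to 1, which is exactly the sensitivity to the sampling condition that the theorem quantifies.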
Homology inference using persistent homology

Theorem (Cohen-Steiner, Edelsbrunner, Harer 2005). Let $\Omega \subseteq \mathbb{R}^d$, and let $P$ be such that
- $\Omega \subseteq B_\delta(P)$ for some $\delta > 0$ (sampling density),
- $P \subseteq B_\epsilon(\Omega)$ for some $\epsilon > 0$ (sampling error),
- $H_*(\Omega \hookrightarrow B_{\delta+\epsilon}(\Omega))$ is an isomorphism, and
- $H_*(B_{\delta+\epsilon}(\Omega) \hookrightarrow B_{2(\delta+\epsilon)}(\Omega))$ is a monomorphism.

Then $H_*(\Omega) \cong \operatorname{im} H_*(B_\delta(P) \hookrightarrow B_{2\delta+\epsilon}(P))$.
The pipeline of topological data analysis

Data (point cloud) → distance → Geometry (function) → sublevel sets → Topology (topological spaces) → homology → Algebra (vector spaces) → barcode → Combinatorics (intervals)
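The whole pipeline can be run end to end in a few lines for degree 0: build the Vietoris-Rips filtration of a point cloud and record, via the elder rule, at which scale each connected component dies. This is a sketch under our own conventions (edge length as filtration value), not code from the talk:

```python
from itertools import combinations
from math import dist  # Python 3.8+

def h0_barcode(points):
    """H0 persistence barcode of the Vietoris-Rips filtration of a
    point cloud: every point is born at scale 0; when an edge merges
    two components, one component dies at that scale (elder rule)."""
    n = len(points)
    edges = sorted(
        (dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    bars = []
    for r, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj           # merge: one component dies at r
            bars.append((0.0, r))
    bars.append((0.0, float("inf")))  # the last component never dies
    return sorted(bars, key=lambda b: b[1])

cloud = [(0, 0), (1, 0), (0, 1), (10, 10)]
print(h0_barcode(cloud))
```

The long bar ending near the inter-cluster distance and the infinite bar are the "combinatorics (intervals)" end of the pipeline.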
Stability of persistence barcodes for functions

Theorem (Cohen-Steiner, Edelsbrunner, Harer 2005). If two functions $f, g: K \to \mathbb{R}$ have distance $\|f - g\|_\infty \le \delta$, then there exists a $\delta$-matching of their barcodes.

- Matching of sets $X$, $Y$: a bijection between subsets $X' \subseteq X$, $Y' \subseteq Y$.
- $\delta$-matching of barcodes: matched intervals have endpoints within distance $\delta$; unmatched intervals have length $\le 2\delta$.
- Bottleneck distance $d_B$: the infimum over all $\delta$ admitting a $\delta$-matching.
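For tiny barcodes the bottleneck distance in this definition can be computed by brute force: match each interval either to an interval of the other barcode (cost: sup-norm distance of endpoints) or to the diagonal (cost: half its length, the smallest δ at which it may stay unmatched), and minimize the maximum cost. A sketch with illustrative names; real implementations use geometric matching algorithms instead of enumerating all matchings:

```python
from itertools import permutations

def bottleneck(D1, D2):
    """Bottleneck distance between two small persistence diagrams,
    given as lists of (birth, death) pairs.  A point may be matched
    to another point (cost: sup-norm distance of endpoints) or to
    the diagonal (cost: half its interval length)."""
    def point_cost(p, q):
        return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

    def diag_cost(p):
        return (p[1] - p[0]) / 2

    # Pad both diagrams with diagonal "slots" so every matching is a
    # bijection between two sets of size len(D1) + len(D2).
    A = list(D1) + ["diag"] * len(D2)
    B = list(D2) + ["diag"] * len(D1)

    best = float("inf")
    for perm in permutations(range(len(B))):
        cost = 0.0
        for i, j in enumerate(perm):
            a, b = A[i], B[j]
            if a == "diag" and b == "diag":
                c = 0.0
            elif a == "diag":
                c = diag_cost(b)
            elif b == "diag":
                c = diag_cost(a)
            else:
                c = point_cost(a, b)
            cost = max(cost, c)
        best = min(best, cost)
    return best

print(bottleneck([(0, 10), (3, 4)], [(1, 10)]))  # -> 1.0
```

In the example, matching $(0,10)$ to $(1,10)$ costs 1 and sending the short interval $(3,4)$ to the diagonal costs 0.5, so the bottleneck distance is 1.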
Persistence of functions

Instead of point clouds, we can also study the persistence barcode of functions. Example: a surface mesh of the corpus callosum; the surface is filtered by sublevel sets of a function.
Persistence barcodes and persistence diagrams
Topological machine learning

Tasks: shape retrieval, object recognition, texture recognition.

Approach: combine topological data analysis with machine learning methods (SVM, kernel PCA, k-means).
Linear machine learning methods

Many machine learning methods solve linear geometric problems on vector spaces:
- Principal component analysis (PCA): find orthogonal directions of largest variance.
- Support vector machines (SVM): find the best-separating hyperplane.
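The PCA item is concrete enough to compute by hand in two dimensions: the direction of largest variance is the leading eigenvector of the 2×2 covariance matrix, whose principal angle has the closed form θ = ½·atan2(2c, a−b) for a symmetric matrix [[a, c], [c, b]]. A minimal sketch, not from the talk:

```python
from math import atan2, cos, sin

def pca_2d(points):
    """First principal component of 2-D data: the eigenvector of the
    2x2 covariance matrix with the largest eigenvalue, i.e. the
    orthogonal direction of largest variance."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Covariance matrix entries
    cxx = sum((p[0] - mx) ** 2 for p in points) / n
    cyy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # For a symmetric 2x2 matrix [[cxx, cxy], [cxy, cyy]], the major
    # axis angle is theta = atan2(2*cxy, cxx - cyy) / 2.
    theta = 0.5 * atan2(2 * cxy, cxx - cyy)
    return (cos(theta), sin(theta))

# Points lying (up to tiny noise) on the line y = 2x
pts = [(t, 2 * t + 0.01 * ((-1) ** i)) for i, t in enumerate(range(-5, 6))]
vx, vy = pca_2d(pts)
print(vy / vx)  # slope close to 2
```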
Kernels and kernel methods

Let $X$ be a set and $H$ a vector space with inner product $\langle \cdot, \cdot \rangle_H$. Consider a map $\Phi: X \to H$. Then $k(\cdot, \cdot) = \langle \Phi(\cdot), \Phi(\cdot) \rangle_H : X \times X \to \mathbb{R}$ is a kernel (and $\Phi$ is the associated feature map).

For many linear methods:
- the feature map $\Phi$ is not required explicitly,
- no coordinate basis for $H$ is required,
- computations can be performed by evaluating $k(\cdot, \cdot)$.

This is called the kernel trick.
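A standard textbook instance of the kernel trick (not specific to this talk): for $x, y \in \mathbb{R}^2$, the homogeneous quadratic kernel $k(x, y) = \langle x, y \rangle^2$ equals the inner product of the explicit feature vectors $\Phi(x) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$, so evaluating $k$ avoids ever forming $\Phi$:

```python
from math import sqrt

def phi(x):
    """Explicit feature map for the homogeneous quadratic kernel on R^2:
    Phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return (x[0] ** 2, sqrt(2) * x[0] * x[1], x[1] ** 2)

def k(x, y):
    """The same inner product computed by the kernel trick:
    k(x, y) = <x, y>^2, without ever forming Phi."""
    return (x[0] * y[0] + x[1] * y[1]) ** 2

x, y = (1.0, 2.0), (3.0, -1.0)
explicit = sum(a * b for a, b in zip(phi(x), phi(y)))
print(explicit, k(x, y))  # both equal (1*3 + 2*(-1))^2 = 1.0
```

The PSS kernel introduced below plays exactly this role for persistence diagrams, where the feature space is infinite-dimensional and the trick is essential.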
A feature map for persistence diagrams
Smoothing persistence diagrams

Define the domain $\Omega = \{(b, d) : b \le d\}$.

For a persistence diagram $D$, consider the solution $u: \Omega \times [0, \infty) \to \mathbb{R}$, $(x, t) \mapsto u(x, t)$, of the heat equation

$\partial_t u = \Delta_x u$ in $\Omega \times (0, \infty)$,
$u = 0$ on $\partial\Omega \times [0, \infty)$,
$u = \sum_{y \in D} \delta_y$ on $\Omega \times \{0\}$.

For $\sigma > 0$, define the feature map $\Phi_\sigma(D) = u|_{t=\sigma}$. This yields the persistence scale space (PSS) kernel

$k_\sigma(D_f, D_g) = \langle \Phi_\sigma(D_f), \Phi_\sigma(D_g) \rangle_{L^2(\Omega)}$.
Extending the domain

Extend the domain from $\Omega$ to $\mathbb{R}^2$. Remove the boundary condition and change the initial condition:

$\partial_t u = \Delta_x u$ in $\mathbb{R}^2 \times (0, \infty)$,
$u = \sum_{y \in D} (\delta_y - \delta_{\bar y})$ on $\mathbb{R}^2 \times \{0\}$,

where $\bar y = (b, a)$ is $y = (a, b)$ reflected at the diagonal.

Restricting to $\Omega$ yields a solution of the original equation.
A closed-form solution

Closed-form solution (convolution with a Gaussian):

$u(x, t) = \frac{1}{4\pi t} \sum_{y \in D} \exp\left(-\frac{\|x - y\|^2}{4t}\right) - \exp\left(-\frac{\|x - \bar y\|^2}{4t}\right)$.

Closed-form expression for the PSS kernel:

$k_\sigma(D_f, D_g) = \frac{1}{8\pi\sigma} \sum_{y \in D_f} \sum_{z \in D_g} \exp\left(-\frac{\|y - z\|^2}{8\sigma}\right) - \exp\left(-\frac{\|y - \bar z\|^2}{8\sigma}\right)$.
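The closed form above makes the PSS kernel straightforward to evaluate: one Gaussian term per pair of points, minus a mirrored term that kills contributions near the diagonal. A direct transcription (our own function and variable names):

```python
from math import exp, pi

def pss_kernel(D, E, sigma):
    """Persistence scale space kernel in closed form:
    k_sigma(D, E) = 1/(8*pi*sigma) * sum over y in D, z in E of
      exp(-||y - z||^2 / (8*sigma)) - exp(-||y - zbar||^2 / (8*sigma)),
    where zbar = (d, b) is z = (b, d) mirrored at the diagonal."""
    total = 0.0
    for (yb, yd) in D:
        for (zb, zd) in E:
            d_point = (yb - zb) ** 2 + (yd - zd) ** 2
            d_mirror = (yb - zd) ** 2 + (yd - zb) ** 2
            total += exp(-d_point / (8 * sigma)) - exp(-d_mirror / (8 * sigma))
    return total / (8 * pi * sigma)

D = [(0.0, 2.0), (1.0, 3.0)]
E = [(0.1, 1.9)]
print(pss_kernel(D, E, 0.5))
```

Two sanity checks follow directly from the formula: the kernel is symmetric (reflection at the diagonal is an isometry), and a point lying on the diagonal contributes exactly zero, since it coincides with its own mirror image.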
Stability

Let $\Phi_\sigma(D_f)$ be the smoothing of a persistence diagram $D_f$.

Theorem. For two persistence diagrams $D_f$ and $D_g$ and $\sigma > 0$ we have

$\|\Phi_\sigma(D_f) - \Phi_\sigma(D_g)\|_{L^2} \le \frac{C}{\sigma}\, d_{W_1}(D_f, D_g)$.

Here we have stability with respect to $d_{W_1}(D_f, D_g)$, where

$d_{W_p}(D_f, D_g) = \left( \inf_{\mu: D_f \to D_g} \sum_{x \in D_f} \|x - \mu(x)\|^p \right)^{1/p}$.

Note: the bottleneck distance satisfies $d_B(D_f, D_g) = \lim_{p \to \infty} d_{W_p}(D_f, D_g)$.
Persistence landscapes

Introduced by Bubenik et al. (2013). To each point $(a, b)$ of the persistence diagram, associate a tent function peaking above the midpoint of $a$ and $b$. The landscape functions $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \dots$ are obtained by taking, at each $x$, the $k$-th largest value $\lambda_k(x)$ of these tent functions.
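Evaluating a landscape function is a one-liner once the tent functions are written down; with the common convention $\text{tent}_{(a,b)}(x) = \max(0, \min(x - a, b - x))$, a sketch (our own function name) reads:

```python
def landscape(diagram, k, x):
    """k-th persistence landscape function lambda_k evaluated at x:
    the k-th largest value among the tent functions
    tent_{(a,b)}(x) = max(0, min(x - a, b - x)) of the diagram points."""
    tents = sorted(
        (max(0.0, min(x - a, b - x)) for (a, b) in diagram),
        reverse=True,
    )
    return tents[k - 1] if k <= len(tents) else 0.0

D = [(0.0, 4.0), (1.0, 3.0)]
print(landscape(D, 1, 2.0))  # tent of (0,4) peaks at x=2 with value 2.0
print(landscape(D, 2, 2.0))  # tent of (1,3) gives the 2nd largest: 1.0
```

For $k$ larger than the number of diagram points, $\lambda_k$ is identically zero, matching the picture of nested layers $\lambda_1 \ge \lambda_2 \ge \dots$.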
Persistence landscapes

The persistence landscape of a persistence diagram $D_f$ can be interpreted as a function $\Phi_L(D_f) = \lambda(k, x)$ in $L^p(\mathbb{R}^2)$.

Stability: $\|\Phi_L(D_f) - \Phi_L(D_g)\|_\infty \le d_B(D_f, D_g)$.

But: no such stability holds for $\|\Phi_\sigma(D_f) - \Phi_\sigma(D_g)\|_{L^2}$.
Smoothed persistence diagrams vs. landscapes

SHREC 2014 dataset of shapes: 300 surfaces, 20 classes (poses).
- Sublevel set filtration of the heat kernel signature (HKS) on each surface
- Additional scale parameter $t$

Tasks:
- Retrieval: get the nearest neighbor in $L^2$.
- Classification: train an SVM with the kernel.
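The retrieval task needs only kernel evaluations, because the $L^2$ distance in feature space expands as $d(x, y)^2 = k(x,x) + k(y,y) - 2k(x,y)$ — the kernel trick once more. A sketch with a toy Gaussian kernel on numbers standing in for the PSS kernel on diagrams (all names here are illustrative):

```python
from math import exp, sqrt

def kernel_distance(k, x, y):
    """Distance in the feature space induced by a kernel k, computed
    from kernel evaluations only:
    d(x, y)^2 = k(x, x) + k(y, y) - 2 * k(x, y)."""
    return sqrt(max(0.0, k(x, x) + k(y, y) - 2 * k(x, y)))

def nearest_neighbor(k, query, database):
    """Retrieval: index of the database item closest to the query
    in the kernel-induced distance."""
    return min(range(len(database)),
               key=lambda i: kernel_distance(k, query, database[i]))

# Toy stand-in for the PSS kernel: a Gaussian kernel on real numbers.
rbf = lambda x, y: exp(-(x - y) ** 2)
db = [0.0, 2.0, 5.0]
print(nearest_neighbor(rbf, 1.8, db))  # -> 1
```

Swapping `rbf` for the PSS kernel and the numbers for persistence diagrams gives the retrieval pipeline used in the experiments; the classification task feeds the same kernel matrix to an SVM instead.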
Experiments: Retrieval

Table: nearest-neighbor retrieval performance of the landscape distance $d_{k_L}$ and the PSS distance $d_{k_\sigma}$ across HKS scales $t_i$, compared against the top 3 SHREC 2014 entries. Left: SHREC 2014 (synthetic); right: SHREC 2014 (real). (Numeric entries lost in transcription.)
Experiments: Classification

Table: classification performance (mean ± standard deviation) of the landscape kernel $k_L$ and the PSS kernel $k_\sigma$ across HKS scales $t_i$, on SHREC 2014 (synthetic) and SHREC 2014 (real). (Numeric entries lost in transcription.)
More informationA sampling theory for compact sets
ENS Lyon January 2010 A sampling theory for compact sets F. Chazal Geometrica Group INRIA Saclay To download these slides: http://geometrica.saclay.inria.fr/team/fred.chazal/teaching/distancefunctions.pdf
More informationMachine Learning - MT & 14. PCA and MDS
Machine Learning - MT 2016 13 & 14. PCA and MDS Varun Kanade University of Oxford November 21 & 23, 2016 Announcements Sheet 4 due this Friday by noon Practical 3 this week (continue next week if necessary)
More informationStatistical Techniques in Robotics (16-831, F12) Lecture#21 (Monday November 12) Gaussian Processes
Statistical Techniques in Robotics (16-831, F12) Lecture#21 (Monday November 12) Gaussian Processes Lecturer: Drew Bagnell Scribe: Venkatraman Narayanan 1, M. Koval and P. Parashar 1 Applications of Gaussian
More informationMachine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling
Machine Learning B. Unsupervised Learning B.2 Dimensionality Reduction Lars Schmidt-Thieme, Nicolas Schilling Information Systems and Machine Learning Lab (ISMLL) Institute for Computer Science University
More informationDimension reduction methods: Algorithms and Applications Yousef Saad Department of Computer Science and Engineering University of Minnesota
Dimension reduction methods: Algorithms and Applications Yousef Saad Department of Computer Science and Engineering University of Minnesota Université du Littoral- Calais July 11, 16 First..... to the
More informationThe Ring of Algebraic Functions on Persistence Bar Codes
arxiv:304.0530v [math.ra] Apr 03 The Ring of Algebraic Functions on Persistence Bar Codes Aaron Adcock Erik Carlsson Gunnar Carlsson Introduction April 3, 03 Persistent homology ([3], [3]) is a fundamental
More information7. The Stability Theorem
Vin de Silva Pomona College CBMS Lecture Series, Macalester College, June 2017 The Persistence Stability Theorem Stability Theorem (Cohen-Steiner, Edelsbrunner, Harer 2007) The following transformation
More informationIntroduction to Machine Learning. Introduction to ML - TAU 2016/7 1
Introduction to Machine Learning Introduction to ML - TAU 2016/7 1 Course Administration Lecturers: Amir Globerson (gamir@post.tau.ac.il) Yishay Mansour (Mansour@tau.ac.il) Teaching Assistance: Regev Schweiger
More informationA Kernel Between Sets of Vectors
A Kernel Between Sets of Vectors Risi Kondor Tony Jebara Columbia University, New York, USA. 1 A Kernel between Sets of Vectors In SVM, Gassian Processes, Kernel PCA, kernel K de nes feature map Φ : X
More informationFantope Regularization in Metric Learning
Fantope Regularization in Metric Learning CVPR 2014 Marc T. Law (LIP6, UPMC), Nicolas Thome (LIP6 - UPMC Sorbonne Universités), Matthieu Cord (LIP6 - UPMC Sorbonne Universités), Paris, France Introduction
More informationREFINMENTS OF HOMOLOGY. Homological spectral package.
REFINMENTS OF HOMOLOGY (homological spectral package). Dan Burghelea Department of Mathematics Ohio State University, Columbus, OH In analogy with linear algebra over C we propose REFINEMENTS of the simplest
More informationA (computer friendly) alternative to Morse Novikov theory for real and angle valued maps
A (computer friendly) alternative to Morse Novikov theory for real and angle valued maps Dan Burghelea Department of mathematics Ohio State University, Columbus, OH based on joint work with Stefan Haller
More informationProximity of Persistence Modules and their Diagrams
Proximity of Persistence Modules and their Diagrams Frédéric Chazal David Cohen-Steiner Marc Glisse Leonidas J. Guibas Steve Y. Oudot November 28, 2008 Abstract Topological persistence has proven to be
More informationPersistent Local Homology: Theory, Applications, Algorithms
Persistent Local Homology: Theory, Applications, Algorithms Paul Bendich Duke University Feb. 4, 2014 Basic Goal: stratification learning Q: Given a point cloud sampled from a stratified space, how do
More informationPAC-learning, VC Dimension and Margin-based Bounds
More details: General: http://www.learning-with-kernels.org/ Example of more complex bounds: http://www.research.ibm.com/people/t/tzhang/papers/jmlr02_cover.ps.gz PAC-learning, VC Dimension and Margin-based
More informationKronecker Decomposition for Image Classification
university of innsbruck institute of computer science intelligent and interactive systems Kronecker Decomposition for Image Classification Sabrina Fontanella 1,2, Antonio Rodríguez-Sánchez 1, Justus Piater
More informationEECS490: Digital Image Processing. Lecture #26
Lecture #26 Moments; invariant moments Eigenvector, principal component analysis Boundary coding Image primitives Image representation: trees, graphs Object recognition and classes Minimum distance classifiers
More informationThe Intersection of Statistics and Topology:
The Intersection of Statistics and Topology: Confidence Sets for Persistence Diagrams Brittany Terese Fasy joint work with S. Balakrishnan, F. Lecci, A. Rinaldo, A. Singh, L. Wasserman 3 June 2014 [FLRWBS]
More information5.6 Nonparametric Logistic Regression
5.6 onparametric Logistic Regression Dmitri Dranishnikov University of Florida Statistical Learning onparametric Logistic Regression onparametric? Doesnt mean that there are no parameters. Just means that
More informationLinking non-binned spike train kernels to several existing spike train metrics
Linking non-binned spike train kernels to several existing spike train metrics Benjamin Schrauwen Jan Van Campenhout ELIS, Ghent University, Belgium Benjamin.Schrauwen@UGent.be Abstract. This work presents
More informationLinear & nonlinear classifiers
Linear & nonlinear classifiers Machine Learning Hamid Beigy Sharif University of Technology Fall 1394 Hamid Beigy (Sharif University of Technology) Linear & nonlinear classifiers Fall 1394 1 / 34 Table
More informationSupport Vector Machine (continued)
Support Vector Machine continued) Overlapping class distribution: In practice the class-conditional distributions may overlap, so that the training data points are no longer linearly separable. We need
More informationDepartment of Computer Science and Engineering
Linear algebra methods for data mining with applications to materials Yousef Saad Department of Computer Science and Engineering University of Minnesota ICSC 2012, Hong Kong, Jan 4-7, 2012 HAPPY BIRTHDAY
More informationGaussian Processes (10/16/13)
STA561: Probabilistic machine learning Gaussian Processes (10/16/13) Lecturer: Barbara Engelhardt Scribes: Changwei Hu, Di Jin, Mengdi Wang 1 Introduction In supervised learning, we observe some inputs
More informationMultiple Similarities Based Kernel Subspace Learning for Image Classification
Multiple Similarities Based Kernel Subspace Learning for Image Classification Wang Yan, Qingshan Liu, Hanqing Lu, and Songde Ma National Laboratory of Pattern Recognition, Institute of Automation, Chinese
More informationSupport Vector Machines: Maximum Margin Classifiers
Support Vector Machines: Maximum Margin Classifiers Machine Learning and Pattern Recognition: September 16, 2008 Piotr Mirowski Based on slides by Sumit Chopra and Fu-Jie Huang 1 Outline What is behind
More informationReconnaissance d objetsd et vision artificielle
Reconnaissance d objetsd et vision artificielle http://www.di.ens.fr/willow/teaching/recvis09 Lecture 6 Face recognition Face detection Neural nets Attention! Troisième exercice de programmation du le
More informationFinal Overview. Introduction to ML. Marek Petrik 4/25/2017
Final Overview Introduction to ML Marek Petrik 4/25/2017 This Course: Introduction to Machine Learning Build a foundation for practice and research in ML Basic machine learning concepts: max likelihood,
More informationVector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.
Vector spaces DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Vector space Consists of: A set V A scalar
More informationContribution from: Springer Verlag Berlin Heidelberg 2005 ISBN
Contribution from: Mathematical Physics Studies Vol. 7 Perspectives in Analysis Essays in Honor of Lennart Carleson s 75th Birthday Michael Benedicks, Peter W. Jones, Stanislav Smirnov (Eds.) Springer
More informationMTTTS16 Learning from Multiple Sources
MTTTS16 Learning from Multiple Sources 5 ECTS credits Autumn 2018, University of Tampere Lecturer: Jaakko Peltonen Lecture 6: Multitask learning with kernel methods and nonparametric models On this lecture:
More information