Minimax Rates for Homology Inference
Don Sheehy. Joint work with Sivaraman Balakrishnan, Alessandro Rinaldo, Aarti Singh, and Larry Wasserman.
Something like a joke. What is topological inference? It's when you infer the topology of a space given only a finite subset.
We add geometric and statistical hypotheses to make the problem well-posed.
Geometric Assumption: the underlying space is a smooth manifold M.
Statistical Assumption: the points are drawn i.i.d. from a distribution derived from M.
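The statistical assumption can be made concrete with a minimal sketch: draw points i.i.d. from the uniform distribution on a circle (a d = 1 manifold) embedded in R^D. The function name and parameters below are illustrative, not from the talk.

```python
import numpy as np

def sample_circle(n, D=3, seed=0):
    """Draw n i.i.d. points uniformly from the unit circle,
    a d = 1 manifold embedded in R^D (remaining coordinates zero)."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    X = np.zeros((n, D))
    X[:, 0] = np.cos(theta)
    X[:, 1] = np.sin(theta)
    return X

X = sample_circle(500)
# Noiseless samples lie exactly on the manifold:
radii = np.linalg.norm(X[:, :2], axis=1)
```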
Input: n points sampled i.i.d. from a distribution supported on a d-manifold M in D dimensions, possibly with noise.
Output: an estimate of the homology of M.
Upper bound: what is the worst-case complexity?
Lower bound: what is the worst-case complexity of the best possible algorithm? Here the error is the probability of giving a wrong answer.
The Goal: matching bounds (asymptotically).
Minimax risk is the error probability of the best estimator on the hardest examples.
Minimax Risk: R_n = inf_Ĥ sup_{Q ∈ Q} Qⁿ(Ĥ ≠ H(M))
where the infimum is over estimators Ĥ (the best estimator), the supremum is over distributions Q in the class Q (the hardest distribution), Qⁿ is the product distribution of the n i.i.d. samples, and H(M) is the true homology.
Sample Complexity: n(ε) = min{n : R_n ≤ ε}
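For intuition about the inf-sup, a two-hypothesis analogue can be computed exactly: with two candidate product distributions Q_1ⁿ and Q_2ⁿ, the best test's uniform-prior error is (1 − TV(Q_1ⁿ, Q_2ⁿ))/2, which no estimator can beat on average. A sketch with Bernoulli distributions standing in for the manifold classes (purely illustrative):

```python
import itertools
import numpy as np

def product_pmf(p, n):
    """pmf of n i.i.d. Bernoulli(p) draws, over all 2^n outcomes."""
    outcomes = list(itertools.product([0, 1], repeat=n))
    return np.array([p**sum(x) * (1 - p)**(n - sum(x)) for x in outcomes])

n = 5
q1 = product_pmf(0.4, n)  # Q1^n
q2 = product_pmf(0.6, n)  # Q2^n
tv = 0.5 * np.abs(q1 - q2).sum()  # TV(Q1^n, Q2^n)
# The best test guesses the hypothesis with the larger likelihood; its
# uniform-prior error is (1/2) * sum_x min(q1(x), q2(x)) = (1 - TV)/2.
bayes_error = 0.5 * np.minimum(q1, q2).sum()
```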
We assume manifolds without boundary of bounded volume and reach.
Let M be the set of compact d-dimensional Riemannian manifolds without boundary such that
1. M ⊂ ball_D(0, 1),
2. vol(M) ≤ c_d,
3. the reach of M is at least τ.
Let P be the set of probability distributions supported on manifolds M ∈ M with densities bounded from below by a constant a.
We consider four different noise models.
Noiseless: Q = P.
Clutter: Q = (1 − γ)U + γP with P ∈ P, where U is uniform on ball(0, 1).
Tubular: let Q_{M,σ} be uniform on M_σ (the σ-tube around M); Q = {Q_{M,σ} : M ∈ M}.
Additive: Q = {P ∗ Φ : P ∈ P}, where Φ is Gaussian with σ small relative to τ, or Φ has a Fourier transform bounded away from 0 and τ is fixed.
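The clutter model is easy to simulate: with probability γ draw from a distribution P on the manifold, otherwise from the uniform background U. A sketch with the unit circle in R^2 as the manifold (names and constants are illustrative):

```python
import numpy as np

def sample_clutter(n, gamma=0.8, seed=1):
    """Clutter model Q = (1 - gamma)*U + gamma*P:
    with probability gamma draw from P (uniform on the unit circle),
    otherwise from U (uniform on the unit ball in R^2)."""
    rng = np.random.default_rng(seed)
    signal = rng.uniform(size=n) < gamma
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    X = np.column_stack([np.cos(theta), np.sin(theta)])  # draws from P
    r = np.sqrt(rng.uniform(size=n))  # sqrt makes the radius uniform-in-area
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
    U = np.column_stack([r * np.cos(phi), r * np.sin(phi)])  # draws from U
    X[~signal] = U[~signal]
    return X, signal

X, signal = sample_clutter(2000)
```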
Le Cam's Lemma is a powerful tool for proving minimax lower bounds.
Lemma. Let Q be a set of distributions, and let θ(Q) take values in a metric space (X, ρ) for Q ∈ Q. For any Q_1, Q_2 ∈ Q,
inf_θ̂ sup_{Q ∈ Q} E_{Qⁿ}[ρ(θ̂, θ(Q))] ≥ (1/8) ρ(θ(Q_1), θ(Q_2)) (1 − TV(Q_1, Q_2))^{2n}.
For homology, use the trivial metric: ρ(x, y) = 0 if x = y and 1 if x ≠ y. Then
R_n = inf_Ĥ sup_{Q ∈ Q} Qⁿ(Ĥ ≠ H(M)) ≥ (1/8)(1 − TV(Q_1, Q_2))^{2n}.
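With the trivial metric, the right-hand side of the lemma is a closed-form function of TV and n; a small numeric sketch of how the bound behaves:

```python
def lecam_bound(tv, n):
    """Le Cam two-point lower bound on the minimax risk under the
    trivial 0-1 metric: R_n >= (1/8) * (1 - TV(Q1, Q2))**(2n)."""
    return 0.125 * (1.0 - tv) ** (2 * n)

# The bound stays near 1/8 (inference stays hard) while n * TV is small,
# and decays once n grows well past ~1/TV.
hard = lecam_bound(0.01, 10)    # few samples, nearly indistinguishable pair
easy = lecam_bound(0.01, 1000)  # many samples
```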
The lower bound requires two manifolds that are geometrically close but topologically distinct.
B = ball_d(0, 1 − τ),  A = B \ ball_d(0, 2τ)
M_1 = ∂(B_τ),  M_2 = ∂(A_τ)
The two manifolds agree on a large overlap and differ only near the removed ball.
It suffices to bound the total variation distance.
Total Variation Distance:
TV(Q_1, Q_2) = sup_A |Q_1(A) − Q_2(A)| ≤ a · max{vol(M_1 \ M_2), vol(M_2 \ M_1)} ≤ C_d a τ^d
Minimax Risk:
R_n ≥ (1/8)(1 − TV(Q_1, Q_2))^{2n} ≥ (1/8)(1 − C_d a τ^d)^{2n} ≈ (1/8) e^{−2 C_d a τ^d n}
Sampling Rate: n(ε) ≍ (1/τ)^d log(1/ε)
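Solving (1/8) e^{−2 C_d a τ^d n} ≤ ε for n recovers the stated rate. A numeric sketch (the constants C and a below are placeholders set to 1):

```python
import math

def sample_complexity(eps, tau, d, a=1.0, C=1.0):
    """Smallest n with (1/8) * exp(-2*C*a*tau**d * n) <= eps,
    i.e. n = ceil(log(1/(8*eps)) / (2*C*a*tau**d)).
    Illustrates n(eps) ~ (1/tau)^d * log(1/eps)."""
    return math.ceil(math.log(1.0 / (8.0 * eps)) / (2.0 * C * a * tau**d))

# With d = 2, halving tau roughly quadruples the required sample size,
# reflecting the (1/tau)^d factor.
n1 = sample_complexity(1e-3, tau=0.1, d=2)
n2 = sample_complexity(1e-3, tau=0.05, d=2)
```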
The upper bound uses a union of balls to estimate the homology of M.
0. Denoise the data.
1. Take a union of balls.
2. Compute the homology of the resulting Čech complex.
To prove: the density is bounded from below near M and from above far from M.
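Computing the full Čech homology needs a simplicial-complex library, but degree 0 (the connected components of the union of balls) can be sketched with union-find alone. This is only a partial illustration of step 2; the function name and example points are made up:

```python
import numpy as np

def betti0(points, radius):
    """Number of connected components of the union of balls of the
    given radius: two points land in the same component exactly when
    a chain of pairwise-overlapping balls (distance < 2*radius)
    connects them. Higher-degree homology needs a Cech-complex library."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for i in range(n):
        for j in range(i + 1, n):
            if dist[i, j] < 2 * radius:
                parent[find(i)] = find(j)  # merge overlapping balls
    return len({find(i) for i in range(n)})

# Two well-separated pairs: 2 components at a small radius,
# merging into 1 as the radius grows.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
```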
Many fundamental problems are still open.
1. Is the reach the right parameter?
2. What about manifolds with boundary?
3. Homotopy equivalence?
4. How to choose parameters?
5. Are there efficient algorithms?
Thank you.
Minimax Rates for Homology Inference. Sivaraman Balakrishnan, Alessandro Rinaldo, Don Sheehy, Aarti Singh, Larry Wasserman (Carnegie Mellon University and INRIA).
1 Algebraic Topology European Mathematical Society Zürich 2008 Tammo tom Dieck Georg-August-Universität Corrections for the first printing Page 7 +6: j is already assumed to be an inclusion. But the assertion
More informationl 1 -Regularized Linear Regression: Persistence and Oracle Inequalities
l -Regularized Linear Regression: Persistence and Oracle Inequalities Peter Bartlett EECS and Statistics UC Berkeley slides at http://www.stat.berkeley.edu/ bartlett Joint work with Shahar Mendelson and
More informationHow hard is this function to optimize?
How hard is this function to optimize? John Duchi Based on joint work with Sabyasachi Chatterjee, John Lafferty, Yuancheng Zhu Stanford University West Coast Optimization Rumble October 2016 Problem minimize
More informationAn inverse of Sanov s theorem
An inverse of Sanov s theorem Ayalvadi Ganesh and Neil O Connell BRIMS, Hewlett-Packard Labs, Bristol Abstract Let X k be a sequence of iid random variables taking values in a finite set, and consider
More informationChapter 4: Asymptotic Properties of the MLE
Chapter 4: Asymptotic Properties of the MLE Daniel O. Scharfstein 09/19/13 1 / 1 Maximum Likelihood Maximum likelihood is the most powerful tool for estimation. In this part of the course, we will consider
More informationSpectral applications of metric surgeries
Spectral applications of metric surgeries Pierre Jammes Neuchâtel, june 2013 Introduction and motivations Examples of applications of metric surgeries Let (M n, g) be a closed riemannian manifold, and
More informationABSOLUTE CONTINUITY OF FOLIATIONS
ABSOLUTE CONTINUITY OF FOLIATIONS C. PUGH, M. VIANA, A. WILKINSON 1. Introduction In what follows, U is an open neighborhood in a compact Riemannian manifold M, and F is a local foliation of U. By this
More informationarxiv: v3 [math.ds] 9 Nov 2012
Positive expansive flows Alfonso Artigue November 13, 2018 arxiv:1210.3202v3 [math.ds] 9 Nov 2012 Abstract We show that every positive expansive flow on a compact metric space consists of a finite number
More informationStatistical inference on Lévy processes
Alberto Coca Cabrero University of Cambridge - CCA Supervisors: Dr. Richard Nickl and Professor L.C.G.Rogers Funded by Fundación Mutua Madrileña and EPSRC MASDOC/CCA student workshop 2013 26th March Outline
More informationDiscussion of Hypothesis testing by convex optimization
Electronic Journal of Statistics Vol. 9 (2015) 1 6 ISSN: 1935-7524 DOI: 10.1214/15-EJS990 Discussion of Hypothesis testing by convex optimization Fabienne Comte, Céline Duval and Valentine Genon-Catalot
More informationDynamics of Group Actions and Minimal Sets
University of Illinois at Chicago www.math.uic.edu/ hurder First Joint Meeting of the Sociedad de Matemática de Chile American Mathematical Society Special Session on Group Actions: Probability and Dynamics
More informationClosed characteristics on non-compact mechanical contact manifolds 1
Closed characteristics on non-compact mechanical contact manifolds Jan Bouwe van den Berg, Federica Pasquotto, Thomas O. Rot and Robert C.A.M. Vandervorst Department of Mathematics, VU Universiteit Amsterdam,
More informationGraph Induced Complex: A Data Sparsifier for Homology Inference
Graph Induced Complex: A Data Sparsifier for Homology Inference Tamal K. Dey Fengtao Fan Yusu Wang Department of Computer Science and Engineering The Ohio State University October, 2013 Space, Sample and
More informationLecture I: Asymptotics for large GUE random matrices
Lecture I: Asymptotics for large GUE random matrices Steen Thorbjørnsen, University of Aarhus andom Matrices Definition. Let (Ω, F, P) be a probability space and let n be a positive integer. Then a random
More informationPICARD S THEOREM STEFAN FRIEDL
PICARD S THEOREM STEFAN FRIEDL Abstract. We give a summary for the proof of Picard s Theorem. The proof is for the most part an excerpt of [F]. 1. Introduction Definition. Let U C be an open subset. A
More informationMath212a1411 Lebesgue measure.
Math212a1411 Lebesgue measure. October 14, 2014 Reminder No class this Thursday Today s lecture will be devoted to Lebesgue measure, a creation of Henri Lebesgue, in his thesis, one of the most famous
More informationfy (X(g)) Y (f)x(g) gy (X(f)) Y (g)x(f)) = fx(y (g)) + gx(y (f)) fy (X(g)) gy (X(f))
1. Basic algebra of vector fields Let V be a finite dimensional vector space over R. Recall that V = {L : V R} is defined to be the set of all linear maps to R. V is isomorphic to V, but there is no canonical
More informationEilenberg-Steenrod properties. (Hatcher, 2.1, 2.3, 3.1; Conlon, 2.6, 8.1, )
II.3 : Eilenberg-Steenrod properties (Hatcher, 2.1, 2.3, 3.1; Conlon, 2.6, 8.1, 8.3 8.5 Definition. Let U be an open subset of R n for some n. The de Rham cohomology groups (U are the cohomology groups
More informationNonparametric regression for topology. applied to brain imaging data
, applied to brain imaging data Cleveland State University October 15, 2010 Motivation from Brain Imaging MRI Data Topology Statistics Application MRI Data Topology Statistics Application Cortical surface
More informationPOINTWISE BOUNDS ON QUASIMODES OF SEMICLASSICAL SCHRÖDINGER OPERATORS IN DIMENSION TWO
POINTWISE BOUNDS ON QUASIMODES OF SEMICLASSICAL SCHRÖDINGER OPERATORS IN DIMENSION TWO HART F. SMITH AND MACIEJ ZWORSKI Abstract. We prove optimal pointwise bounds on quasimodes of semiclassical Schrödinger
More informationMath General Topology Fall 2012 Homework 8 Solutions
Math 535 - General Topology Fall 2012 Homework 8 Solutions Problem 1. (Willard Exercise 19B.1) Show that the one-point compactification of R n is homeomorphic to the n-dimensional sphere S n. Note that
More information