Sampling and high-dimensional convex geometry

1 Sampling and high-dimensional convex geometry. Roman Vershynin. SampTA 2013, Bremen, Germany, June 2013.

2 Geometry of sampling problems. Signals live in high dimensions, and sampling is often random. This calls for geometry in high dimensions combined with probabilistic methods: asymptotic convex geometry = geometric functional analysis.

3 Geometry of sampling problems. Signal recovery problem. Signal: x ∈ R^n, unknown. Sampling (measurement) map: Φ : R^n → R^m, known. Measurement vector: y = Φ(x) ∈ R^m, known. Goal: recover x from y.

4 Geometry of sampling problems. Prior information (model): x ∈ K, where K ⊆ R^n is a known signal set. Examples of K: discrete L_p balls, i.e. balls of ℓ_p^n for 0 < p ≤ ∞; band-limited functions; s-sparse signals in R^n; block-sparse signals; low-rank matrices; etc.

5 Geometry of sampling problems. Linear sampling: y = Ax, where A is an m × n matrix. Non-linear sampling: later.

6 Convexity. The signal set K may be non-convex. Then convexify: replace K by conv(K). Question. What do convex sets look like in high dimensions? This is the main question of asymptotic convex geometry = geometric functional analysis.

7 Asymptotic convex geometry. Main message of asymptotic convex geometry: K ≈ bulk + outliers. Bulk = a round ball, which makes up most of the volume of K. Outliers = a few long tentacles, which contain little volume. This is V. Milman's heuristic picture of a convex body in high dimensions.

8 Concentration of volume. The volume of an isotropic convex set is concentrated in a round ball. Isotropy assumption: X ~ Unif(K) satisfies E X = 0, E XX^T = I_n. Then E ‖X‖_2^2 = n, so the radius of that ball is √n. Theorem (concentration of volume) [Paouris 06]: P{ ‖X‖_2 > t√n } ≤ exp(−ct√n) for t ≥ 1. A simpler proof: [Adamczak-Litvak-Oleszkiewicz-Pajor-Tomczak 13].

9 Concentration of volume. Theorem (thin shell) [Klartag 08]: ‖X‖_2 = (1 + o(1))√n with high probability. Question. What is the best bound on the deviation |‖X‖_2 − √n|? Currently best known: n^{1/3} [Guedon-E. Milman 11]. Thin Shell Conjecture: an absolute constant. This would imply the Hyperplane Conjecture [Eldan-Klartag 11].

10 Round sections. A random subspace E misses the outliers and passes through the bulk of K. Theorem (Dvoretzky's theorem): Consider a random subspace E in R^n of dimension log n. Then K ∩ E ≈ round ball with high probability. Remark. For many convex sets K, the dimension of E can be much larger than log n; see Milman's version of Dvoretzky's theorem.

11 Sections of arbitrary dimension. Let K ⊆ R^n be any convex set. Theorem (M estimate) [Milman 81, Pajor-Tomczak 85, Mendelson-Pajor-Tomczak 07]: Consider a random subspace E in R^n of codimension m. Then diam(K ∩ E) ≤ C w(K)/√m with high probability. Here w(K) is the mean width of K.

12 Mean width. w(K) := E sup_{x ∈ K−K} ⟨g, x⟩, where g ~ N(0, I_n). Note: ‖g‖_2 ≈ √n, so w(K) ≈ √n × (width of K in a random direction).

13 Example: the ℓ_1 ball. K = conv(±e_i) = {x : ‖x‖_1 ≤ 1} = B_1^n. The standard picture is misleading: Vol(K)^{1/n} ≍ 1/n, much smaller than Vol(B_2^n)^{1/n} ≍ 1/√n. Hence a more accurate picture is a small round bulk with thin tentacles reaching out to the vertices ±e_i.

14 Example: the ℓ_1 ball. Mean width: w(K) := E sup_{x ∈ K−K} ⟨g, x⟩ ≈ √n × E[ width of K in a random direction ]. Here w(K) ≍ √(log n), while w(bulk) ≍ 1. Hence the mean width sees the bulk and ignores the outliers.

15 Signal recovery (following [Mendelson-Pajor-Tomczak 07]). Back to the signal recovery problem (linear sampling): recover a signal x ∈ K ⊆ R^n from y = Ax ∈ R^m. If m < n, the problem is ill-posed. What do we know? x belongs to both K and the affine subspace E_x := {x' : Ax' = y} = x + ker(A). Both K and E_x are known. If diam(K ∩ E_x) ≤ ε, then x can be recovered with error ε.

16 Signal recovery. When is diam(K ∩ E_x) ≤ ε? Assume K is convex and origin-symmetric. Then the diameter is largest for x = 0. Conclusion. Let E = ker(A). If diam(K ∩ E) ≤ ε, then any x ∈ K can be recovered from y = Ax with error ε. The recovery can be done by a convex program: find x' ∈ K such that Ax' = y, i.e. find a signal consistent with the model (K) and with the measurements (y).

17 Signal recovery. Remaining question: when is diam(K ∩ E) ≤ ε, where E = ker(A)? If A is a random matrix, then E is a random subspace. So the M estimate applies and yields diam(K ∩ E) ≤ C w(K)/√m with high probability. Equating this with ε, we obtain the sample size (number of measurements): m ≍ ε^{−2} w(K)^2.

18 Conclusion (Signal recovery). Let K be a convex, origin-symmetric signal set in R^n, and let A be an m × n random matrix. If m ≳ w(K)^2, then one can accurately recover any signal x ∈ K from the m random linear measurements y = Ax ∈ R^m. The recovery is done by the convex program: find x' ∈ K such that Ax' = y, i.e. find x' consistent with the model (K) and the measurements (y). Other natural recovery programs: (1) maximize ⟨Ax', y⟩ subject to x' ∈ K, i.e. find x' ∈ K most correlated with the measurements; (2) find x' ∈ K closest to A^T y, i.e. the metric projection of A^T y onto K.
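A minimal numerical sketch of this recovery program, under the assumption that K is a scaled ℓ_1 ball (so "find x' ∈ K with Ax' = y" reduces to basis pursuit, written here as a linear program and solved with SciPy); the sizes n, m, s are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 60, 30, 3  # ambient dimension, measurements, sparsity (illustrative)

# Ground-truth s-sparse signal and Gaussian measurements y = Ax.
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n))
y = A @ x

# Basis pursuit: minimize ||x'||_1 subject to Ax' = y.
# Split x' = u - v with u, v >= 0, so ||x'||_1 = sum(u) + sum(v).
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print(np.linalg.norm(x_hat - x))  # near zero: exact recovery up to solver tolerance
```

Here m is comfortably above w(K)^2 ≍ s log n, so the LP finds the sparse signal; shrinking m toward s makes recovery fail, matching the m ≳ w(K)^2 threshold.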

19 Remarks. 1. If the signal set K is not convex, then convexify: w(conv(K)) = w(K). 2. If the signal set K is not origin-symmetric, then symmetrize: w(K − K) ≤ 2w(K). 3. The mean width can be efficiently estimated by a randomized linear program: w(K) ≈ average of sup_{x ∈ K−K} ⟨g, x⟩ over independent draws of g ~ N(0, I_n).
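Remark 3 can be illustrated directly. A Monte Carlo sketch for K = B_1^n, where the supremum sup_{x ∈ K−K} ⟨g, x⟩ = 2‖g‖_∞ has a closed form (the sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 1000, 2000  # dimension and number of Gaussian draws (illustrative)

# w(K) = E sup_{x in K-K} <g, x> with g ~ N(0, I_n).
# For K = B_1^n the supremum is attained at a vertex: sup = 2 * ||g||_inf.
g = rng.standard_normal((trials, n))
w_est = np.mean(2 * np.abs(g).max(axis=1))

print(w_est)                       # Monte Carlo estimate of w(B_1^n)
print(2 * np.sqrt(2 * np.log(n)))  # the ~ sqrt(log n) scale it should match
```

For a general K one replaces the closed-form supremum by a linear program over K − K, one per Gaussian draw.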

20 Information theory viewpoint. The sample size m = w(K)^2 is an effective dimension of K, the amount of information in K. Conclusion: one can effectively recover any signal in K from w(K)^2 random linear measurements. Example 1. K = B_1^n = conv(±e_i). Then w(K)^2 ≍ log n. So one can effectively recover any signal in B_1^n from ~ log n random linear measurements. Remark. log n bits are required to specify even a vertex of K, so this number of measurements is optimal.

21 Information theory viewpoint. Example 2. K = {s-sparse unit vectors in R^n}. Then w(K)^2 ≍ s log n. Conclusion: one can effectively recover any s-sparse signal from ~ s log n random linear measurements. This is a well-known result in compressed sensing. (Warning: exact recovery is not explained by this geometric reasoning.) Remark. log (n choose s) ≍ s log n bits are required to specify the sparsity pattern.

22 Information theory viewpoint. Remark. The mean width is related to other information-theoretic quantities, in particular to the metric entropy log N(K, t): c sup_{t>0} t √(log N(K, t)) ≤ w(K) ≤ C ∫_0^∞ √(log N(K, t)) dt. These are Sudakov's and Dudley's inequalities.

23 Non-linear sampling. Claim. One can recover a signal x ∈ K from non-linear measurements y = Φ(x), and even without fully knowing the nature of the non-linearity Φ.

24 Non-linear sampling. Non-linear measurements are given as y = θ(Ax). Here A is an m × n random matrix (known) and θ : R → R is a link function (possibly unknown). θ is applied to each coordinate of Ax, i.e. the measurements are y_i = θ(⟨a_i, x⟩), i = 1, ..., m.

25 Non-linear sampling Examples. 1. Quantization. Single-bit compressed sensing: θ(t) = sign(t). [Plan-Vershynin 11] 2. Generalized Linear Models (GLM) in Statistics. θ(t) is arbitrary, perhaps unknown. In particular, logistic regression. [Plan-Vershynin 12]

26 Probabilistic heuristics. How is reconstruction from non-linear measurements possible? Probabilistic heuristic 1 (linear). Let a ~ N(0, I_n). Then E ⟨a, x⟩ ⟨a, y⟩ = ⟨x, y⟩ for all x, y ∈ R^n. Both sides are linear in y, so E ⟨a, x⟩ a = x for all x ∈ R^n. By the Law of Large Numbers, (1/m) Σ_{i=1}^m ⟨a_i, x⟩ a_i ≈ x if m is large.

27 Probabilistic heuristics. (1/m) Σ_{i=1}^m ⟨a_i, x⟩ a_i ≈ x if m is large. One can interpret this as: {a_i}_{i=1}^m is a frame in R^n (a Gaussian frame). One can reconstruct x from the linear measurements y = Ax, where y_i = ⟨a_i, x⟩, i = 1, ..., m. Question. How large does m need to be? Without any prior knowledge about x, m ≍ n suffices. If x ∈ K, then m ≍ w(K)^2, the effective dimension of K. Reconstruction: the metric projection of (1/m) Σ_{i=1}^m ⟨a_i, x⟩ a_i = (1/m) A^T y onto K.
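The Gaussian frame identity above is easy to check empirically; a small numpy experiment (sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 50, 200_000  # dimension and number of measurements (illustrative)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)  # a fixed unit signal

# Law of Large Numbers: (1/m) sum_i <a_i, x> a_i -> E <a, x> a = x.
A = rng.standard_normal((m, n))
x_hat = A.T @ (A @ x) / m

print(np.linalg.norm(x_hat - x))  # small; the error decays like sqrt(n/m)
```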

28 Probabilistic heuristics. Probabilistic heuristic 2 (non-linear). Let a ~ N(0, I_n). Then E θ(⟨a, x⟩) ⟨a, y⟩ = λ ⟨x, y⟩ for all unit x, y ∈ R^n, where λ = E θ(g)g, g ~ N(0, 1) (normalized so that λ = 1). The right-hand side is still linear in y, so E θ(⟨a, x⟩) a = x for all unit x ∈ R^n. By the Law of Large Numbers, (1/m) Σ_{i=1}^m θ(⟨a_i, x⟩) a_i ≈ x if m is large.

29 Probabilistic heuristics. (1/m) Σ_{i=1}^m θ(⟨a_i, x⟩) a_i ≈ x if m is large. One can interpret this as: {a_i}_{i=1}^m is a non-linear frame in R^n. One can reconstruct x from the non-linear measurements y = θ(Ax), where y_i = θ(⟨a_i, x⟩), i = 1, ..., m. For the reconstruction, one does not need to know θ (at least in principle).
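A numpy sanity check of the non-linear frame heuristic, taking θ = sign as in single-bit sensing (an assumption here; for this θ, λ = E sign(g)g = E|g| = √(2/π), so we divide by λ instead of normalizing it to 1):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 50, 200_000  # dimension and number of measurements (illustrative)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)  # a fixed unit signal

# Non-linear frame with theta = sign:
# (1/m) sum_i sign(<a_i, x>) a_i -> lambda * x, lambda = E[sign(g) g] = sqrt(2/pi).
A = rng.standard_normal((m, n))
y = np.sign(A @ x)
x_hat = A.T @ y / m

lam = np.sqrt(2 / np.pi)
print(np.linalg.norm(x_hat / lam - x))  # small: x is recovered up to scaling
```

Note that the same averaging recovers the direction of x even if we did not know λ, illustrating why the reconstruction need not know θ.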

30 Single-bit measurements. Let us work out the particular example of single-bit measurements, y_i = sign(⟨a_i, x⟩), i = 1, ..., m. Geometric interpretation: y is the vector of orientations of x with respect to m random hyperplanes (with normals a_i). In stochastic geometry: a random hyperplane tessellation.

31 Single-bit measurements. We know y = sign(Ax), i.e. y_i = sign(⟨a_i, x⟩) for all i, so we know the cell of the tessellation that contains x. If diam(cell) ≤ ε for every cell, then we can recover x with error ε. Recovery would be done by the convex program: find x' ∈ K such that sign(Ax') = y. It remains to find m so that all cells have diameter ≤ ε.

32 Question (Pizza cutting). Given a set K ⊆ R^n, how many random hyperplanes does it take to cut K into pieces of diameter ≤ ε? Non-trivial even for K = S^{n−1}.

33 Theorem (Pizza cutting) [Plan-V 12]. Consider a set K ⊆ S^{n−1} and m random hyperplanes. Then, with high probability, diam(cell) ≤ [C w(K)/√m]^{1/3} for every cell. This is similar to the M estimate for diam(K ∩ random subspace), except for the exponent 1/3. Optimal exponent =? Corollary (Single-bit recovery). One can accurately and effectively recover any signal x ∈ K from m ≍ w(K)^2 random measurements y = sign(Ax) ∈ {−1, 1}^m.
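The basic fact behind such tessellation bounds is that one random hyperplane through the origin separates two points p, q ∈ S^{n−1} at angle α with probability exactly α/π. A quick empirical check (sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 20, 200_000  # dimension and number of random hyperplanes (illustrative)

# Two unit vectors at angle alpha.
alpha = 0.5
p = np.zeros(n)
p[0] = 1.0
q = np.zeros(n)
q[0], q[1] = np.cos(alpha), np.sin(alpha)

# A hyperplane with Gaussian normal a separates p and q
# iff sign(<a, p>) != sign(<a, q>); this happens with probability alpha / pi.
A = rng.standard_normal((trials, n))
frac = np.mean(np.sign(A @ p) != np.sign(A @ q))

print(frac, alpha / np.pi)  # the empirical fraction matches alpha / pi
```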

34 General non-linear measurements. Similar recovery results hold for a general link function θ, y = θ(Ax). Recovery of x can be done by the convex program: maximize ⟨y, Ax'⟩ subject to x' ∈ K, i.e. find x' ∈ K most correlated with the measurements. Remark. The solver does not need to know the nature of the non-linearity θ; it may be unknown or unspecified. [Plan-V 12]

35 Summary: geometric view of signal recovery. Recover a signal x ∈ K from m random measurements/samples: y = Ax (linear) or y = θ(Ax) (non-linear). Convex sets look like this: K ≈ bulk + outliers. The size of the bulk determines the accuracy of recovery. Accurate and efficient recovery is possible if m ≍ w(K)^2 = the effective dimension of K. Here w(K) is the mean width of K, a computable quantity.

36 Further reading: an elementary introduction to modern convex geometry [Ball 97]; lectures in geometric functional analysis [Vershynin 11]; notes on isotropic convex bodies [Brazitikos-Giannopoulos-Valettas-Vritsiou 13+]; a textbook in geometric functional analysis [Artstein-Giannopoulos-Milman-Vritsiou 13+]. Further on non-linear recovery: single-bit matrix completion [Davenport-Plan-van den Berg-Wootters 12]. Thank you!


Hereditary discrepancy and factorization norms

Hereditary discrepancy and factorization norms Hereditary discrepancy and factorization norms organized by Aleksandar Nikolov and Kunal Talwar Workshop Summary Background and Goals This workshop was devoted to the application of methods from functional

More information

The Stability of Low-Rank Matrix Reconstruction: a Constrained Singular Value Perspective

The Stability of Low-Rank Matrix Reconstruction: a Constrained Singular Value Perspective Forty-Eighth Annual Allerton Conference Allerton House UIUC Illinois USA September 9 - October 1 010 The Stability of Low-Rank Matrix Reconstruction: a Constrained Singular Value Perspective Gongguo Tang

More information

On the monotonicity of the expected volume of a random. Luis Rademacher Computer Science and Engineering The Ohio State University

On the monotonicity of the expected volume of a random. Luis Rademacher Computer Science and Engineering The Ohio State University On the monotonicity of the expected volume of a random simplex Luis Rademacher Computer Science and Engineering The Ohio State University Sylvester s problem 4 random points: what is the probability that

More information

BEYOND HIRSCH CONJECTURE: WALKS ON RANDOM POLYTOPES AND SMOOTHED COMPLEXITY OF THE SIMPLEX METHOD

BEYOND HIRSCH CONJECTURE: WALKS ON RANDOM POLYTOPES AND SMOOTHED COMPLEXITY OF THE SIMPLEX METHOD BEYOND HIRSCH CONJECTURE: WALKS ON RANDOM POLYTOPES AND SMOOTHED COMPLEXITY OF THE SIMPLEX METHOD ROMAN VERSHYNIN Abstract. The smoothed analysis of algorithms is concerned with the expected running time

More information

1-Bit Matrix Completion

1-Bit Matrix Completion 1-Bit Matrix Completion Mark A. Davenport School of Electrical and Computer Engineering Georgia Institute of Technology Yaniv Plan Mary Wootters Ewout van den Berg Matrix Completion d When is it possible

More information

Supremum of simple stochastic processes

Supremum of simple stochastic processes Subspace embeddings Daniel Hsu COMS 4772 1 Supremum of simple stochastic processes 2 Recap: JL lemma JL lemma. For any ε (0, 1/2), point set S R d of cardinality 16 ln n S = n, and k N such that k, there

More information

RESEARCH STATEMENT GALYNA V. LIVSHYTS

RESEARCH STATEMENT GALYNA V. LIVSHYTS RESEARCH STATEMENT GALYNA V. LIVSHYTS Introduction My research is devoted to exploring the geometry of convex bodies in high dimensions, using methods and ideas from probability, harmonic analysis, differential

More information

Concentration phenomena in high dimensional geometry.

Concentration phenomena in high dimensional geometry. Concentration phenomena in high dimensional geometry. Olivier Guédon arxiv:30.204v [math.fa] 4 Oct 203 Abstract The purpose of this note is to present several aspects of concentration phenomena in high

More information

LARGE DEVIATIONS OF TYPICAL LINEAR FUNCTIONALS ON A CONVEX BODY WITH UNCONDITIONAL BASIS. S. G. Bobkov and F. L. Nazarov. September 25, 2011

LARGE DEVIATIONS OF TYPICAL LINEAR FUNCTIONALS ON A CONVEX BODY WITH UNCONDITIONAL BASIS. S. G. Bobkov and F. L. Nazarov. September 25, 2011 LARGE DEVIATIONS OF TYPICAL LINEAR FUNCTIONALS ON A CONVEX BODY WITH UNCONDITIONAL BASIS S. G. Bobkov and F. L. Nazarov September 25, 20 Abstract We study large deviations of linear functionals on an isotropic

More information

Constrained optimization

Constrained optimization Constrained optimization DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Compressed sensing Convex constrained

More information

The convex algebraic geometry of linear inverse problems

The convex algebraic geometry of linear inverse problems The convex algebraic geometry of linear inverse problems The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published Publisher

More information

Binary matrix completion

Binary matrix completion Binary matrix completion Yaniv Plan University of Michigan SAMSI, LDHD workshop, 2013 Joint work with (a) Mark Davenport (b) Ewout van den Berg (c) Mary Wootters Yaniv Plan (U. Mich.) Binary matrix completion

More information

Empirical Processes and random projections

Empirical Processes and random projections Empirical Processes and random projections B. Klartag, S. Mendelson School of Mathematics, Institute for Advanced Study, Princeton, NJ 08540, USA. Institute of Advanced Studies, The Australian National

More information

Optimal compression of approximate Euclidean distances

Optimal compression of approximate Euclidean distances Optimal compression of approximate Euclidean distances Noga Alon 1 Bo az Klartag 2 Abstract Let X be a set of n points of norm at most 1 in the Euclidean space R k, and suppose ε > 0. An ε-distance sketch

More information

An algebraic perspective on integer sparse recovery

An algebraic perspective on integer sparse recovery An algebraic perspective on integer sparse recovery Lenny Fukshansky Claremont McKenna College (joint work with Deanna Needell and Benny Sudakov) Combinatorics Seminar USC October 31, 2018 From Wikipedia:

More information

Geometric Functional Analysis College Station July Titles/Abstracts. Sourav Chatterjee. Nonlinear large deviations

Geometric Functional Analysis College Station July Titles/Abstracts. Sourav Chatterjee. Nonlinear large deviations Geometric Functional Analysis College Station 25-29 July 2016 Titles/Abstracts 1 Mini-courses Sourav Chatterjee Nonlinear large deviations I will talk about a general technique for computing large deviations

More information

In English, this means that if we travel on a straight line between any two points in C, then we never leave C.

In English, this means that if we travel on a straight line between any two points in C, then we never leave C. Convex sets In this section, we will be introduced to some of the mathematical fundamentals of convex sets. In order to motivate some of the definitions, we will look at the closest point problem from

More information

Isomorphic and almost-isometric problems in high-dimensional convex geometry

Isomorphic and almost-isometric problems in high-dimensional convex geometry Isomorphic and almost-isometric problems in high-dimensional convex geometry Bo az Klartag Abstract. The classical theorems of high-dimensional convex geometry exhibit a surprising level of regularity

More information

On a quantitative reversal of Alexandrov s inequality

On a quantitative reversal of Alexandrov s inequality On a quantitative reversal of Alexandrov s inequality Grigoris Paouris Peter Pivovarov Petros Valettas February 21, 2017 Abstract Alexandrov s inequalities imply that for any convex body A, the sequence

More information

1-Bit Matrix Completion

1-Bit Matrix Completion 1-Bit Matrix Completion Mark A. Davenport School of Electrical and Computer Engineering Georgia Institute of Technology Yaniv Plan Mary Wootters Ewout van den Berg Matrix Completion d When is it possible

More information

GREEDY SIGNAL RECOVERY REVIEW

GREEDY SIGNAL RECOVERY REVIEW GREEDY SIGNAL RECOVERY REVIEW DEANNA NEEDELL, JOEL A. TROPP, ROMAN VERSHYNIN Abstract. The two major approaches to sparse recovery are L 1-minimization and greedy methods. Recently, Needell and Vershynin

More information

ROBUST BINARY FUSED COMPRESSIVE SENSING USING ADAPTIVE OUTLIER PURSUIT. Xiangrong Zeng and Mário A. T. Figueiredo

ROBUST BINARY FUSED COMPRESSIVE SENSING USING ADAPTIVE OUTLIER PURSUIT. Xiangrong Zeng and Mário A. T. Figueiredo 2014 IEEE International Conference on Acoustic, Speech and Signal Processing (ICASSP) ROBUST BINARY FUSED COMPRESSIVE SENSING USING ADAPTIVE OUTLIER PURSUIT Xiangrong Zeng and Mário A. T. Figueiredo Instituto

More information

Structured signal recovery from non-linear and heavy-tailed measurements

Structured signal recovery from non-linear and heavy-tailed measurements Structured signal recovery from non-linear and heavy-tailed measurements Larry Goldstein* Stanislav Minsker* Xiaohan Wei # *Department of Mathematics # Department of Electrical Engineering UniversityofSouthern

More information

Small ball probability and Dvoretzky Theorem

Small ball probability and Dvoretzky Theorem Small ball probability and Dvoretzy Theorem B. Klartag, R. Vershynin Abstract Large deviation estimates are by now a standard tool in Asymptotic Convex Geometry, contrary to small deviation results. In

More information

CS286.2 Lecture 13: Quantum de Finetti Theorems

CS286.2 Lecture 13: Quantum de Finetti Theorems CS86. Lecture 13: Quantum de Finetti Theorems Scribe: Thom Bohdanowicz Before stating a quantum de Finetti theorem for density operators, we should define permutation invariance for quantum states. Let

More information

arxiv: v2 [math.pr] 9 Apr 2018

arxiv: v2 [math.pr] 9 Apr 2018 Gaussian polytopes: a cumulant-based approach Julian Grote and Christoph Thäle arxiv:16.6148v math.pr] 9 Apr 18 Abstract The random convex hull of a Poisson point process in R d whose intensity measure

More information

Another Low-Technology Estimate in Convex Geometry

Another Low-Technology Estimate in Convex Geometry Convex Geometric Analysis MSRI Publications Volume 34, 1998 Another Low-Technology Estimate in Convex Geometry GREG KUPERBERG Abstract. We give a short argument that for some C > 0, every n- dimensional

More information

THE SMALLEST SINGULAR VALUE OF A RANDOM RECTANGULAR MATRIX

THE SMALLEST SINGULAR VALUE OF A RANDOM RECTANGULAR MATRIX THE SMALLEST SINGULAR VALUE OF A RANDOM RECTANGULAR MATRIX MARK RUDELSON AND ROMAN VERSHYNIN Abstract. We prove an optimal estimate of the smallest singular value of a random subgaussian matrix, valid

More information

Subspaces and orthogonal decompositions generated by bounded orthogonal systems

Subspaces and orthogonal decompositions generated by bounded orthogonal systems Subspaces and orthogonal decompositions generated by bounded orthogonal systems Olivier GUÉDON Shahar MENDELSON Alain PAJOR Nicole TOMCZAK-JAEGERMANN August 3, 006 Abstract We investigate properties of

More information

COMMUNITY DETECTION IN SPARSE NETWORKS VIA GROTHENDIECK S INEQUALITY

COMMUNITY DETECTION IN SPARSE NETWORKS VIA GROTHENDIECK S INEQUALITY COMMUNITY DETECTION IN SPARSE NETWORKS VIA GROTHENDIECK S INEQUALITY OLIVIER GUÉDON AND ROMAN VERSHYNIN Abstract. We present a simple and flexible method to prove consistency of semidefinite optimization

More information

arxiv: v2 [math.pr] 15 Dec 2010

arxiv: v2 [math.pr] 15 Dec 2010 HOW CLOSE IS THE SAMPLE COVARIANCE MATRIX TO THE ACTUAL COVARIANCE MATRIX? arxiv:1004.3484v2 [math.pr] 15 Dec 2010 ROMAN VERSHYNIN Abstract. GivenaprobabilitydistributioninR n withgeneral(non-white) covariance,

More information