15.1 Upper bound via Sudakov minorization


ECE598: Information-theoretic methods in high-dimensional statistics        Spring 2016

Lecture 15: Sudakov, Maurey, and duality of metric entropy
Lecturer: Yihong Wu        Scribe: Aolin Xu, Mar 17, 2016 [Ed. Mar 24]

In this lecture we study the upper and lower bounds on $N(B_1, \|\cdot\|_2, \epsilon)$, where $B_p$ denotes the unit $\ell_p$-ball in $\mathbb{R}^d$. From the last lecture, we know that for any $\Theta \subset \mathbb{R}^d$ and any $\epsilon > 0$,
$$M(\Theta, \|\cdot\|, 2\epsilon) \le N(\Theta, \|\cdot\|, \epsilon) \le M(\Theta, \|\cdot\|, \epsilon) \le \frac{\mathrm{vol}(\Theta + \frac{\epsilon}{2}B)}{\mathrm{vol}(\frac{\epsilon}{2}B)}, \qquad N(\Theta, \|\cdot\|, \epsilon) \ge \frac{\mathrm{vol}(\Theta)}{\mathrm{vol}(\epsilon B)},$$
where $B$ is the ball of radius $1$ measured by $\|\cdot\|$. Therefore,
$$M(B_1, \|\cdot\|_2, \epsilon) \le \frac{\mathrm{vol}(B_1 + \frac{\epsilon}{2}B_2)}{\mathrm{vol}(\frac{\epsilon}{2}B_2)} \le \frac{\mathrm{vol}\big((1 + \frac{\epsilon\sqrt{d}}{2})B_1\big)}{\mathrm{vol}(\frac{\epsilon}{2}B_2)} = \Big(1 + \frac{\epsilon\sqrt{d}}{2}\Big)^d \Big(\frac{2}{\epsilon}\Big)^d \frac{\mathrm{vol}(B_1)}{\mathrm{vol}(B_2)} \le \Big(\frac{c_1}{\epsilon\sqrt{d}} + c_2\Big)^d,$$
where we have used the fact that $B_2 \subset \sqrt{d}\,B_1$ by the Cauchy-Schwarz inequality, $\mathrm{vol}(B_1)^{1/d} \asymp 1/d$, and $\mathrm{vol}(B_2)^{1/d} \asymp 1/\sqrt{d}$. On the other hand,
$$M(B_1, \|\cdot\|_2, \epsilon) \ge \frac{\mathrm{vol}(B_1)}{\mathrm{vol}(\epsilon B_2)} = \Big(\frac{1}{\epsilon}\Big)^d \frac{\mathrm{vol}(B_1)}{\mathrm{vol}(B_2)} = \Big(\frac{c}{\epsilon\sqrt{d}}\Big)^d.$$
Note that the lower bound derived above is useful only when $\epsilon \lesssim 1/\sqrt{d}$.

15.1 Upper bound via Sudakov minorization

Recall that the Gaussian width of $\Theta \subset \mathbb{R}^d$ is defined as
$$w(\Theta) = \mathbb{E} \sup_{\theta \in \Theta} \langle \theta, Z \rangle, \qquad \text{where } Z \sim N(0, I_d).^1$$

Theorem 15.1 (Sudakov minorization). For any $\Theta \subset \mathbb{R}^d$ and any $\epsilon > 0$,
$$w(\Theta) \gtrsim \epsilon \sqrt{\log M(\Theta, \|\cdot\|_2, \epsilon)}.$$

The proof of Theorem 15.1 relies on Slepian's Gaussian comparison lemma:

Lemma 15.1 (Slepian's lemma). Let $X = (X_1, \ldots, X_n)$ and $Y = (Y_1, \ldots, Y_n)$ be zero-mean Gaussian random vectors. If $\mathbb{E}(Y_i - Y_j)^2 \le \mathbb{E}(X_i - X_j)^2$ for all $i, j$, then $\mathbb{E} \max_i Y_i \le \mathbb{E} \max_i X_i$.

For a self-contained proof see [Cha05].^2 See also [Pis99, Lemma 5.7, p. 70] for a simpler proof of a weaker version, $\mathbb{E} \max_i X_i \ge \frac{1}{2} \mathbb{E} \max_i Y_i$, which suffices for our purposes though.

^1 To avoid measurability difficulties, $w(\Theta)$ should be understood as $\sup_{T \subset \Theta,\, |T| < \infty} \mathbb{E} \max_{\theta \in T} \langle \theta, Z \rangle$.
^2 If you took ECE 534 last fall, you should revisit Problem 5 of fall15a/homework/hw4.pdf, which follows [Cha05].
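The Gaussian width can be estimated by Monte Carlo whenever the supremum over $\Theta$ is computable; for $\Theta = B_1$ the supremum equals $\|Z\|_\infty$ by duality of $\ell_1$ and $\ell_\infty$ (this identity is used again right after the proof of Theorem 15.1). A minimal sketch, assuming numpy; the helper name `gaussian_width_B1` is just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_width_B1(d, trials=1000):
    # w(B_1) = E sup_{||theta||_1 <= 1} <theta, Z> = E ||Z||_inf  (dual-norm identity)
    Z = rng.standard_normal((trials, d))
    return np.abs(Z).max(axis=1).mean()

for d in [10, 100, 1000, 10000]:
    print(f"d = {d:6d}   w(B_1) ~ {gaussian_width_B1(d):.3f}   sqrt(2 log d) = {np.sqrt(2 * np.log(d)):.3f}")
```

The estimate tracks $\sqrt{2\log d}$ up to lower-order terms, which is the $w(B_1) \lesssim \sqrt{2\log d}$ fact used below.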

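The comparison at the heart of the proof below can also be simulated for a concrete (non-optimal) packing. The sketch takes $\Theta = B_1$ with the $1$-packing $\{e_1, \ldots, e_d\}$, so $\epsilon = 1$, $M = d$, $X_i = \langle e_i, Z\rangle = Z_i$, and $Y_i$ i.i.d. $N(0, 1/2)$; it checks that $\mathbb{E}\max_i X_i \ge \mathbb{E}\max_i Y_i$ and that the latter is of the order $\epsilon\sqrt{\log M}$ (a sketch, numpy assumed):

```python
import numpy as np

# Toy instance of the proof's comparison: Theta = B_1 with the (non-optimal)
# 1-packing {e_1, ..., e_d}; then X_i = <e_i, Z> = Z_i and Y_i ~ N(0, eps^2/2) with eps = 1.
rng = np.random.default_rng(1)
d, trials = 2000, 2000

X = rng.standard_normal((trials, d))                  # rows: independent draws of (X_1, ..., X_d)
Y = rng.normal(scale=np.sqrt(0.5), size=(trials, d))  # rows: independent draws of (Y_1, ..., Y_d)

print("E max X_i         ~", round(X.max(axis=1).mean(), 3))
print("E max Y_i         ~", round(Y.max(axis=1).mean(), 3))
print("eps * sqrt(log M) =", round(np.sqrt(np.log(d)), 3))
```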
Proof of Theorem 15.1 assuming Slepian. Let $\{\theta_1, \ldots, \theta_M\}$ be an optimal $\epsilon$-packing of $\Theta$, so that $M = M(\Theta, \|\cdot\|_2, \epsilon)$. Let $X_i = \langle \theta_i, Z \rangle$ for $i \in [M]$, where $Z \sim N(0, I_d)$, and let $Y_i \overset{\text{i.i.d.}}{\sim} N(0, \epsilon^2/2)$. Then for any pair $(i, j)$, $X_i$ and $X_j$ are jointly Gaussian, and
$$\mathbb{E}(X_i - X_j)^2 = (\theta_i - \theta_j)^\top \mathbb{E}[ZZ^\top](\theta_i - \theta_j) = \|\theta_i - \theta_j\|_2^2 \ge \epsilon^2 = \mathbb{E}(Y_i - Y_j)^2.$$
It follows from Lemma 15.1 that
$$\mathbb{E} \max_{1 \le i \le M} X_i \ge \mathbb{E} \max_{1 \le i \le M} Y_i \gtrsim \epsilon \sqrt{\log M}.$$
This completes the proof because $\mathbb{E} \sup_{\theta \in \Theta} \langle \theta, Z \rangle \ge \mathbb{E} \max_{1 \le i \le M} X_i$.

We can apply this theorem to $\Theta = B_1$. In this case, by the definition of the dual norm,
$$w(B_1) = \mathbb{E} \sup_{x \in \mathbb{R}^d:\ \|x\|_1 \le 1} \langle x, Z \rangle = \mathbb{E}\|Z\|_\infty \lesssim \sqrt{2 \log d}.$$
The theorem then implies that
$$\log M(B_1, \|\cdot\|_2, \epsilon) \lesssim \frac{\log d}{\epsilon^2}. \qquad (15.1)$$
This bound is almost optimal: when $\epsilon \gg 1/\sqrt{d}$, this upper bound is in fact optimal, and much better than what we get from the volume argument, which is $\log M(B_1, \|\cdot\|_2, \epsilon) \le d \log\big(1 + \frac{c}{\epsilon\sqrt{d}}\big)$. However, (15.1) is not always sharp. For example, when $\epsilon \asymp 1/\sqrt{d}$, it gives $d \log d$, and we know (even from the volume bound) that the correct behavior is $\asymp d$. This suggests we need a more refined bound that interpolates between volume and Sudakov.

15.2 Upper bound via Maurey's empirical method

We can construct a covering of $B_1$ using the probabilistic method. Let $\{e_i, i = 1, \ldots, d\}$ be the standard basis of $\mathbb{R}^d$. For an arbitrary $x \in B_1$, define a $d$-dimensional random vector $Z$ by
$$Z = \begin{cases} \mathrm{sgn}(x_i)\, e_i & \text{w.p. } |x_i|, \quad i = 1, \ldots, d, \\ 0 & \text{w.p. } 1 - \|x\|_1. \end{cases}$$
$Z$ has the property that $\mathbb{E} Z_i = x_i$ for $i = 1, \ldots, d$, hence $\mathbb{E} Z = x$, and $\mathrm{Var}[Z_i] = \mathbb{E}(Z_i - x_i)^2$ for $i = 1, \ldots, d$. Let $Z^{(1)}, \ldots, Z^{(k)}$ be i.i.d. copies of $Z$, and let $\bar{Z} = \frac{1}{k}\sum_{j=1}^{k} Z^{(j)}$. Then
$$\mathbb{E}\|\bar{Z} - x\|_2^2 = \sum_{i=1}^{d} \mathbb{E}(\bar{Z}_i - x_i)^2 = \sum_{i=1}^{d} \mathrm{Var}[\bar{Z}_i] = \frac{1}{k} \sum_{i=1}^{d} \mathrm{Var}[Z_i] = \frac{1}{k}\, \mathbb{E}\|Z - x\|_2^2 \le \frac{1}{k}\, \mathbb{E}\|Z - x\|_1 \le \frac{2}{k},$$
where we have used the facts that $\mathrm{Var}[\bar{Z}_i] = \frac{1}{k}\mathrm{Var}[Z_i]$ and $\|Z - x\|_2^2 \le \|Z - x\|_1 \le \|Z\|_1 + \|x\|_1 \le 2$ (the first inequality holds because every coordinate satisfies $|Z_i - x_i| \le 1$). If we choose $k = \lceil 2/\epsilon^2 \rceil$, then $\mathbb{E}\|\bar{Z} - x\|_2 \le \sqrt{\mathbb{E}\|\bar{Z} - x\|_2^2} \le \epsilon$. So there is a realization $z$ of $\bar{Z}$ such that $\|z - x\|_2 \le \epsilon$.
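The construction above is easy to run: draw $k = \lceil 2/\epsilon^2 \rceil$ i.i.d. copies of $Z$ and average. A minimal sketch, assuming numpy; the helper `maurey_average` and the particular test point $x$ are just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def maurey_average(x, k):
    """Average of k i.i.d. copies of Z, where Z = sgn(x_i) e_i w.p. |x_i| and Z = 0 otherwise."""
    d = len(x)
    probs = np.append(np.abs(x), 1.0 - np.abs(x).sum())  # last slot encodes the outcome Z = 0
    idx = rng.choice(d + 1, size=k, p=probs)
    zbar = np.zeros(d)
    drawn = idx[idx < d]
    np.add.at(zbar, drawn, np.sign(x[drawn]))            # accumulate the signed basis vectors
    return zbar / k

d, eps = 1000, 0.2
x = rng.standard_normal(d)
x *= 0.9 / np.abs(x).sum()                               # an arbitrary test point with ||x||_1 = 0.9
k = int(np.ceil(2 / eps**2))
errors = [np.linalg.norm(maurey_average(x, k) - x) for _ in range(200)]
print(f"k = {k}, average l2 error = {np.mean(errors):.3f}, target eps = {eps}")
```

On average the error is below $\epsilon$, so some realization of $\bar{Z}$ lands within $\epsilon$ of $x$, which is all the covering argument needs; the point of the next step is that $\bar{Z}$ can only take a limited number of values.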

Now we examine how many different values $\bar{Z}$ can take, regardless of $x$. Note that
$$\bar{Z} = \frac{1}{k} \sum_{j=1}^{k} Z^{(j)} = \frac{1}{k}(K_1, \ldots, K_d),$$
where
$$\sum_{i=1}^{d} |K_i| \le k, \quad \text{with } K_i \in \mathbb{Z} \text{ and } 0 \le |K_i| \le k \text{ for } i = 1, \ldots, d. \qquad (15.2)$$
For any $(K_1, \ldots, K_d)$ satisfying inequality (15.2), we get a solution of the inequality
$$\sum_{i=1}^{d} \big(K_i^+ + K_i^-\big) \le k, \quad \text{with } K_i^\pm \in \mathbb{Z} \text{ and } 0 \le K_i^\pm \le k \text{ for } i = 1, \ldots, d, \qquad (15.3)$$
by setting $K_i^+ = K_i$ and $K_i^- = 0$ if $K_i \ge 0$, and setting $K_i^+ = 0$ and $K_i^- = -K_i$ if $K_i < 0$. Therefore, the number of values $\bar{Z}$ can take is upper bounded by the number of solutions of inequality (15.3). Note that there are $\binom{k + 2d - 1}{2d - 1}$ solutions of
$$\sum_{i=1}^{d} \big(K_i^+ + K_i^-\big) = k, \quad \text{with } K_i^\pm \in \mathbb{Z} \text{ and } 0 \le K_i^\pm \le k \text{ for } i = 1, \ldots, d,$$
because the solutions are exactly the possible types of sequences of length $k$ with alphabet size $2d$. It follows that the number of solutions of inequality (15.3) is
$$\sum_{j=0}^{k} \binom{j + 2d - 1}{2d - 1} = \binom{k + 2d}{2d} = \binom{k + 2d}{k},$$
which is an upper bound on the number of values of $\bar{Z}$, regardless of $x$. We have thus shown the existence of an $\epsilon$-covering of $B_1$ in $\|\cdot\|_2$ with cardinality upper bounded by $\binom{2d + \lceil 2/\epsilon^2 \rceil}{\lceil 2/\epsilon^2 \rceil}$. Therefore,
$$\log N(B_1, \|\cdot\|_2, \epsilon) \lesssim d \log\Big(2 + \frac{1}{d\epsilon^2}\Big) \wedge \frac{1}{\epsilon^2} \log\big(2 + d\epsilon^2\big).$$
We can see that the first upper bound recovers the result from the volume argument, while the second upper bound is even stronger than the result obtained from Sudakov's minorization.
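How the three upper bounds (volume, Sudakov, Maurey) compare can also be seen numerically. The sketch below (numpy assumed; all absolute constants are set to $1$, so only orders of magnitude are meaningful) tabulates the three right-hand sides for a fixed $d$ and a few values of $\epsilon$:

```python
import numpy as np

d = 10_000   # note 1/sqrt(d) = 0.01
print(f"{'eps':>6} {'volume':>12} {'Sudakov':>12} {'Maurey':>12}")
for eps in [0.01, 0.03, 0.1, 0.3]:
    volume  = d * np.log(1 + 1 / (eps * np.sqrt(d)))        # d log(1 + c/(eps sqrt(d))), c = 1
    sudakov = np.log(d) / eps**2                            # (log d)/eps^2, cf. (15.1)
    maurey  = min(d * np.log(2 + 1 / (d * eps**2)),         # first Maurey bound
                  np.log(2 + d * eps**2) / eps**2)          # second Maurey bound
    print(f"{eps:6.2f} {volume:12.0f} {sudakov:12.0f} {maurey:12.0f}")
```

Near $\epsilon \approx 1/\sqrt{d}$ the volume bound is the smallest, while for larger $\epsilon$ the second Maurey bound takes over and improves on Sudakov; this is exactly the interpolation asked for at the end of Section 15.1.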

15.3 Lower bound via packing Hamming spheres

Let $S_k = \{x \in \{0,1\}^d : w_H(x) = k\}$ be the Hamming sphere of radius $k$. For the $2\rho$-packing of $S_k$ in Hamming distance $d_H$, we have
$$\log M(S_k, d_H, 2\rho) \ge \log \frac{|S_k|}{|B_H(2\rho)|}, \qquad \text{where } |S_k| = \binom{d}{k} \text{ and } |B_H(\rho)| = \sum_{i=0}^{\rho} \binom{d}{i}.$$
This leads to the following lemma.

Lemma 15.2 (Gilbert-Varshamov). There exist constants $c_1$ and $c_2$ such that for all $d \in \mathbb{N}$ and any $k \le d/2$,
$$\log M(S_k, d_H, c_1 k) \ge c_2\, k \log \frac{ed}{k}.$$

Now we construct a packing of $B_1$ based on a packing of $S_k$. Let $\{x_1, \ldots, x_M\}$ be a $c_1 k$-packing of $S_k$, and let $\theta_i = x_i / k$. Then $\theta_i \in B_1$ for $i = 1, \ldots, M$, and
$$\|\theta_i - \theta_j\|_2^2 = \frac{1}{k^2}\, d_H(x_i, x_j) \ge \frac{c_1}{k}.$$
Therefore, $\{\theta_1, \ldots, \theta_M\}$ is a $\sqrt{c_1/k}$-packing of $B_1$ in $\|\cdot\|_2$. Choosing $k = \lceil 1/\epsilon^2 \rceil$ (with $\epsilon \gtrsim 1/\sqrt{d}$ so that $k \le d/2$), it follows from Lemma 15.2 that
$$\log M(B_1, \|\cdot\|_2, c\,\epsilon) \ge \frac{c_2}{\epsilon^2} \log\big(e d \epsilon^2\big)$$
for some constants $c$ and $c_2$. To summarize, combining the upper and lower bounds, we have
$$\log N(B_1, \|\cdot\|_2, \epsilon) \asymp \begin{cases} \dfrac{1}{\epsilon^2}\log\big(d\epsilon^2\big), & 1/\sqrt{d} \lesssim \epsilon \lesssim 1, \\[6pt] d \log \dfrac{1}{\epsilon\sqrt{d}}, & \epsilon \lesssim 1/\sqrt{d}. \end{cases} \qquad (15.4)$$

15.4 Duality

First we define a more general notion of covering number. For $K, T \subset \mathbb{R}^d$, define the covering number of $K$ using translates of $T$ as
$$N(K, T) = \min\Big\{ N : \exists\, x_1, \ldots, x_N \in \mathbb{R}^d \text{ such that } K \subseteq \bigcup_{i=1}^{N} (T + x_i) \Big\}.$$
An amazing theorem of Artstein-Milman-Szarek [AMS04] establishes the following duality result for metric entropy: there exist absolute constants $\alpha$ and $\beta$ such that for any symmetric convex body $K$,
$$\frac{1}{\beta} \log N\Big(B_2, \frac{\epsilon}{\alpha} K^\circ\Big) \le \log N(K, \epsilon B_2) \le \beta \log N\big(B_2, \alpha\epsilon K^\circ\big),$$
where $B_2$ is the usual unit $\ell_2$-ball and
$$K^\circ = \Big\{ y : \sup_{x \in K} \langle x, y \rangle \le 1 \Big\}$$
is the polar body of $K$. For example, $B_p^\circ = B_q$ whenever $\frac{1}{p} + \frac{1}{q} = 1$. Therefore, by duality, (15.4) also applies to $\log N(B_2, \|\cdot\|_\infty, \epsilon)$, which is what is needed for the application to minimax risk.
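The identity $B_p^\circ = B_q$ is Hölder's inequality together with its equality case: $\sup_{\|x\|_p \le 1} \langle x, y \rangle = \|y\|_q$, attained at $x_i = \mathrm{sgn}(y_i)\,|y_i|^{q-1}/\|y\|_q^{q-1}$. A quick numerical check of this (a sketch, assuming numpy; the exponent $p = 1.5$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
p = 1.5
q = p / (p - 1)                      # conjugate exponent: 1/p + 1/q = 1

y = rng.standard_normal(20)
# explicit maximizer of <x, y> over the unit l_p ball (equality case of Holder)
x_star = np.sign(y) * np.abs(y) ** (q - 1) / np.linalg.norm(y, q) ** (q - 1)

print("||x*||_p =", round(float(np.linalg.norm(x_star, p)), 6))   # ~ 1, so x* lies on the l_p sphere
print("<x*, y>  =", round(float(x_star @ y), 6))                  # ~ ||y||_q
print("||y||_q  =", round(float(np.linalg.norm(y, q)), 6))
```

In the limiting case $p = 1$, $q = \infty$, the supremum is $\|y\|_\infty$, which is the dual-norm identity behind $w(B_1) = \mathbb{E}\|Z\|_\infty$ in Section 15.1.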

15.5 Example

Finally, we use the results in this lecture to derive the minimax lower bound for the $p$-dimensional, $n$-sample Gaussian location model with respect to the distortion function $\|\theta - \hat{\theta}\|_\infty^2$.

We can construct an $\epsilon$-packing of $B_2(\delta)$ in $\|\cdot\|_\infty$. From Fano's method,
$$R \gtrsim \epsilon^2 \left(1 - \frac{\mathrm{diam}_{\mathrm{KL}}\big(\{N(\theta, \tfrac{1}{n} I_p) : \theta \in B_2(\delta)\}\big)}{\log M(B_2(\delta), \|\cdot\|_\infty, \epsilon)}\right) = \epsilon^2 \left(1 - \frac{2 n \delta^2}{\log M(B_2, \|\cdot\|_\infty, \epsilon/\delta)}\right) \gtrsim \epsilon^2 \left(1 - \frac{2 n \delta^2}{\frac{\delta^2}{\epsilon^2} \log\big(1 + \frac{p\epsilon^2}{\delta^2}\big)}\right),$$
where we have used the facts that $\mathrm{diam}_{\mathrm{KL}}\big(\{N(\theta, \tfrac{1}{n} I_p) : \theta \in B_2(\delta)\}\big) = \tfrac{n}{2}\,\mathrm{diam}_2^2(B_2(\delta)) = 2 n \delta^2$, that $M(B_2(\delta), \|\cdot\|_\infty, \epsilon) = M(B_2, \|\cdot\|_\infty, \epsilon/\delta)$ by scaling, the duality theorem, and the lower bound on $\log M(B_1, \|\cdot\|_2, \epsilon/\delta)$ from (15.4). Choosing $\epsilon = c_1/\sqrt{n}$ and $\delta = c_2 \epsilon$ with appropriate constants $c_1$ and $c_2$ such that the parenthesis in the lower bound is a positive constant, we obtain
$$R \gtrsim \frac{1}{n}.$$

An alternative proof of this result is by choosing the packing set as $\tau\{e_1, \ldots, e_p\}$ for some $\tau > 0$ to be determined later. This set is a $\tau$-packing of $\mathbb{R}^p$ in $\|\cdot\|_\infty$, because $\|\tau(e_i - e_j)\|_\infty = \tau$ for all pairs $\{i, j\}$. We also have $\|\tau(e_i - e_j)\|_2^2 = 2\tau^2$. Then by Fano's method,
$$R \gtrsim \tau^2 \left(1 - \frac{2 n \tau^2}{\log p}\right).$$
Choosing $\tau = c/\sqrt{n}$ with an appropriate constant $c$ such that the parenthesis in the above bound is a positive constant, we obtain
$$R \gtrsim \frac{1}{n}.$$

References

[AMS04] Shiri Artstein, Vitali Milman, and Stanisław J. Szarek. Duality of metric entropy. Annals of Mathematics, 2004.

[Cha05] Sourav Chatterjee. An error bound in the Sudakov-Fernique inequality. arXiv preprint math/0510424, 2005.

[Pis99] G. Pisier. The Volume of Convex Bodies and Banach Space Geometry. Cambridge University Press, 1999.
