Lecture 21 Nov 18, 2015

CS 388R: Randomized Algorithms                                        Fall 2015
Prof. Eric Price                                    Lecture 21 — Nov 18, 2015
Scribe: Chad Voegele, Arun Sai

1 Overview

In the last class, we defined the terms cut sparsifier and spectral sparsifier and introduced the Roman Vershynin (RV) Lemma, which will help us analyze Laplacians of random graphs. In this lecture, we show how to produce spectral sparsifiers with $O(n \log n / \epsilon^2)$ edges, where $n$ is the number of nodes in the graph and $\epsilon$ is a measure of the quality of the sparsifier.

2 Background

Definition 1. For $A$ and $B$ symmetric matrices, $A \preceq B$ if for all $x$, $x^\top A x \le x^\top B x$.

Definition 2 (Graph Laplacian). The Laplacian matrix of a weighted graph $G = (V, E, w)$, where $w_{(u,v)}$ is the weight of edge $(u, v)$, is defined by

$$L_G(u, v) = \begin{cases} -w_{(u,v)} & \text{if } u \neq v \\ \sum_z w_{(u,z)} & \text{if } u = v \end{cases}$$

The Laplacian can be expressed in terms of differences of standard basis vectors:

$$L_G = \sum_{e=(u,v)} w_e\, (e_u - e_v)(e_u - e_v)^\top = \sum_{e=(u,v)} w_e\, u_e u_e^\top$$

where $e_i$ is the standard basis vector such that $(e_i)_j = \delta_{ij}$, and for edge $e = (u, v)$, $u_e = e_u - e_v$.

Definition 3 (Spectral Sparsifier). A graph $H = (V, E', w')$ is an $\epsilon$-spectral approximation of a graph $G = (V, E, w)$ if

$$(1 - \epsilon)\, L_G \preceq L_H \preceq (1 + \epsilon)\, L_G$$

where $L_G, L_H$ are the Laplacians of the graphs $G, H$ respectively.

Note that $x^\top L_G x = \sum_{e=(u,v)} w_e (x_u - x_v)^2$ is shift invariant. So in the analysis below, we restrict ourselves to $x$ such that $\mathbb{1}^\top x = 0$.

Lemma 4 (Roman Vershynin Lemma). Let $\{X_i\}_{i=1}^m$ be i.i.d. random vectors in $\mathbb{R}^n$ such that each $X_i$ is uniformly bounded, $\|X_i\|^2 \le \kappa$, and $E[X_i X_i^\top] \preceq I$ for all $i \in [m]$.
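To make Definition 2 concrete, here is a small numpy sketch (ours, not part of the original notes) that builds $L_G = \sum_e w_e u_e u_e^\top$ for a toy weighted graph and checks it against the entrywise definition; the function names are illustrative.

```python
import numpy as np

def edge_vector(n, u, v):
    """u_e = e_u - e_v for the edge e = (u, v)."""
    x = np.zeros(n)
    x[u], x[v] = 1.0, -1.0
    return x

def laplacian(n, edges, weights):
    """L_G = sum_e w_e u_e u_e^T."""
    L = np.zeros((n, n))
    for (u, v), w in zip(edges, weights):
        ue = edge_vector(n, u, v)
        L += w * np.outer(ue, ue)
    return L

# Toy example: a weighted triangle on 3 nodes.
edges, weights = [(0, 1), (1, 2), (0, 2)], [1.0, 2.0, 3.0]
L = laplacian(3, edges, weights)

# Entrywise definition: off-diagonal entry -w_{(u,v)}, diagonal entry = weighted degree.
assert np.isclose(L[0, 1], -1.0) and np.isclose(L[0, 0], 1.0 + 3.0)
```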

Then

$$E\left[\,\left\| \frac{1}{m} \sum_{i=1}^m X_i X_i^\top - E[XX^\top] \right\|\,\right] \le O\!\left(\sqrt{\frac{\kappa \log n}{m}}\right)$$

Last class, we proposed the following randomized algorithm for computing a spectral sparsifier.

Algorithm 1: Generate a spectral sparsifier
Input: $G = (V, E, w)$.  Output: $H = (V, E', w')$, a spectral sparsifier of $G$.
1: for $m$ times do
2:     choose an edge $e \in E$ with some probability $p_e$
3:     add edge $e$ to $E'$ with $w'(e) = \frac{w_e}{m\, p_e}$

In expectation, the Laplacian of the graph $H$ output by the above algorithm is equal to the Laplacian of $G$. Let $Y_e = \sqrt{w_e}\, u_e$ and let $\{Z_i\}$ be independent random variables where $Z_i = \frac{Y_e}{\sqrt{p_e}}$ with probability $p_e$. Note that $L_H = \frac{1}{m} \sum_{i=1}^m Z_i Z_i^\top$. Then

$$E[L_H] = E\left[\frac{1}{m} \sum_{i=1}^m Z_i Z_i^\top\right] = E[Z Z^\top] = \sum_{e \in E} Y_e Y_e^\top = \sum_{e \in E} w_e\, u_e u_e^\top = L_G$$

In the next section we discuss how to choose $p$, a probability distribution over the edges of $G$, that gives us a good spectral sparsifier.

3 Spectral Sparsifiers

We start with the simple case of complete graphs, which have a spherical Laplacian, and move to non-complete graphs in Section 3.2. To keep the analysis simple, we only consider unweighted graphs.

3.1 Complete Graphs

When $G$ is a complete graph, the Laplacian $L_G$ is given by

$$L_G = \begin{pmatrix} n-1 & -1 & \cdots & -1 \\ -1 & n-1 & \cdots & -1 \\ \vdots & & \ddots & \vdots \\ -1 & -1 & \cdots & n-1 \end{pmatrix} = nI - \mathbb{1}\mathbb{1}^\top$$
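A minimal sketch (ours) of Algorithm 1 for a generic edge distribution $p$, which the lecture has not yet fixed; each sampled copy of edge $e$ receives weight $w_e/(m p_e)$, so that $E[L_H] = L_G$.

```python
import numpy as np

def sparsify(edges, weights, p, m, seed=0):
    """Sample m edges i.i.d. from the distribution p; each sampled copy of edge e
    contributes weight w_e / (m * p_e), so the sparsifier H satisfies E[L_H] = L_G."""
    rng = np.random.default_rng(seed)
    w_H = {}
    for i in rng.choice(len(edges), size=m, p=p):
        e = edges[i]
        w_H[e] = w_H.get(e, 0.0) + weights[i] / (m * p[i])
    return w_H  # sparse map edge -> new weight, defining H

# Illustrative usage with the uniform distribution over the triangle's edges.
edges, weights = [(0, 1), (1, 2), (0, 2)], [1.0, 2.0, 3.0]
p = np.full(len(edges), 1.0 / len(edges))
H = sparsify(edges, weights, p, m=100)
```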

where $\mathbb{1}$ is the vector of all 1s. From Definition 3, for $H$ to be a spectral sparsifier, we need that

$$(1-\epsilon)\, L_G \preceq L_H \preceq (1+\epsilon)\, L_G$$
$$(1-\epsilon)\, x^\top L_G x \le x^\top L_H x \le (1+\epsilon)\, x^\top L_G x \quad \forall x \text{ s.t. } \mathbb{1}^\top x = 0$$
$$|x^\top (L_H - L_G)\, x| \le \epsilon\, x^\top L_G x \quad \forall x \text{ s.t. } \mathbb{1}^\top x = 0$$
$$\|L_H - L_G\| \le \epsilon n$$

where the last step follows from the assumption that $\mathbb{1}^\top x = 0$ and $x^\top L_G x = x^\top (nI - \mathbb{1}\mathbb{1}^\top)\, x = n \|x\|^2$.

We now show that when $m = \Omega\!\left(\frac{n \log n}{\epsilon^2}\right)$ and $p_e$ is uniform over the edges, Algorithm 1 outputs an $\epsilon$-spectral approximation of $G$. We have

$$p_e = \frac{1}{\binom{n}{2}} = \Theta\!\left(\frac{1}{n^2}\right)$$

It is easy to see that the random variables $\{Z_i\}$ are uniformly bounded:

$$\|Z_i\| = \frac{\|Y_e\|}{\sqrt{p_e}} \ \text{(for some edge } e\text{)} = \sqrt{\frac{w_e}{p_e}}\, \|u_e\| = \Theta(n)$$

Also, from before we have that $E[Z_i Z_i^\top] = L_G \preceq nI$. Applying the RV Lemma to the random variables $X_i = Z_i / \sqrt{n}$, we get

$$E\left[\left\| \frac{1}{n} (L_H - L_G) \right\|\right] \le O\!\left(\sqrt{\frac{n \log n}{m}}\right)$$

Thus

$$E\left[\| L_H - L_G \|\right] \le O\!\left(n \sqrt{\frac{n \log n}{m}}\right) \le \epsilon n \quad \text{if } m = \Omega\!\left(\frac{n \log n}{\epsilon^2}\right)$$

So for $m = \Omega\!\left(\frac{n \log n}{\epsilon^2}\right)$, we get an $\epsilon$-approximate spectral sparsifier of $G$.
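As an illustrative numerical check (ours, not in the notes), one can sample $m \approx n \log n / \epsilon^2$ edges of the complete graph uniformly and confirm that $\|L_H - L_G\|$ comes out on the order of $\epsilon n$; the parameters below are arbitrary.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, eps = 100, 0.5
edges = list(combinations(range(n), 2))
m = int(n * np.log(n) / eps**2)
p_e = 1.0 / len(edges)                      # uniform distribution over edges

LG = n * np.eye(n) - np.ones((n, n))        # complete graph: L_G = nI - 11^T
LH = np.zeros((n, n))
for i in rng.choice(len(edges), size=m):    # uniform sampling, reweighted by 1/(m p_e)
    u, v = edges[i]
    ue = np.zeros(n)
    ue[u], ue[v] = 1.0, -1.0
    LH += np.outer(ue, ue) / (m * p_e)

print(np.linalg.norm(LH - LG, 2), "vs eps*n =", eps * n)
```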

3.2 Non-complete Graphs

For non-complete graphs, there are two issues that we need to deal with.

1. The Laplacian of a non-complete graph need not be spherical. So in order to apply the RV Lemma, rather than looking at the random variables $\{Z_i\}$, we look at transformed random variables $\{A Z_i\}$ for some matrix $A$.

2. We need to find a better sampling distribution $p$. For example, in the case of a barbell graph, we need to return the middle edge to get a good sparsifier.

Consider an unweighted graph $G$. Let $U \in \mathbb{R}^{|E| \times n}$ be the matrix representing the $|E|$ edges, where

$$U = \begin{pmatrix} u_1^\top \\ u_2^\top \\ \vdots \\ u_{|E|}^\top \end{pmatrix}$$

Then the Laplacian of $G$ can be represented as $L_G = U^\top U$. $L_G$ is symmetric since $L_G^\top = L_G$. Also we have that for all $x$, $x^\top L_G x = x^\top U^\top U x = \|Ux\|^2 \ge 0$, and thus $L_G$ is positive semi-definite. This implies that all eigenvalues of $L_G$ are non-negative.

Using the eigenvalue decomposition of $L_G$, we can express $L_G$ as

$$L_G = \sum_{i=1}^n \lambda_i\, b_i b_i^\top$$

where $\{b_i\}_{i=1}^n$ are orthonormal eigenvectors and $\lambda_i \ge 0$. Positive powers of $L_G$ can be calculated by

$$L_G^p = \sum_{i=1}^n \lambda_i^p\, b_i b_i^\top$$

The Moore–Penrose pseudo-inverse of $L_G$ is given by

$$L_G^{+} = \sum_{i : \lambda_i \neq 0} \frac{1}{\lambda_i}\, b_i b_i^\top \qquad \text{and thus} \qquad (L_G^{+})^{1/2} = \sum_{i : \lambda_i \neq 0} \lambda_i^{-1/2}\, b_i b_i^\top$$

Using these, we can express the projector onto the span of $L_G$ as

$$\Pi_{L_G} = \sum_{i : \lambda_i \neq 0} b_i b_i^\top = L_G^{+} L_G = L_G L_G^{+}$$
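A short sketch (ours) of these objects: compute the eigendecomposition, drop the zero eigenvalues, and check that $L_G^{+} L_G$ is indeed the projector onto $\operatorname{span}(L_G)$, which for a connected graph is $I - \frac{1}{n}\mathbb{1}\mathbb{1}^\top$.

```python
import numpy as np

def pinv_and_projector(L, tol=1e-9):
    """Moore-Penrose pseudo-inverse, its square root, and the projector onto span(L),
    all via the eigendecomposition L = sum_i lam_i b_i b_i^T."""
    lam, B = np.linalg.eigh(L)
    nz = lam > tol                                        # keep the nonzero eigenvalues
    Lp = (B[:, nz] / lam[nz]) @ B[:, nz].T                # L^+
    Lp_half = (B[:, nz] / np.sqrt(lam[nz])) @ B[:, nz].T  # (L^+)^{1/2}
    Pi = B[:, nz] @ B[:, nz].T                            # projector onto span(L)
    return Lp, Lp_half, Pi

# Laplacian of the weighted triangle from the earlier sketch (a connected graph).
L = np.array([[4., -1., -3.],
              [-1., 3., -2.],
              [-3., -2., 5.]])
Lp, Lp_half, Pi = pinv_and_projector(L)
n = L.shape[0]
assert np.allclose(Pi, np.eye(n) - np.ones((n, n)) / n)  # projector onto the 1-perp space
assert np.allclose(Lp @ L, Pi)                           # Pi = L^+ L = L L^+
```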

Note that $\Pi_{L_G} \Pi_{L_G} = \Pi_{L_G}$. For $H$ to be a spectral sparsifier of $G$, we need that

$$x^\top L_H x = (1 \pm \epsilon)\, x^\top L_G x \quad \forall x$$
$$x^\top L_H x = (1 \pm \epsilon)\, x^\top L_G x \quad \forall x \text{ s.t. } x^\top \mathbb{1} = 0$$
$$x^\top L_H x = (1 \pm \epsilon)\, x^\top L_G x \quad \forall x \in \operatorname{span}(L_G)$$

where the last statement holds when $G$ is a connected graph (because $L_G$ has rank $n-1$ for a connected graph and $L_G \mathbb{1} = 0$). Thus for any $x \in \operatorname{span}(L_G)$:

$$x = \Pi_{L_G}\, x = (L_G^{+})^{1/2} L_G^{1/2}\, x = (L_G^{+})^{1/2}\, y$$

where $y = L_G^{1/2} x$. Then the condition becomes:

$$x^\top L_H x = (1 \pm \epsilon)\, x^\top L_G x \quad \forall x \in \operatorname{span}(L_G)$$
$$y^\top (L_G^{+})^{1/2} L_H (L_G^{+})^{1/2} y = (1 \pm \epsilon)\, y^\top (L_G^{+})^{1/2} L_G (L_G^{+})^{1/2} y = (1 \pm \epsilon)\, y^\top \Pi_{L_G} \Pi_{L_G}\, y = (1 \pm \epsilon)\, y^\top \Pi_{L_G}\, y \quad \forall y$$

Subtracting $y^\top \Pi_{L_G} y$ from both sides, we get

$$\left| y^\top \left( (L_G^{+})^{1/2} L_H (L_G^{+})^{1/2} - \Pi_{L_G} \right) y \right| \le \epsilon\, y^\top \Pi_{L_G}\, y = \epsilon\, \|y\|^2$$
$$\left\| (L_G^{+})^{1/2} L_H (L_G^{+})^{1/2} - \Pi_{L_G} \right\| \le \epsilon$$

We now apply the RV Lemma to the random variables $A_i = (L_G^{+})^{1/2} Z_i$. Let $\kappa = \max_i \|A_i\|^2$, and we have:

$$E[A_i A_i^\top] = (L_G^{+})^{1/2}\, E[Z_i Z_i^\top]\, (L_G^{+})^{1/2} = (L_G^{+})^{1/2} L_G (L_G^{+})^{1/2} = \Pi_{L_G}$$

Applying the RV Lemma we get:

$$E\left[\left\| \frac{1}{m} \sum_{i=1}^m A_i A_i^\top - E[A_i A_i^\top] \right\|\right] = E\left[\left\| (L_G^{+})^{1/2} L_H (L_G^{+})^{1/2} - \Pi_{L_G} \right\|\right] \le O\!\left(\sqrt{\frac{\kappa \log n}{m}}\right)$$

So if $m = \Omega(\kappa \log n / \epsilon^2)$, we get an $\epsilon$-approximate sparsifier. Note that we haven't yet defined the probability distribution $p$; $\kappa$ will depend on the choice of $p$.
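The derived condition is easy to test numerically. Here is a sketch (ours) that forms $(L_G^{+})^{1/2} L_H (L_G^{+})^{1/2} - \Pi_{L_G}$ and returns its spectral norm, which should be at most $\epsilon$ for an $\epsilon$-approximate sparsifier of a connected graph.

```python
import numpy as np

def projected_error(LG, LH, tol=1e-9):
    """Spectral norm of (L_G^+)^{1/2} L_H (L_G^+)^{1/2} - Pi_{L_G}."""
    lam, B = np.linalg.eigh(LG)
    nz = lam > tol
    Lp_half = (B[:, nz] / np.sqrt(lam[nz])) @ B[:, nz].T  # (L_G^+)^{1/2}
    Pi = B[:, nz] @ B[:, nz].T                            # Pi_{L_G}
    return np.linalg.norm(Lp_half @ LH @ Lp_half - Pi, 2)

# Sanity check: G is a 0-approximation of itself.
LG = np.array([[4., -1., -3.],
               [-1., 3., -2.],
               [-3., -2., 5.]])
assert projected_error(LG, LG) < 1e-8
```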

To pick a good probability distribution and to compute $\kappa$, we appeal to physical intuition. Consider the graph $G$ to represent nodes on a circuit, and let $x \in \mathbb{R}^n$ denote the voltages on the nodes. The current flow along edge $e = (u, v)$, denoted by $I_e$, from $u$ to $v$ is determined by the voltage drop: $I_e = x_u - x_v = u_e^\top x$. The flow along all edges is given by $I = Ux$, where $I \in \mathbb{R}^{|E|}$.

Given a battery on the circuit, with $x_s$ and $x_t$ fixed at some voltages, we can calculate the rest of the internal voltages $x_v$ using Kirchhoff's laws:

external flow into vertex $=$ current out of vertex along edges $-$ current into vertex along edges

We know that the external flow is $(+1$ at $s$, $-1$ at $t$, $0$ elsewhere$)$. For node $v$,

$$(I_{\mathrm{ext}})_v = \sum_{e=(v,u)} I_e - \sum_{e=(u,v)} I_e = \sum_e I_e\, (u_e)_v = (U^\top I)_v = (U^\top U x)_v = (L_G\, x)_v$$

We know the external flow and want the voltages, so we compute $x = L_G^{+} I_{\mathrm{ext}}$, where $L_G^{+}$ is the pseudoinverse. If we set $I_{\mathrm{ext}}$ to $u_e$ for some edge $e = (u, v)$, to indicate that 1 unit of current is pushed from $u$ to $v$, then $L_G^{+} u_e$ is the vector of all voltages in the circuit, and thus $u_e^\top L_G^{+} u_e$ is the voltage drop from $u$ to $v$. From Ohm's Law, we know that $V = I R_{\mathrm{eff}}$, where $R_{\mathrm{eff}}$ is the effective resistance. Since we have 1 unit of current, we conclude that $R_{\mathrm{eff}}(e) = u_e^\top L_G^{+} u_e$. We use this fact in our calculation of $\|A_i\|^2$:

$$\|A_i\|^2 = A_i^\top A_i = Z_i^\top (L_G^{+})^{1/2} (L_G^{+})^{1/2} Z_i = Z_i^\top L_G^{+} Z_i = \frac{u_e^\top L_G^{+} u_e}{p_e} = \frac{R_{\mathrm{eff}}(e)}{p_e}$$

This suggests setting $p_e \propto R_{\mathrm{eff}}(e)$, and after normalizing,

$$p_e = \frac{R_{\mathrm{eff}}(e)}{\sum_{e'} R_{\mathrm{eff}}(e')}$$

Thus $\kappa = \sum_e R_{\mathrm{eff}}(e)$, and we need $m = \Omega\!\left(\left(\sum_e R_{\mathrm{eff}}(e)\right) \log n / \epsilon^2\right)$ to get an $\epsilon$-approximate sparsifier. Finally, to compute $\sum_e R_{\mathrm{eff}}(e)$, we use Foster's Theorem.
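To make the sampling rule concrete, here is a sketch (ours) that computes $R_{\mathrm{eff}}(e) = u_e^\top L_G^{+} u_e$ for every edge of a small unweighted graph, normalizes to get $p_e \propto R_{\mathrm{eff}}(e)$, and checks Foster's Theorem (stated next) that the resistances sum to $n - 1$.

```python
import numpy as np

def effective_resistances(LG, edges):
    """R_eff(e) = u_e^T L_G^+ u_e for each edge e."""
    Lp = np.linalg.pinv(LG)
    n = LG.shape[0]
    reff = []
    for (u, v) in edges:
        ue = np.zeros(n)
        ue[u], ue[v] = 1.0, -1.0
        reff.append(ue @ Lp @ ue)
    return np.array(reff)

# Small unweighted graph: a 4-cycle with one chord, and its Laplacian.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
LG = np.zeros((n, n))
for (u, v) in edges:
    LG[u, u] += 1; LG[v, v] += 1; LG[u, v] -= 1; LG[v, u] -= 1

reff = effective_resistances(LG, edges)
print(reff.sum())            # Foster's Theorem: the sum is n - 1 = 3
p = reff / reff.sum()        # sampling distribution p_e proportional to R_eff(e)
```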

Theorem 5 (Foster's Theorem). Let $R_{\mathrm{eff}}(e)$ denote the effective resistance along edge $e$ of a connected graph on $n$ nodes. Then

$$\sum_{e \in E} R_{\mathrm{eff}}(e) = n - 1$$

Proof. Define $P = U L_G^{+} U^\top$. Then

$$P^2 = U L_G^{+}\, L_G\, L_G^{+}\, U^\top = U L_G^{+} U^\top = P$$

Thus $P$ is a projection matrix, and all its eigenvalues $\lambda_i \in \{0, 1\}$. Since $L_G$ has rank $n-1$ and $P$ has the same rank as $L_G$, $n-1$ eigenvalues of $P$ are equal to $1$ and the rest are $0$. From the definition of effective resistance we have:

$$R_{\mathrm{eff}}(e) = u_e^\top L_G^{+} u_e = P_{e,e}$$
$$\sum_{e \in E} R_{\mathrm{eff}}(e) = \operatorname{tr}(P) = \sum_i \lambda_i = n - 1$$

Finally, we conclude that we need $m = \Omega(n \log n / \epsilon^2)$, which completes the proof for non-complete graphs.

References

[1] R. Motwani and P. Raghavan. Randomized Algorithms. Cambridge University Press, 1995.

[2] D. A. Spielman and S. Teng. Spectral sparsification of graphs. CoRR, 2008.

[3] R. Vershynin. Introduction to the non-asymptotic analysis of random matrices, 2010.

[4] E. W. Weisstein. Foster's theorems. MathWorld.
