Lecture 4: Universal Hash Functions/Streaming Cont'd


CSE 521: Design and Analysis of Algorithms I Spring 2016

Lecture 4: Universal Hash Functions/Streaming Cont'd

Lecturer: Shayan Oveis Gharan April 6th Scribe: Jacob Schreiber

Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.

4.1 Hash Functions

Suppose we want to maintain a data structure for a set of elements x_1, ..., x_m of a universe U, e.g., images, that can perform insertion/deletion/search operations. A simple strategy would be to have one bucket for every possible image, i.e., for each element of U, and indicate in each bucket whether or not the corresponding image appeared. Unfortunately, U can be much, much larger than the space available in our computers; for example, if U represents the set of all possible images, |U| is astronomically large.

Instead, one may use a hash function. A hash function h : U → [B] maps elements of U to integers in [B]. For every element x_i of the sequence we mark the cell h(x_i) with x_i. When a query x arrives, we go to the cell h(x); if no element is stored there, x is not in our sequence. Otherwise, we go over all elements stored in cell h(x) and check whether any of them is equal to x. Observe that the cost of the search operation thus depends on the number of elements stored in cell h(x). Ideally, we would like a hash function that stores at most one element in every cell i ∈ [B].

Fix a function h. Observe that h maps at least a 1/B fraction of all elements of U to the same number i ∈ [B]. Therefore, the search operation in the worst case is very slow. We can mitigate this problem by choosing a hash function uniformly at random from the family of all functions that map U to [B]: let H = {h : U → [B]}, and let h ∈ H be chosen uniformly at random. Now, if the length of the sequence satisfies m ≲ √B then, by the birthday paradox phenomenon, with high probability no two elements of the sequence map to the same cell. In other words, there are no collisions.

However, observe that H has B^|U| many functions, so even describing h requires log(B^|U|) = |U| log B bits of memory. Recall that |U| is huge, so we cannot efficiently represent h. Instead, we are going to work with much smaller families of functions, say H'; such a family can only guarantee weaker notions of independence, but because |H'| << |H|, it is much easier to describe a randomly chosen function from H'.
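To make the chaining scheme concrete, here is a minimal Python sketch (ours, not part of the original notes; the class name ChainedHashTable and the lazy simulation of a truly random h are illustrative choices). Search scans only the queried bucket, so its cost is that bucket's load.

```python
import random

class ChainedHashTable:
    """Hash table with chaining; search cost equals the queried bucket's load.

    A truly random h: U -> [B] is simulated lazily: the first time we see an
    element we draw h(x) uniformly and memoize it. (Storing this map takes the
    very memory hashing is meant to avoid; the sketch only illustrates the
    mechanics of chaining.)
    """

    def __init__(self, B):
        self.B = B
        self.buckets = [[] for _ in range(B)]
        self._h = {}  # memoized random hash values

    def h(self, x):
        if x not in self._h:
            self._h[x] = random.randrange(self.B)
        return self._h[x]

    def insert(self, x):
        bucket = self.buckets[self.h(x)]
        if x not in bucket:
            bucket.append(x)

    def search(self, x):
        # Scan only the bucket h(x); time is proportional to its load.
        return x in self.buckets[self.h(x)]

table = ChainedHashTable(B=64)
for x in ["img1", "img2", "img3"]:
    table.insert(x)
print(table.search("img2"), table.search("img9"))  # True False
```

Note that memoizing a truly random h defeats the space savings, which is exactly why the next section replaces the family of all functions with a small, explicitly describable family.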

4.2 2-Universal Hash Functions

In this section, we describe a family of hash functions that only guarantees pairwise independence. Let p be a prime number, and let

    H = {h_{a,b} : [p] → [p] where h_{a,b}(x) = ax + b mod p, for a, b ∈ [p]}.

Observe that any function h_{a,b} ∈ H can be represented in O(log p) bits of memory just by recording a, b ∈ [p]. Next, we show that a uniformly random function h ∈ H is pairwise independent.

Lemma 4.1. For any x, y, c, d ∈ [p] with x ≠ y,

    P[h(x) = c, h(y) = d] = 1/p^2.

Proof. Suppose for some x ≠ y we have h(x) = c and h(y) = d. Equivalently, we can write

    ax + b ≡ c (mod p), and ay + b ≡ d (mod p).

Using the laws of modular arithmetic, we can write

    a(x − y) ≡ (c − b) − (d − b) ≡ c − d (mod p).

Since p is a prime, any number z ≢ 0 (mod p) has a multiplicative inverse, i.e., there is a number z^{−1} ∈ [p] such that z · z^{−1} ≡ 1 (mod p). Since x ≠ y, we have x − y ≢ 0 (mod p). Therefore it has a multiplicative inverse, and we can write

    a ≡ (x − y)^{−1}(c − d) (mod p), which gives b ≡ d − ay (mod p).

In words, fixing x, y, c, d uniquely determines a, b. Since there are p^2 possibilities for the pair (a, b), we get P[h(x) = c, h(y) = d] = 1/p^2.

For our application of estimating F_0, we first need to choose a prime number p > n. Then, we can use a hash function h : [n] → [B] where for any 0 ≤ x < n,

    h(x) = ((ax + b) mod p) mod B.

It is easy to see that such a function is almost pairwise independent, which is good enough for our application of estimating F_0.

We can extend the above construction to a family of k-wise independent hash functions. We say a hash function h : [p] → [p] is k-wise independent if for all distinct x_0, ..., x_{k−1} and all c_0, ..., c_{k−1},

    P[∀i, h(x_i) = c_i] = 1/p^k.

Such a hash function h can be constructed by choosing a_0, a_1, ..., a_{k−1} uniformly and independently from [p] and letting

    h(x) = a_{k−1} x^{k−1} + a_{k−2} x^{k−2} + ... + a_1 x + a_0 (mod p).

We will not prove that this gives a k-wise independent hash family; instead, we just give the high-level idea for k = 4. Let h be a 4-wise independent hash function, let x_0, x_1, x_2, x_3 ∈ [p] be distinct, and let c_0, c_1, c_2, c_3 ∈ [p]. We need to show that there is a unique tuple (a_0, a_1, a_2, a_3) for which h(x_i) = c_i for all i. To find a_0, a_1, a_2, a_3 it is enough to solve the following system of linear equations:

    [ x_0^3  x_0^2  x_0  1 ] [ a_3 ]   [ c_0 ]
    [ x_1^3  x_1^2  x_1  1 ] [ a_2 ] = [ c_1 ]
    [ x_2^3  x_2^2  x_2  1 ] [ a_1 ]   [ c_2 ]
    [ x_3^3  x_3^2  x_3  1 ] [ a_0 ]   [ c_3 ]

It turns out that the matrix on the LHS (a Vandermonde matrix) has a nonzero determinant if x_0, x_1, x_2, x_3 are distinct. In such a case it is invertible, and we can use the inverse to uniquely determine a_0, a_1, a_2, a_3.
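As a concrete illustration, the following sketch (ours; the helper names are made up) implements both constructions from this section: the pairwise-independent map h_{a,b}(x) = ((ax + b) mod p) mod B, and the k-wise independent polynomial hash. Either function is described by just the list of coefficients, i.e., O(k log p) bits.

```python
import random

def make_pairwise_hash(p, B):
    """Sample h(x) = ((a*x + b) mod p) mod B from the 2-universal family.

    p must be a prime larger than the universe size n; the pair (a, b) is the
    entire O(log p)-bit description of the function.
    """
    a = random.randrange(p)
    b = random.randrange(p)
    return lambda x: ((a * x + b) % p) % B

def make_kwise_hash(p, k):
    """Sample h(x) = a_{k-1} x^{k-1} + ... + a_1 x + a_0 (mod p),
    a k-wise independent function [p] -> [p]."""
    coeffs = [random.randrange(p) for _ in range(k)]  # a_0, ..., a_{k-1}
    def h(x):
        value = 0
        for a_i in reversed(coeffs):  # Horner's rule: (...(a_{k-1} x + a_{k-2}) x + ...) + a_0
            value = (value * x + a_i) % p
        return value
    return h

# Example: hash a universe of size at most 97 into B = 10 buckets.
h2 = make_pairwise_hash(p=97, B=10)
h4 = make_kwise_hash(p=97, k=4)   # 4-wise independent; used in Section 4.3
print(h2(42), h4(42))
```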

4.3 F_2 Moment

Before designing a streaming algorithm that estimates F_2, let us revisit the random walk example that we had a few lectures ago. Let X = Σ_i X_i where for each i,

    X_i = +1 with probability 1/2, and −1 with probability 1/2.

Using the Hoeffding bound, we previously showed that for any c > 1,

    P[X ≥ c√n] ≤ e^{−c^2/2}.

Is this bound tight? Can we show that |X| ∈ Ω(√n) with a constant probability? The answer is yes. More generally, it follows from the central limit theorem, but instead of using such a heavy tool there is a more elementary argument that we can use. To show that |X| ∈ Ω(√n) with a constant probability, it is enough to show that E[X^2] = n:

    E[X^2] = E[(Σ_i X_i)^2] = E[Σ_{i,j} X_i X_j] = Σ_{i,j} E[X_i X_j] = Σ_i E[X_i^2] = n,

where in the second-to-last equality we use that X_i and X_j are independent for i ≠ j, so E[X_i X_j] ≠ 0 only when i = j, and in the last equality we use that E[X_i^2] = 1 for all i.

Now back to estimating F_2. We want to use a similar idea. Let x_1, x_2, ..., x_m ∈ [n] be the input sequence. For each i ∈ [n] let m_i := #{j : x_j = i}. Recall that

    F_2 := Σ_{i=1}^{n} m_i^2.

Let h : [n] → {+1, −1} where for each i ∈ [n],

    h(i) = +1 with probability 1/2, and −1 with probability 1/2, chosen independently.

Consider the following algorithm: Start with Y = 0. After reading each x_i, let Y = Y + h(x_i). Return Y^2.

Before analyzing the algorithm, let us study two extreme cases. First assume that x_1 = x_2 = ... = x_m. Then |Y| = m and Y^2 = m^2 = F_2, as desired. Now assume that x_1, x_2, ..., x_m are mutually distinct. Then the distribution of Y is the same as a random walk of length m, so by the previous observation |Y| ≈ √m and Y^2 ≈ m = F_2, as desired.

Lemma 4.2. Y^2 is an unbiased estimator of F_2, i.e., E[Y^2] = F_2.

Proof. First, observe that Y = Σ_i m_i h(i). Therefore,

    E[Y^2] = E[Σ_{i,j} m_i m_j h(i)h(j)] = Σ_{i,j} m_i m_j E[h(i)h(j)] = Σ_i m_i^2 E[h(i)^2] = Σ_i m_i^2 = F_2,

where the second-to-last equality uses that h(i) is independent of h(j) for all i ≠ j.

Now, all we need to do is estimate the expectation of Y^2 within a 1 ± ε factor. By Chebyshev's inequality, it suffices to show that Y^2 has a small variance.
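The algorithm above is a one-pass sketch using a single counter. Here is a minimal Python rendering of it (ours, not from the original notes), with the fully independent h simulated by pre-drawing a sign for every i ∈ [n], plus an empirical sanity check of Lemma 4.2.

```python
import random
from collections import Counter

def f2_single_estimate(stream, n):
    """One run of the basic estimator: Y = sum_i h(x_i); return Y^2.

    Here h assigns a fully independent uniform sign to each of the n items;
    as noted at the end of this section, 4-wise independence suffices.
    """
    h = [random.choice((+1, -1)) for _ in range(n)]
    Y = 0
    for x in stream:       # one pass, O(1) update per element
        Y += h[x]
    return Y * Y

n = 10
stream = [random.randrange(n) for _ in range(1000)]
F2 = sum(c * c for c in Counter(stream).values())  # exact value, for comparison
avg = sum(f2_single_estimate(stream, n) for _ in range(2000)) / 2000
print(F2, avg)  # the average of Y^2 over many trials should be close to F2
```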

Lemma 4.3. Var(Y^2) ≤ 2 E[Y^2]^2.

Proof. First, we calculate E[Y^4]. The idea is similar to before; we just use the independence of the h(i)'s:

    E[Y^4] = E[Σ_{i,j,k,l} m_i m_j m_k m_l h(i)h(j)h(k)h(l)] = Σ_{i,j,k,l} m_i m_j m_k m_l E[h(i)h(j)h(k)h(l)]
           = Σ_i m_i^4 E[h(i)^4] + 6 Σ_{i<j} m_i^2 m_j^2 E[h(i)^2 h(j)^2].

To see the last equality, observe that for any 4-tuple (i, j, k, l), the expectation E[h(i)h(j)h(k)h(l)] is nonzero only if each index shows up an even number of times. In other words, there are only two cases where E[h(i)h(j)h(k)h(l)] is nonzero: (i) when i = j = k = l, and (ii) when two of these four indices are equal and the other two are also equal. Since for each i, E[h(i)^2] = E[h(i)^4] = 1, we have

    E[Y^4] = Σ_i m_i^4 + 6 Σ_{i<j} m_i^2 m_j^2.

Now, using Lemma 4.2, we can write

    Var(Y^2) = E[Y^4] − E[Y^2]^2 = 4 Σ_{i<j} m_i^2 m_j^2 ≤ 2 E[Y^2]^2,

as desired.

Now, all we need to do is use independent samples of Y^2 to reduce the variance. Suppose we take k independent samples of Y using k independently chosen hash functions h_1, ..., h_k, i.e., we run the following algorithm: Start with Y_1 = Y_2 = ... = Y_k = 0. After reading x_i, let Y_j = Y_j + h_j(x_i) for all 1 ≤ j ≤ k, and at the end return Z = (Y_1^2 + ... + Y_k^2)/k. Then

    Var(Z) = Var((Y_1^2 + ... + Y_k^2)/k) = Var(Y^2)/k.

Therefore, by Chebyshev's inequality, we can write

    P[|Z − E[Y^2]| ≥ ε E[Y^2]] ≤ Var(Z)/(ε^2 E[Y^2]^2) ≤ 2 E[Y^2]^2/(k ε^2 E[Y^2]^2) = 2/(ε^2 k).

So k = 20/ε^2 many samples are enough to approximate F_2 within a 1 + ε factor with probability at least 9/10.

Note that in the above construction we assumed that h(·) assigns independent values to all integers in [n]. But it can be seen from the proof that we only used 4-wise independence: the only place we used independence was to show that E[h(i)h(j)h(k)h(l)] = 0 when i, j, k, l are mutually distinct, which of course remains true if h(·) is just a 4-wise independent function. Taking that into account, we can run the above algorithm with space O(log(n)/ε^2). In addition, we can turn the above probabilistic guarantee into a 1 − δ guarantee using O(log(1/δ)/ε^2) many samples. We refrain from giving the details; for a more detailed discussion we refer to [AMS96].
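Putting everything together, below is a hedged end-to-end sketch (ours, not the implementation from [AMS96]) of the space-efficient version: k = ⌈20/ε^2⌉ counters, each driven by a 4-wise independent degree-3 polynomial hash whose output parity is used as a ±1 sign. Since p is odd, the parity sign is only approximately uniform; an exact construction would hash into a power-of-two range, but we keep the sketch simple. The prime p and the helper names are illustrative choices.

```python
import math
import random

def make_sign_hash(p, k=4):
    """4-wise independent h: [p] -> {+1, -1}: evaluate a random degree-(k-1)
    polynomial mod p and use the parity of the result as the sign.
    (Approximate: p odd means evens slightly outnumber odds in [p].)"""
    coeffs = [random.randrange(p) for _ in range(k)]
    def h(x):
        value = 0
        for a_i in reversed(coeffs):  # Horner's rule
            value = (value * x + a_i) % p
        return +1 if value % 2 == 0 else -1
    return h

def estimate_f2(stream, eps, p=2_147_483_647):
    """(1 +/- eps)-approximate F_2 with probability >= 9/10.

    Uses k = ceil(20/eps^2) independent counters; total space is
    O(log(n)/eps^2) bits, since each hash is just 4 coefficients mod p.
    p must be a prime exceeding the universe size (2^31 - 1 here).
    """
    k = math.ceil(20 / eps ** 2)
    hashes = [make_sign_hash(p) for _ in range(k)]
    Y = [0] * k
    for x in stream:                  # one pass over the stream
        for j in range(k):
            Y[j] += hashes[j](x)
    return sum(y * y for y in Y) / k  # average of the k unbiased estimates

stream = [random.randrange(50) for _ in range(5000)]
print(estimate_f2(stream, eps=0.5))
```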

References

[AMS96] N. Alon, Y. Matias, and M. Szegedy. "The space complexity of approximating the frequency moments". In: STOC. ACM, 1996, pp. 20-29.
