
Problem 1: (Practice with Chebyshev and Chernoff bounds) When using concentration bounds to analyze randomized algorithms, one often has to approach the problem in different ways depending on the specific bound being used. Typically, Chebyshev is useful when dealing with more complicated random variables, in particular when they are pairwise independent; Chernoff bounds are usually used along with the union bound for events which are easier to analyze. We'll now go back and look at a few of our older examples using both these techniques.

Part (a) (Number of collisions) Recall we showed that if we throw $m$ balls in $n$ bins, the average number of collisions $X_{m,n}$ satisfies $\mu_{m,n} = \binom{m}{2}\frac{1}{n}$. Use Chebyshev's inequality to show that:

$$P\left[|X_{m,n} - \mu_{m,n}| \geq c\sqrt{\mu_{m,n}}\right] \leq 1/c^2.$$

Next suppose we choose $m = \sqrt{2n}$; then $\mu_{m,n} \approx 1$. Use Chernoff bounds plus the union bound to bound the probability that no bin has more than 1 ball. Compare this to the more exact analysis you did in homework 1.

Solution: Let $\sigma_{m,n}$ be the standard deviation of $X_{m,n}$. As we did before, for every pair of balls $(i,j)$, let $Y_{i,j}$ be the indicator that the two balls collide. Note that since the $Y_{i,j}$ are $\{0,1\}$-valued r.v.s, we have $E[Y_{i,j}^2] = E[Y_{i,j}]$, and thus $Var(Y_{i,j}) \leq E[Y_{i,j}]$. Moreover, observe that the $Y_{i,j}$ are pairwise independent, i.e., for any $(i,j) \neq (i',j')$, the r.v.s $Y_{i,j}$ and $Y_{i',j'}$ are independent (in particular, note that $Y_{i,j}$, $Y_{i,k}$ and $Y_{j,k}$ are pairwise independent, however they are not mutually independent). Now we have $X_{m,n} = \sum_{i,j} Y_{i,j}$, and thus $\mu_{m,n} = \sum_{i,j} E[Y_{i,j}]$; since pairwise independence suffices for variances to add, $Var(X_{m,n}) = \sum_{i,j} Var(Y_{i,j}) \leq \sum_{i,j} E[Y_{i,j}] = \mu_{m,n}$. Thus, by Chebyshev's inequality, we have:

$$P\left[|X_{m,n} - \mu_{m,n}| \geq c\sqrt{\mu_{m,n}}\right] \leq \frac{Var(X_{m,n})}{c^2\mu_{m,n}} \leq 1/c^2.$$

To use Chernoff bounds, we instead consider the number of balls in each bin. Let $Z_{b,i}$ be the indicator that ball $i$ fell in bin $b$; these are now mutually independent. Further, let $B_b = \sum_i Z_{b,i}$ be the number of balls in bin $b$; then we know that $E[B_b] = m/n$, and moreover, using our standard Chernoff bound $P[X \geq (1+\epsilon)\mu] \leq \exp\left(-\frac{\epsilon^2\mu}{2+\epsilon}\right)$ with $\epsilon = \frac{2n}{m} - 1$, we have:

$$P[B_b \geq 2] = P\left[B_b \geq \left(1 + \left(\tfrac{2n}{m} - 1\right)\right)E[B_b]\right] \leq \exp\left(-\frac{\left(\tfrac{2n}{m} - 1\right)^2\left(\tfrac{m}{n}\right)}{2 + \left(\tfrac{2n}{m} - 1\right)}\right) = \exp\left(-\frac{(2n - m)^2}{n(m + 2n)}\right)$$

Finally, by the union bound, we have $P[\text{Some bin has} \geq 2 \text{ balls}] \leq n\exp\left(-\frac{(2n-m)^2}{n(m+2n)}\right)$. Now if $m = \sqrt{2n}$, we get $\exp\left(-\frac{(2n-m)^2}{n(m+2n)}\right) = \Theta(1)$, and thus the bound on $P[\text{Some bin has} \geq 2 \text{ balls}]$ is not useful, as it grows with $n$. On the other hand, in the first homework, we did a direct calculation to show that $P[\text{No collisions}] \approx \exp(-E[X_{m,n}])$, and thus if $m = \sqrt{2n}$, we have $P[\text{No collisions}] \approx 1/e$. Thus, the Chernoff bound does not give us the tightest scaling in this case.
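As a quick sanity check on part (a), here is a small Monte Carlo sketch (an illustration, not part of the original solution; all parameter values are arbitrary) that estimates the deviation probability empirically and compares it to the Chebyshev bound $1/c^2$:

```python
import random
from math import comb, sqrt

def num_collisions(m, n):
    """Throw m balls into n bins uniformly; count colliding pairs."""
    bins = [0] * n
    for _ in range(m):
        bins[random.randrange(n)] += 1
    return sum(comb(b, 2) for b in bins)  # pairs landing in the same bin

def chebyshev_check(m=100, n=1000, c=3.0, trials=20000):
    mu = comb(m, 2) / n                   # mu_{m,n} = C(m,2)/n
    dev = c * sqrt(mu)
    hits = sum(abs(num_collisions(m, n) - mu) >= dev for _ in range(trials))
    print(f"empirical tail: {hits/trials:.4f}   Chebyshev bound: {1/c**2:.4f}")

chebyshev_check()
```

The empirical tail is typically far below $1/c^2$: Chebyshev only uses the variance, so the bound is loose here.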

Part (b) (Coupon collector) For $n$ bins, recall that we defined $T_i$ to be the first time when $i$ unique bins were filled, and used these random variables to show that $T_n$, i.e., the number of balls we need to throw before every bin has at least one ball, satisfies $E[T_n] = nH_n = \Theta(n\log n)$. Using the same random variables, show that:

$$P\left[|T_n - E[T_n]| \geq c\,E[T_n]\right] \leq \frac{\pi^2}{6c^2H_n^2}.$$

Next, suppose we throw in $m = n\log n + cn$ balls; using Chernoff bounds plus the union bound, choose $c$ such that no bin is empty with probability greater than $1 - \delta$. (Hint: Use $\sum_{i=1}^{\infty} 1/i^2 = \pi^2/6$.)

Solution: First, using the independence of the inter-arrival times $t_i = T_{i+1} - T_i$ (each geometric with success probability $p_i = \frac{n-i}{n}$), let's calculate $Var(T_n)$:

$$Var(T_n) = \sum_{i=0}^{n-1} Var(t_i) \leq \sum_{i=0}^{n-1} \frac{1}{p_i^2} = \sum_{i=0}^{n-1}\left(\frac{n}{n-i}\right)^2 = n^2\sum_{i=1}^{n}\frac{1}{i^2} \leq \frac{\pi^2 n^2}{6}.$$

Now, using Chebyshev's inequality, we get:

$$P\left[|T_n - E[T_n]| \geq c\,E[T_n]\right] = P\left[|T_n - E[T_n]| \geq c\,nH_n\right] \leq \frac{Var(T_n)}{(cnH_n)^2} \leq \frac{\pi^2}{6c^2H_n^2}.$$

Next, let $m = n\log n + cn$, and let $B_b$ be the number of balls in bin $b$. We know $E[B_b] = \log n + c$. Then by the lower-tail Chernoff bound, we have:

$$P[B_b \leq 0] = P[B_b \leq (1 - 1)E[B_b]] \leq \exp\left(-\frac{E[B_b]}{2}\right) = \exp\left(-\frac{\log n + c}{2}\right)$$

Finally, using the union bound, we have $P[\text{Some bin is empty}] \leq n\,P[B_b \leq 0] = \exp\left(\frac{\log n - c}{2}\right)$, and we can set $c = \log n + 2\log(1/\delta)$ to get this less than $\delta$. Here, using a direct calculation is better than the Chernoff bound. In particular, we have:

$$P[B_b = 0] = \left(1 - \frac{1}{n}\right)^m \leq e^{-m/n} = \frac{e^{-c}}{n}$$

By the union bound, we have $P[\text{Some bin is empty}] \leq e^{-c}$, and thus we need $c = \log(1/\delta)$ to ensure this is less than $\delta$. The main takeaway again is that Chernoff bounds are fine when probabilities are small and we do not want the tightest bounds, but may be weak when probabilities are larger and we want tighter bounds.
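The coupon-collector bounds above are also easy to check by simulation. The sketch below (again an illustration with arbitrary parameters, not part of the original solution) compares the empirical mean of $T_n$ to $nH_n$, and the empty-bin probability after $m = n\log n + cn$ throws to the direct bound $e^{-c}$:

```python
import random
from math import exp, log

def coupon_collector_time(n):
    """Number of throws until every one of n bins is non-empty."""
    seen, t = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        t += 1
    return t

n, trials = 200, 2000
H_n = sum(1 / i for i in range(1, n + 1))
avg_T = sum(coupon_collector_time(n) for _ in range(trials)) / trials
print(f"avg T_n = {avg_T:.1f}   nH_n = {n * H_n:.1f}")

# empty-bin probability with m = n*log(n) + c*n balls, vs the direct bound e^{-c}
c = 2.0
m = int(n * log(n) + c * n)
empty = sum(len({random.randrange(n) for _ in range(m)}) < n for _ in range(trials))
print(f"P[some bin empty] ~ {empty/trials:.4f}   bound e^-c = {exp(-c):.4f}")
```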

Problem 2: (The Hoeffding Extension) In class, we saw that for a r.v. $X_i \sim Bernoulli(p_i)$, we have:

$$E[e^{\theta X_i}] = 1 - p_i + p_i e^{\theta}$$

Plugging this into the Chernoff bound and optimizing over $\theta$, we obtained a variety of bounds; in particular, for independent r.v.s $\{X_i\}$ with $X = \sum_i X_i$ and $\mu = E[X] = \sum_i p_i$, we showed for any $\epsilon > 0$:

$$P[X - \mu \geq \epsilon\mu] \leq \exp\left(-\frac{\epsilon^2\mu}{2 + \epsilon}\right)$$

We now extend this to more general bounded r.v.s.

Part (a) First, for any $\theta$, argue that the function $f(x) = e^{\theta x}$ for $x \in [0,1]$ is bounded above by the line joining $(0,1)$ and $(1, e^{\theta})$. Using this, find constants $\alpha, \beta$ such that $\forall x \in [0,1]$: $e^{\theta x} \leq \alpha x + \beta$.

Solution: Recall that $f(x) = e^{\theta x}$ is a convex function. Now, note that $f(0) = 1$ and $f(1) = e^{\theta}$. Therefore, for $x \in [0,1]$, it is bounded above by the line joining $(0,1)$ and $(1, e^{\theta})$, which means:

$$e^{\theta x} \leq (e^{\theta} - 1)x + 1.$$

Part (b) Next, for any random variable $X$ taking values in $[0,1]$ such that $E[X] = \mu$, show that:

$$E[e^{\theta X}] \leq 1 - \mu + \mu e^{\theta}.$$

Using this, for independent r.v.s $X_i$ taking values in $[0,1]$ with $E[X_i] = \mu_i$, and defining $X = \sum_i X_i$, $\mu = \sum_i \mu_i$, show that:

$$P[X - \mu \geq \epsilon\mu] \leq \exp\left(-\frac{\epsilon^2\mu}{2 + \epsilon}\right)$$

(Note: You can directly use the inequality for Bernoulli r.v.s; no need to show the optimization.)

Solution: Let's take expectations of both sides of the inequality from the solution of part (a):

$$E[e^{\theta X}] \leq E[(e^{\theta} - 1)X + 1] = (e^{\theta} - 1)E[X] + 1 = (e^{\theta} - 1)\mu + 1 = 1 - \mu + \mu e^{\theta}.$$

This is exactly the moment generating function of a Bernoulli($\mu$) r.v. Now, let $X = \sum_i X_i$, $\mu = \sum_i \mu_i$. Using exactly the same method as we did in class for Bernoulli r.v.s, we get:

$$P[X - \mu \geq \epsilon\mu] \leq \exp\left(-\frac{\epsilon^2\mu}{2 + \epsilon}\right)$$
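Both inequalities in parts (a) and (b) can be verified numerically. A minimal sketch (with an arbitrary test value of $\theta$, not part of the original solution) checks the chord bound on a grid, and checks the MGF bound for $X \sim \text{Uniform}[0,1]$, where $\mu = 1/2$ and $E[e^{\theta X}] = (e^{\theta} - 1)/\theta$:

```python
from math import exp

theta = 1.7  # arbitrary test value

# part (a): chord bound e^{theta*x} <= (e^theta - 1)*x + 1 on a grid over [0,1]
worst = max(exp(theta * x) - ((exp(theta) - 1) * x + 1)
            for x in (i / 1000 for i in range(1001)))
print(f"max violation of chord bound: {worst:.3e}  (should be <= 0)")

# part (b): E[e^{theta*X}] <= 1 - mu + mu*e^theta, for X ~ Uniform[0,1]
mu, mgf = 0.5, (exp(theta) - 1) / theta
print(f"MGF: {mgf:.4f}  <=  Bernoulli bound: {1 - mu + mu * exp(theta):.4f}")
```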

Part (c) (Optional) Next, for any random variable $X$ taking values in $[a,b]$ such that $E[X] = \mu$, a similar bounding technique as above can be used to show:

$$E\left[e^{\theta(X - \mu)}\right] \leq \exp\left(\frac{1}{8}\theta^2(b-a)^2\right)$$

This is sometimes referred to as Hoeffding's lemma (for the proof, see the Wikipedia article). Now consider independent r.v.s $X_i$ taking values in $[a_i, b_i]$ with $E[X_i] = \mu_i$, and as before, let $X = \sum_i X_i$, $\mu = \sum_i \mu_i$. Using the above inequality, optimize over $\theta$ to show that:

$$P[X - \mu \geq \epsilon\mu] \leq \exp\left(-\frac{2\epsilon^2\mu^2}{\sum_{i=1}^n (b_i - a_i)^2}\right)$$

Solution: As in the standard Chernoff inequality, we have:

$$P[X - \mu \geq \epsilon\mu] = P\left[e^{\theta(X - \mu)} \geq e^{\theta\epsilon\mu}\right] \leq \min_{\theta > 0} E\left[e^{\theta(X - \mu)}\right]e^{-\theta\epsilon\mu} = \min_{\theta > 0} \prod_i E\left[e^{\theta(X_i - \mu_i)}\right]e^{-\theta\epsilon\mu} \leq \min_{\theta > 0} \exp\left(\frac{\theta^2}{8}\sum_i (b_i - a_i)^2 - \theta\epsilon\mu\right)$$

To optimize, we choose $\theta = 4\epsilon\mu / \sum_i (b_i - a_i)^2$, to get:

$$P[X - \mu \geq \epsilon\mu] \leq \exp\left(-\frac{2\epsilon^2\mu^2}{\sum_i (b_i - a_i)^2}\right)$$

Problem 3: (A Weaker Sampling Theorem; adapted from MU Ex 4.9) In class we saw the following sampling theorem for estimating the mean of a $\{0,1\}$-valued random variable: in order to get an estimate within $\pm\epsilon$ with confidence $1 - \delta$, we need $n \geq \frac{2+\epsilon}{\epsilon^2}\ln\frac{2}{\delta}$ samples. A crucial component in this proof was using the Chernoff bound for Bernoulli($p$) random variables. Suppose instead we want to estimate a more general random variable $X$ (for example, the average number of hours of TV watched by a random person); we may not be able to use a Chernoff bound if we do not know the moment generating function. We now show how to get a similar sampling theorem which only uses knowledge of the mean and variance of $X$. We want to estimate a r.v. $X$ with mean $E[X]$ and variance $Var(X)$, given i.i.d. samples $X_1, X_2, \ldots$. Let $r = \sqrt{Var(X)}/E[X]$; we now show that we can estimate $E[X]$ up to accuracy $\pm\epsilon E[X]$ and confidence $1 - \delta$ using $O\left(\frac{r^2}{\epsilon^2}\ln\frac{1}{\delta}\right)$ samples.

Part (a) Given $n$ samples $X_1, \ldots, X_n$, suppose we use the estimator $\bar{X} = \left(\sum_{i=1}^n X_i\right)/n$. Show that $n = O(r^2/\epsilon^2\delta)$ is sufficient to ensure:

$$P\left[|\bar{X} - E[X]| \geq \epsilon E[X]\right] \leq \delta$$

Solution: By linearity of expectation, we have that $E[\bar{X}] = \sum_{i=1}^n E[X_i]/n = E[X]$. Moreover, since the $X_i$'s are i.i.d., we have that $Var(\bar{X}) = \sum_{i=1}^n Var(X_i)/n^2 = Var(X)/n = r^2E[X]^2/n$. Now, using Chebyshev's inequality we get:

$$P\left[|\bar{X} - E[X]| \geq \epsilon E[X]\right] \leq \frac{Var(\bar{X})}{\epsilon^2E[X]^2} = \frac{r^2}{n\epsilon^2}$$

To ensure $P[|\bar{X} - E[X]| \geq \epsilon E[X]] \leq \delta$, we can choose $n$ such that $\frac{r^2}{n\epsilon^2} \leq \delta$. Using $n = O(r^2/\epsilon^2\delta)$ samples is thus sufficient.

Part (b) We say an estimator $\hat{X}$ is a weak estimator if it satisfies $P[|\hat{X} - E[X]| \geq \epsilon E[X]] \leq 1/4$; using part (a), show that we need $O(r^2/\epsilon^2)$ samples to obtain a weak estimator. Now suppose we are given $m$ weak estimates $\hat{X}_1, \hat{X}_2, \ldots, \hat{X}_m$, and we define a new estimator $\hat{X}$ to be the median of these weak estimates. Show that using $m = O(\ln(1/\delta))$ weak estimates gives us an estimate $\hat{X}$ that satisfies:

$$P\left[|\hat{X} - E[X]| \geq \epsilon E[X]\right] \leq \delta$$

What could go wrong if we used the mean of $\hat{X}_1, \hat{X}_2, \ldots, \hat{X}_m$ instead of the median? NOTE: I got the definition of weak estimator inverted in the homework; this is the correct definition. Essentially, we want the probability of samples being close to the mean to be $> 1/2$.

Solution: If we take $\delta = \frac{1}{4}$ in part (a), we see that $\hat{X}$ is a weak estimator if we use $n = O(r^2/\epsilon^2)$ samples.
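The median-of-weak-estimates construction is short enough to see in code. Below is a minimal sketch (not part of the original solution; it uses Exp(1) draws as a stand-in distribution with $r = 1$, and all parameters are arbitrary) measuring how often a single weak estimate misses versus the median of several:

```python
import random
import statistics

def weak_estimate(n):
    """Sample mean of n i.i.d. Exp(1) draws (true mean 1, so r = sigma/mean = 1)."""
    return sum(random.expovariate(1.0) for _ in range(n)) / n

def median_estimate(n, m):
    """Median of m independent weak estimates (the part (b) estimator)."""
    return statistics.median(weak_estimate(n) for _ in range(m))

eps, n, m, trials = 0.1, 160, 15, 2000
bad_single = sum(abs(weak_estimate(n) - 1.0) >= eps for _ in range(trials)) / trials
bad_median = sum(abs(median_estimate(n, m) - 1.0) >= eps for _ in range(trials)) / trials
print(f"P[single weak estimate off by >= {eps}]: {bad_single:.3f}")
print(f"P[median of {m} estimates off by >= {eps}]: {bad_median:.3f}")
```

With these parameters a single weak estimate misses roughly a fifth to a quarter of the time, while the median essentially never does, matching the exponential amplification proved below.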

Now we want to show that $O(\log(1/\delta))$ such weak estimators suffice to ensure that the median of these estimators is within $(1 \pm \epsilon)E[X]$. Let us introduce new r.v.s $Y_i$ s.t. $Y_i = 1$ if $|\hat{X}_i - E[X]| < \epsilon E[X]$, and $Y_i = 0$ otherwise. Then $Y = \sum_{i=1}^m Y_i$ is the number of weak estimates which land within $(1 \pm \epsilon)E[X]$; to ensure that the median is also within $(1 \pm \epsilon)E[X]$, it suffices to have $Y > m/2$. In other words, we have:

$$P\left[|\hat{X} - E[X]| \geq \epsilon E[X]\right] \leq P[Y \leq m/2].$$

Moreover, $P[Y_i = 1] \geq 3/4$, and hence $E[Y] \geq 3m/4$. Now for independent random variables $Z_i \sim Bernoulli(p_i)$, with $Z = \sum_{i=1}^n Z_i$, we saw the following Chernoff bound:

$$P[Z \leq (1 - \epsilon)E[Z]] \leq \exp\left(-\frac{\epsilon^2 E[Z]}{2}\right)$$

Using this, we can write:

$$P[Y \leq m/2] = P[Y \leq (1 - 1/3)(3m/4)] \leq P[Y \leq (1 - 1/3)E[Y]] \leq e^{-(1/9)E[Y]/2} \leq e^{-m/24}.$$

Now, if we take $m \geq 24\ln(1/\delta) = O(\log(1/\delta))$, we get $P[|\hat{X} - E[X]| \geq \epsilon E[X]] \leq \delta$. Note though that we only needed to count the weak estimates that fell outside $(1 \pm \epsilon)E[X]$; we did not need to assume they are bounded, or have bounded variance, etc. If instead we had used the mean of the weak estimates, we could have a problem if some bad estimates happened to take very large values. Using the median made our estimate robust to such outliers.

Problem 4: (Randomized Set-Cover) In this problem, we'll look at randomized rounding, which is a very powerful technique for solving large-scale combinatorial optimization problems. The main idea is that given a problem which can be written as an optimization problem with integer constraints, we can sometimes solve the relaxed problem with non-integer constraints, and then round the solutions to get a good assignment. We will highlight this technique for the Minimum Set-Cover problem. We are given a collection of $m$ subsets $\{S_1, S_2, \ldots, S_m\}$ of some large set $U$ of $n$ elements, such that $\bigcup_i S_i = U$. The Minimum Set-Cover problem is that of selecting the smallest number of sets $C$ from the collection $\{S_1, S_2, \ldots, S_m\}$ such that they cover $U$, i.e., such that each element in $U$ lies in at least one of the sets in $C$.

Part (a) Argue that the minimum set-cover problem is equivalent to the following integer program:

$$\text{Minimize } \sum_i x_i \quad \text{subject to} \quad \sum_{i:\, e \in S_i} x_i \geq 1 \;\;\forall e \in U, \qquad x_i \in \{0,1\} \;\;\forall i \in \{1, 2, \ldots, m\}$$

Let the solution to this problem, i.e., the minimum set-cover, be denoted $OPT$. Next, argue that if we solve the same problem, but now change the last constraint to $x_i \in [0,1]$ for all $i$, then the resulting solution $OPT_{LP}$ of this relaxed problem obeys $OPT_{LP} \leq OPT$. Note that the relaxed problem is an LP and hence can be solved efficiently.

Solution: For each set $S_i$, we associate a variable $x_i \in \{0,1\}$ that indicates if we want to choose $S_i$ or not. We can then write any solution for the Minimum Set-Cover problem as a vector $x \in \{0,1\}^m$. The objective is clearly to minimize the sum of these variables; moreover, since the sets we select must cover each element in $U$, we need one constraint per element to ensure this. If we replace each constraint $x_i \in \{0,1\}$ with $x_i \in [0,1]$, then the resulting problem can be solved in polynomial time as it is a Linear Program (LP). However, by replacing integer constraints with continuous domains, we have expanded the set of feasible solutions; thus, the resulting minimization problem must give a smaller (or equal) value, as all feasible integer solutions are within our new feasible region, and hence we have that $OPT_{LP} \leq OPT$.

Part (b) Given a solution $z$ to the relaxed LP, we now round the values to obtain a feasible solution for the original minimum set-cover problem. For each set $S_i$, we generate $k = c\log n$ i.i.d. $Bernoulli(z_i)$ random variables $X_{i,1}, X_{i,2}, \ldots, X_{i,k}$; if any of them is 1, then we set $x_i = 1$, i.e., we add $S_i$ to our cover $C$. Prove that the resulting set-cover obeys $E[|C|] \leq c\log n \cdot OPT$.

Solution: First, from the definition of the rounding process, for any $j \in \{1, 2, \ldots, k\}$, we have $E[X_{i,j}] = P[X_{i,j} = 1] = z_i$, and $\sum_i z_i = OPT_{LP}$. Therefore, by linearity of expectation, we have:

$$E[|C|] \leq \sum_{j=1}^{c\log n}\sum_i E[X_{i,j}] = c\log n \cdot OPT_{LP} \leq c\log n \cdot OPT.$$

Part (c) Finally, choose $c$ to ensure that the probability that the resulting set-cover $C$ fails to cover some element $e \in U$ is less than $1/n$.

Solution: For any element $e \in U$, and for any $j \in \{1, 2, \ldots, c\log n\}$, let $Y_{e,j} = \sum_{i:\, e \in S_i} X_{i,j}$, and let $Y_e = \sum_{j=1}^{c\log n} Y_{e,j}$. Note that for the sets $S_i$ containing element $e$, if any of the $X_{i,j} = 1$, then $e$ is covered; in other words, $P[e \text{ is covered}] = P[Y_e \geq 1]$. Moreover, $E[Y_{e,j}] = \sum_{i:\, e \in S_i} E[X_{i,j}] = \sum_{i:\, e \in S_i} z_i$. Let $k_e = |\{i : e \in S_i\}|$; since $e$ is fractionally covered in the LP, we have $\sum_{i:\, e \in S_i} z_i \geq 1$, and thus $E[Y_e] \geq c\log n$. Now using our basic lower-tail Chernoff bound, we have:

$$P[Y_e \leq 0] = P[Y_e \leq (1 - 1)E[Y_e]] \leq e^{-E[Y_e]/2} \leq e^{-(c\log n)/2} = \frac{1}{n^{c/2}}$$

Thus $P[e \text{ is not covered}] \leq n^{-c/2}$. Moreover, by the union bound, we have that:

$$P[\text{Any element } e \in U \text{ is not covered}] \leq n^{1 - c/2}.$$

If $c \geq 6$, we get $n^{1 - c/2} \leq n^{-2}$, and hence $P[\text{Every element } e \in U \text{ is covered}] \geq 1 - \frac{1}{n^2}$.
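Parts (a) through (c) combine into a short pipeline: solve the LP relaxation, then round each coordinate with $c\log n$ Bernoulli draws. Here is a hedged sketch of the whole procedure (assuming scipy and numpy are available; the instance at the bottom is made up, and the sketch does not handle infeasible LPs):

```python
import math
import random
import numpy as np
from scipy.optimize import linprog

def randomized_set_cover(sets, universe, const=6.0):
    """Solve the set-cover LP relaxation, then round: keep S_i if any of
    k = ceil(const * log n) i.i.d. Bernoulli(z_i) draws comes up 1."""
    n, m = len(universe), len(sets)
    # cover constraints A x >= 1, written as -A x <= -1 for linprog
    A = np.array([[1.0 if e in s else 0.0 for s in sets] for e in universe])
    res = linprog(c=np.ones(m), A_ub=-A, b_ub=-np.ones(n), bounds=[(0, 1)] * m)
    z = res.x  # fractional optimum, sum(z) = OPT_LP
    k = max(1, math.ceil(const * math.log(n)))
    return {i for i in range(m) if any(random.random() < z[i] for _ in range(k))}

sets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}, {1, 4}]
print(randomized_set_cover(sets, universe=range(6)))
```

With const = 6 (per the bound just derived; the note below shows $c = 3$ also suffices), the output covers every element with probability at least $1 - 1/n^2$, at an expected cost of $O(\log n) \cdot OPT$ sets.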

Note: You can actually get a tighter bound of $c \geq 3$ using the following argument. First, we need to observe that the probability of $e$ being covered is minimized when the $z_i$ are all equal, i.e., $z_1 = \cdots = z_{k_e} = 1/k_e$ (this is not obvious; try to prove it...). Thus we get for any $j$:

$$P[Y_{e,j} = 0] = (1 - z_1)(1 - z_2)\cdots(1 - z_{k_e}) \leq \left(1 - \frac{1}{k_e}\right)^{k_e} \leq \frac{1}{e}.$$

Therefore, each element is covered with probability at least $1 - 1/e$ if we draw only one sample of each $X_i$. Since we draw $c\log n$ samples, the probability that element $e$ is not covered is:

$$P[e \text{ is not covered}] \leq \left(\frac{1}{e}\right)^{c\log n} = \frac{1}{n^c}.$$

Now via the union bound, we see that if we take $c = 3$, then we'll have:

$$P[\text{Any element } e \in U \text{ is not covered}] \leq \frac{1}{n^2}.$$

Problem 5: (Papadimitriou's 2-SAT Algorithm) (Optional) The 2-SAT problem (Wikipedia entry), and more generally, the boolean satisfiability (SAT) problem (Wikipedia entry), are one of the cornerstones of theoretical algorithms, and also a very useful modeling tool for a variety of optimization problems. In the general SAT problem, we want to find a satisfying assignment for a given Boolean expression in $n$ Boolean (i.e., $\{0,1\}$, or FALSE/TRUE) variables $\{X_1, X_2, \ldots, X_n\}$, typically involving conjunctions (i.e., logical AND, denoted $\wedge$), disjunctions (i.e., logical OR, denoted $\vee$) and negations (logical NOT; typically $\bar{X}$ denotes the negation of a variable $X$). In 2-SAT, the expression is restricted to being a conjunction (AND) of several clauses, where each clause is the disjunction (OR) of two literals (either a variable or its negation). For example, the expression $(X_1 \vee \bar{X}_2) \wedge (\bar{X}_1 \vee X_3) \wedge (X_2 \vee \bar{X}_3)$ is a 2-SAT formula, which has several satisfying assignments (including $X_1 = 1, X_2 = 1, X_3 = 1$). Note that in order to find a satisfying assignment, we need to set the variables such that each clause in the formula has at least one literal which is TRUE. Although the general SAT problem is known to be NP-complete, 2-SAT can be solved in polynomial time; we will now see a simple randomized algorithm that demonstrates this fact:

(Papadimitriou's 2-SAT Algorithm): Given a 2-CNF formula $F$ involving $n$ Boolean variables, and an arbitrary assignment $\tau$, we check if $\tau$ satisfies $F$. If not, we pick an arbitrary unsatisfied clause, pick one of its two literals uniformly at random, and flip it to get a new assignment $\tau'$. We then repeat this until we find a satisfying assignment.

Part (a) Assume that $F$ has a unique satisfying assignment $\tau^*$, and for any assignment $\tau$, let $N(\tau)$ be the number of literals in $\tau$ which agree (i.e., have the same value) with the corresponding literal in $\tau^*$.

Argue that each time we execute an iteration of Papadimitriou's 2-SAT algorithm with input assignment $\tau$, the new assignment $\tau'$ satisfies:

$$N(\tau') = \begin{cases} N(\tau) + 1 & \text{with probability at least } 1/2 \\ N(\tau) - 1 & \text{with probability at most } 1/2 \end{cases}$$

Solution: Given any unsatisfied clause, we know that at least one of its two literals is set differently in $\tau$ than in $\tau^*$ (it could be that both are). Since we choose to flip a uniformly random literal out of the two, we are guaranteed to increase the agreement between our guess $\tau$ and $\tau^*$ by 1 with probability at least $1/2$.

Part (b) Based on the above, argue that the running time $T_n$ of the algorithm is upper bounded by the first time that a symmetric random walk starting from 0 hits $n$ or $-n$ (equivalently, $T_n = \arg\min_{K>0}\left\{\left|\sum_{i=1}^K X_i\right| = n\right\}$, where the $X_i$ are i.i.d. Rademacher random variables). Next, show that $E[T_n] = n^2$, and thus, prove that after $O(n^3)$ iterations, Papadimitriou's algorithm terminates with probability greater than $1 - 1/n$.

Solution: See this lecture (and the next) from Tim Roughgarden's algorithms course for a beautiful exposition of this proof. One thing you should note: although running the algorithm for $O(n^3)$ steps gives the desired probability, one can get much better bounds if we instead stop after $O(n^2)$ steps, and then restart the process. Markov's inequality tells us that after $2n^2$ steps, we find the solution with probability at least $1/2$; now from previous assignments, you know that doing $O(\log n)$ independent trials is sufficient to amplify the probability to $1 - 1/n$. However, the analysis works for any starting point, so this does suggest that $O(n^2\log n)$ steps of Papadimitriou's algorithm were sufficient... Is this surprising? Not really: basically what this says is that the failure probability does follow a Chernoff-style exponential decay. We could not show this directly as the random variables were not independent; however, Markov's inequality still works, and we can exploit that in a clever way to get our result.
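For completeness, here is a sketch of the walk itself (an illustrative implementation, not from the original notes; literals are encoded as signed integers, and the step budget follows the $O(n^2)$-plus-restarts discussion above):

```python
import random

def papadimitriou_2sat(clauses, n, max_steps=None):
    """Random-walk 2-SAT. clauses: pairs of nonzero ints, where literal
    +i means x_i and -i means NOT x_i. Returns an assignment or None."""
    if max_steps is None:
        max_steps = 2 * n * n  # one O(n^2) walk; restart externally to amplify
    tau = {i: random.random() < 0.5 for i in range(1, n + 1)}
    sat = lambda lit: tau[abs(lit)] == (lit > 0)
    for _ in range(max_steps):
        unsat = [cl for cl in clauses if not (sat(cl[0]) or sat(cl[1]))]
        if not unsat:
            return tau
        lit = random.choice(random.choice(unsat))  # random literal of a bad clause
        tau[abs(lit)] = not tau[abs(lit)]          # flip it
    return None  # walk budget exhausted; caller restarts O(log n) times

# the example formula from the problem statement, satisfied by (1, 1, 1)
clauses = [(1, -2), (-1, 3), (2, -3)]
print(papadimitriou_2sat(clauses, n=3))
```

Wrapping the $O(n^2)$ walk in $O(\log n)$ restarts then gives the $1 - 1/n$ success probability with $O(n^2\log n)$ total work, as noted above.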
