Lecture Space-Bounded Derandomization
Notes on Complexity Theory, Jonathan Katz. Last updated: October 2008.

1 Space-Bounded Derandomization

We now discuss derandomization of space-bounded algorithms. Here, non-trivial results can be shown without making any unproven assumptions, in contrast to what is currently known for derandomizing time-bounded algorithms. We show first that $BPL \subseteq SPACE(\log^2 n)$, and then improve the analysis to show that $BPL \subseteq TimeSpc(poly(n), \log^2 n) \subseteq SC$. (Note: we already know $RL \subseteq NL \subseteq SPACE(\log^2 n)$, but this does not by itself imply $BPL \subseteq SPACE(\log^2 n)$.)

With regard to the first result, we actually prove something more general:

Theorem 1. Any randomized algorithm (with two-sided error) that uses space $S = \Omega(\log n)$ and $R$ random bits can be converted to one that uses space $O(S \log R)$ and $O(S \log R)$ random bits.

Since any algorithm using space $S$ uses time at most $2^S$ (by our convention regarding probabilistic machines), and hence at most this many random bits, the following is an immediate corollary:

Corollary 2. For $S = \Omega(\log n)$ it holds that $BPSPACE(S) \subseteq SPACE(S^2)$.

Proof. Let $L \in BPSPACE(S)$. Theorem 1 shows that $L$ can be decided by a probabilistic machine with two-sided error using $O(S^2)$ space and $O(S^2)$ random bits. Enumerating over all choices of the random bits and taking majority, we obtain a deterministic algorithm that uses $O(S^2)$ space.

2 $BPL \subseteq SPACE(\log^2 n)$

We now prove Theorem 1. Let $M$ be a probabilistic machine running in space $S$ (and time $2^S$), using $R$ random bits, and deciding a language $L$ with two-sided error. (Note that $S, R$ are functions of the input length $n$, and the theorem requires $S = \Omega(\log n)$.) We will assume without loss of generality that $M$ always uses exactly $R$ random bits on all inputs. Fixing an input $x$ and letting $\ell$ be some parameter, we will view the computation of $M_x$ as a random walk on a multi-graph in the following way: the nodes of the graph correspond to all $N \stackrel{def}{=} 2^{O(S)}$ possible configurations of $M_x$, and there is an edge from $a$ to $b$ labeled by the string $r \in \{0,1\}^\ell$ if and only if $M_x$ moves from configuration $a$ to configuration $b$ after reading $r$ as its next $\ell$ random bits.
Computation of $M_x$ is then equivalent to a random walk of length $R/\ell$ on this graph, beginning from the node corresponding to the initial configuration of $M_x$. If $x \in L$ then the probability that this random walk ends up in an accepting state is at least $2/3$, while if $x \notin L$ then the probability that this random walk ends up in an accepting state is at most $1/3$.

(Footnotes: $BPL$ is the two-sided-error version of $RL$. $SC$ stands for "Steve's class," and captures computation that simultaneously uses polynomial time and polylogarithmic space.)
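The random-walk view is easy to make concrete. The sketch below (a toy step function and toy parameters of my own choosing, not an actual machine $M$) builds the column-stochastic transition matrix of a small configuration graph and runs the $R/\ell$-step walk on it:

```python
import numpy as np

def transition_matrix(step, n_configs, l):
    """Q[j, i] = Pr[machine moves from configuration i to j after
    reading one l-bit block of random bits]; columns sum to 1."""
    Q = np.zeros((n_configs, n_configs))
    for i in range(n_configs):
        for r in range(2 ** l):
            Q[step(i, r), i] += 2.0 ** (-l)
    return Q

# Toy stand-in for M_x: 8 configurations, and each l-bit block moves
# the configuration forward by the block's Hamming weight.
def toy_step(config, r):
    return (config + bin(r).count("1")) % 8

Q = transition_matrix(toy_step, 8, l=2)
s = np.zeros(8)
s[0] = 1.0                                  # initial configuration
final = np.linalg.matrix_power(Q, 4) @ s    # a walk of length R/l = 4
print(Q.sum(axis=0))    # every column sums to 1
print(final.sum())      # the result is again a distribution
```

Reading off the probability mass on the accepting configurations of `final` is exactly the acceptance probability of the machine.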
It will be convenient to represent this process using an $N \times N$ transition matrix $Q_x$, where the entry in column $i$, row $j$ is the probability that $M_x$ moves from configuration $i$ to configuration $j$ after reading $\ell$ random bits. Vectors of length $N$ whose entries are non-negative and sum to 1 correspond to probability distributions over the configurations of $M_x$ in the natural way. If we let $s$ denote the probability distribution that places probability 1 on the initial configuration of $M_x$ (and 0 elsewhere), then $Q_x^{R/\ell} s$ corresponds to the probability distribution over the final configuration of $M_x$; thus (amplifying the error probability of $M$ if necessary):

  $x \in L \implies \left(Q_x^{R/\ell} s\right)_{accept} \geq 3/4$
  $x \notin L \implies \left(Q_x^{R/\ell} s\right)_{accept} \leq 1/4$.

The statistical difference between two vectors/probability distributions $s, s'$ is

  $SD(s, s') \stackrel{def}{=} \frac{1}{2} \|s - s'\|_1 = \frac{1}{2} \sum_i |s_i - s'_i|$.

If $Q, Q'$ are two transition matrices (meaning that all entries are non-negative and the entries in each column sum to 1), then we abuse notation and define

  $SD(Q, Q') \stackrel{def}{=} \max_s \{SD(Qs, Q's)\}$,

where the maximum is taken over all $s$ that correspond to probability distributions. Note that if $Q, Q'$ are $N \times N$ transition matrices and $\max_{i,j} |Q_{i,j} - Q'_{i,j}| \leq \varepsilon$, then $SD(Q, Q') \leq N\varepsilon/2$.

2.1 A Useful Lemma

The pseudorandom generator we construct will use a family $H$ of pairwise-independent functions as a building block.

Definition 1. $H = \{h : \{0,1\}^\ell \to \{0,1\}^\ell\}$ is a family of pairwise-independent functions if for all distinct $x_1, x_2 \in \{0,1\}^\ell$ and any $y_1, y_2 \in \{0,1\}^\ell$ we have

  $\Pr_{h \in H}[h(x_1) = y_1 \wedge h(x_2) = y_2] = 2^{-2\ell}$.

It is easy to construct a pairwise-independent family $H$ whose functions map $\ell$-bit strings to $\ell$-bit strings and such that (1) $|H| = 2^{2\ell}$ (and so choosing a random member of $H$ is equivalent to choosing a random $2\ell$-bit string), and (2) functions in $H$ can be evaluated in $O(\ell)$ space.

For $S \subseteq \{0,1\}^\ell$, define $\rho(S) \stackrel{def}{=} |S|/2^\ell$. We define a useful property and then show that a function chosen from a pairwise-independent family satisfies the property with high probability.

Definition 2. Let $A, B \subseteq \{0,1\}^\ell$, $h : \{0,1\}^\ell \to \{0,1\}^\ell$, and $\varepsilon > 0$. We say $h$ is $(\varepsilon, A, B)$-good if

  $\left| \Pr_{x \in \{0,1\}^\ell}[x \in A \wedge h(x) \in B] - \rho(A)\,\rho(B) \right| \leq \varepsilon$.
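For intuition, here is a sketch of the standard affine construction $h_{a,b}(x) = a \cdot x + b$ over $GF(2^\ell)$. The text only asserts such a family exists; the choice of $\ell = 3$, the field modulus, and the test sets $A, B$ below are mine. The sketch exhaustively checks pairwise independence, and also the variance quantity $\mu$ that the proof of Lemma 3 (next) bounds:

```python
L = 3                  # l = 3: the field GF(2^3), modulus x^3 + x + 1
MOD = 0b1011

def gf_mul(a, b):
    """Carry-less multiplication in GF(2^L), reduced modulo MOD."""
    p = 0
    for _ in range(L):
        if b & 1:
            p ^= a
        b >>= 1
        a <<= 1
        if a & (1 << L):
            a ^= MOD
    return p

def h(a, b, x):        # h_{a,b}(x) = a*x + b; |H| = 2^(2L) = 64
    return gf_mul(a, x) ^ b

# Pairwise independence: for distinct x1, x2 and any y1, y2, exactly
# one of the 64 functions maps x1 -> y1 and x2 -> y2, i.e. Pr = 2^(-2L).
ok = all(
    sum(h(a, b, x1) == y1 and h(a, b, x2) == y2
        for a in range(8) for b in range(8)) == 1
    for x1 in range(8) for x2 in range(8) if x1 != x2
    for y1 in range(8) for y2 in range(8))
print(ok)   # True

# The quantity mu from the proof of Lemma 3: for sets A, B of our
# choosing, E_h[(rho(B) - Pr_{x in A}[h(x) in B])^2] = rho(B)(1-rho(B))/|A|.
A, B = {1, 2, 3}, {0, 5}
rho_B = len(B) / 2 ** L
mu = sum((rho_B - sum((h(a, b, x) in B) for x in A) / len(A)) ** 2
         for a in range(8) for b in range(8)) / 64
print(mu, rho_B * (1 - rho_B) / len(A))   # both approximately 0.0625
```

Pairwise independence holds because, for $x_1 \neq x_2$, the map $(a,b) \mapsto (a x_1 + b,\, a x_2 + b)$ is a bijection on $GF(2^\ell)^2$.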
Note that this is equivalent to saying that $h$ is $(\varepsilon, A, B)$-good if

  $\left| \Pr_{x \in A}[h(x) \in B] - \rho(B) \right| \leq \varepsilon/\rho(A)$.

Lemma 3. Let $A, B \subseteq \{0,1\}^\ell$, let $H$ be a family of pairwise-independent functions, and let $\varepsilon > 0$. Then:

  $\Pr_{h \in H}\left[h \text{ is not } (\varepsilon, A, B)\text{-good}\right] \leq \frac{\rho(A)\rho(B)}{2^\ell \varepsilon^2}$.

Proof. The proof is fairly straightforward. Consider the quantity

  $\mu \stackrel{def}{=} E_{h \in H}\left[\left(\rho(B) - \Pr_{x \in A}[h(x) \in B]\right)^2\right]$
  $\;\;= E_{h \in H}\left[\rho(B)^2 + \Pr_{x_1 \in A}[h(x_1) \in B] \cdot \Pr_{x_2 \in A}[h(x_2) \in B] - 2\rho(B) \Pr_{x_1 \in A}[h(x_1) \in B]\right]$
  $\;\;= \rho(B)^2 + E_{x_1, x_2 \in A;\, h \in H}\left[\delta_{h(x_1) \in B} \cdot \delta_{h(x_2) \in B}\right] - 2\rho(B) \cdot E_{x_1 \in A;\, h \in H}\left[\delta_{h(x_1) \in B}\right]$,

where $\delta_{h(x) \in B}$ is an indicator random variable which is equal to 1 if $h(x) \in B$ and 0 otherwise. Since $H$ is pairwise independent, it follows that:

- For any $x_1$ we have $E_{h \in H}[\delta_{h(x_1) \in B}] = \Pr_{h \in H}[h(x_1) \in B] = \rho(B)$.
- For any $x_1 = x_2$ we have $E_{h \in H}[\delta_{h(x_1) \in B} \cdot \delta_{h(x_2) \in B}] = E_{h \in H}[\delta_{h(x_1) \in B}] = \rho(B)$.
- For any $x_1 \neq x_2$ we have $E_{h \in H}[\delta_{h(x_1) \in B} \cdot \delta_{h(x_2) \in B}] = \Pr_{h \in H}[h(x_1) \in B \wedge h(x_2) \in B] = \rho(B)^2$.

Using the above (and noting that $x_1 = x_2$ with probability $1/|A|$), we obtain

  $\mu = \rho(B)^2 + \frac{1}{|A|}\left(\rho(B) + (|A| - 1)\rho(B)^2\right) - 2\rho(B)^2 = \frac{\rho(B) - \rho(B)^2}{|A|} = \frac{\rho(B)(1 - \rho(B))}{|A|}$.

Using Markov's inequality,

  $\Pr_{h \in H}\left[h \text{ is not } (\varepsilon, A, B)\text{-good}\right] = \Pr_{h \in H}\left[\left(\Pr_{x \in A}[h(x) \in B] - \rho(B)\right)^2 > (\varepsilon/\rho(A))^2\right] \leq \frac{\mu\, \rho(A)^2}{\varepsilon^2} = \frac{\rho(B)(1 - \rho(B))\rho(A)^2}{|A|\, \varepsilon^2} \leq \frac{\rho(A)\rho(B)}{2^\ell \varepsilon^2}$,

using $|A| = \rho(A) \cdot 2^\ell$ in the last step.

2.2 The Pseudorandom Generator and Its Analysis

2.2.1 The Basic Step

We first show how to reduce the number of random bits by (roughly) half. Let $H$ denote a pairwise-independent family of functions, and fix an input $x$. Let $Q$ denote the transition matrix corresponding to transitions in $M_x$ after reading $\ell$ random bits; that is, the $(i,j)$th entry of $Q$ is the
probability that $M_x$, starting in configuration $i$, moves to configuration $j$ after reading $\ell$ random bits. Then $Q^2$ is the transition matrix whose $(i,j)$th entry is the probability that $M_x$, starting in configuration $i$, moves to configuration $j$ after reading $2\ell$ random bits. Fixing $h \in H$, let $Q_h$ be the transition matrix whose $(i,j)$th entry is the probability that $M_x$, starting in configuration $i$, moves to configuration $j$ after reading the $2\ell$ random bits $r \| h(r)$ (where $r \in \{0,1\}^\ell$ is chosen uniformly at random). Put differently, $Q^2$ corresponds to taking two uniform and independent steps of a random walk, whereas $Q_h$ corresponds to taking two steps of a random walk where the first step (given by $r$) is random and the second step (namely, $h(r)$) is a deterministic function of the first. We now show that these two transition matrices are very close. Specifically:

Definition 3. Let $Q, Q_h, \ell$ be as defined above, and $\varepsilon \geq 0$. We say $h \in H$ is $\varepsilon$-good for $Q$ if $SD(Q_h, Q^2) \leq \varepsilon/2$.

Lemma 4. Let $H$ be a pairwise-independent function family, and let $Q$ be an $N \times N$ transition matrix where transitions correspond to reading $\ell$ random bits. For any $\varepsilon > 0$ we have:

  $\Pr_{h \in H}\left[h \text{ is not } \varepsilon\text{-good for } Q\right] \leq \frac{N^6}{\varepsilon^2 2^\ell}$.

Proof. For $i, j \in [N]$ (corresponding to configurations of $M_x$), define

  $B_{i,j} \stackrel{def}{=} \{r \in \{0,1\}^\ell \mid r \text{ takes } Q \text{ from } i \text{ to } j\}$.

For fixed $i, j, k$, we know from Lemma 3 that the probability that $h$ is not $(\varepsilon/N^2, B_{i,j}, B_{j,k})$-good is at most $N^4 \rho(B_{i,j})/\varepsilon^2 2^\ell$. Applying a union bound over all $N^3$ triples $i, j, k \in [N]$, and noting that for any $i$ we have $\sum_j \rho(B_{i,j}) = 1$, we have that $h$ is $(\varepsilon/N^2, B_{i,j}, B_{j,k})$-good for all $i, j, k$ except with probability at most $N^6/\varepsilon^2 2^\ell$.

We show that whenever $h$ is $(\varepsilon/N^2, B_{i,j}, B_{j,k})$-good for all $i, j, k$, then $h$ is $\varepsilon$-good for $Q$. Consider the $(i,k)$th entry of $Q_h$; this is given by $\sum_{j \in [N]} \Pr_r[r \in B_{i,j} \wedge h(r) \in B_{j,k}]$. On the other hand, the $(i,k)$th entry of $Q^2$ is $\sum_{j \in [N]} \rho(B_{i,j}) \cdot \rho(B_{j,k})$. Since $h$ is $(\varepsilon/N^2, B_{i,j}, B_{j,k})$-good for every $i, j, k$, the absolute value of their difference is

  $\left| \sum_{j \in [N]} \left( \Pr_r[r \in B_{i,j} \wedge h(r) \in B_{j,k}] - \rho(B_{i,j})\,\rho(B_{j,k}) \right) \right| \leq \sum_{j \in [N]} \left| \Pr_r[r \in B_{i,j} \wedge h(r) \in B_{j,k}] - \rho(B_{i,j})\,\rho(B_{j,k}) \right| \leq \sum_{j \in [N]} \varepsilon/N^2 = \varepsilon/N$.

It follows that $SD(Q_h, Q^2) \leq N \cdot (\varepsilon/N)/2 = \varepsilon/2$, as desired.
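The basic step can be observed numerically. The sketch below (reusing the toy machine and toy $GF(2^3)$ field from the earlier snippets; all parameters are mine, and $\ell = 3$ is far too small for the lemma's bound to bite) builds $Q$, $Q^2$, and $Q_h$ for every $h$ in the affine family and measures $SD(Q_h, Q^2)$. Since $SD$ of matrices is a maximum of a convex function over distributions, it is attained at a point distribution, so it equals half the largest column $\ell_1$-distance:

```python
import numpy as np

L = 3; MOD = 0b1011            # toy field GF(2^3)

def gf_mul(a, b):
    p = 0
    for _ in range(L):
        if b & 1:
            p ^= a
        b >>= 1
        a <<= 1
        if a & (1 << L):
            a ^= MOD
    return p

n = 8                          # toy configuration graph on 8 configurations
def step(i, r):
    return (3 * i + r + bin(r).count("1")) % n

Q = np.zeros((n, n))           # one l-bit block of randomness
for i in range(n):
    for r in range(2 ** L):
        Q[step(i, r), i] += 2.0 ** (-L)
Q2 = Q @ Q                     # two independent blocks

def sd(M1, M2):
    # max over point distributions: half the largest column l1-distance
    return 0.5 * np.abs(M1 - M2).sum(axis=0).max()

# Q_h: first block r is random, second block is h(r) = a*r + b
sds = []
for a in range(2 ** L):
    for b in range(2 ** L):
        Qh = np.zeros((n, n))
        for i in range(n):
            for r in range(2 ** L):
                Qh[step(step(i, r), gf_mul(a, r) ^ b), i] += 2.0 ** (-L)
        sds.append(sd(Qh, Q2))
print(min(sds), max(sds))      # spread of SD(Q_h, Q^2) over all 64 h
```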
The lemma above gives us a pseudorandom generator that reduces the required randomness by (roughly) half. Specifically, define a pseudorandom generator $G_1 : \{0,1\}^{2\ell + R/2} \to \{0,1\}^R$ via

  $G_1(r_1, \ldots, r_{R/2\ell};\, h) \stackrel{def}{=} r_1 \| h(r_1) \| \cdots \| r_{R/2\ell} \| h(r_{R/2\ell})$,   (1)
where $h \in H$ (so $|h| = 2\ell$) and each $r_i \in \{0,1\}^\ell$. Assume $h$ is $\varepsilon$-good for $Q$. Running $M_x$ using the output of $G_1(\cdot\,; h)$ as its random tape generates the probability distribution $\underbrace{Q_h \cdots Q_h}_{R/2\ell}\, s$ for the final configuration, where $s$ denotes the initial configuration of $M_x$ (i.e., $s$ is the probability distribution that places probability 1 on the initial configuration of $M_x$, and 0 elsewhere). Running $M_x$ on a truly random tape generates the probability distribution $\underbrace{Q^2 \cdots Q^2}_{R/2\ell}\, s$ for the final configuration. Letting $k = R/2\ell$, we have

  $2 \cdot SD\left(Q_h^k s,\; (Q^2)^k s\right) = \left\| Q_h^k s - (Q^2)^k s \right\|_1$
  $\;\;\leq \sum_{i=0}^{k-1} \left\| Q_h^{k-i} (Q^2)^i s - Q_h^{k-i-1} (Q^2)^{i+1} s \right\|_1$
  $\;\;= \sum_{i=0}^{k-1} \left\| Q_h^{k-i-1} \left(Q_h - Q^2\right) (Q^2)^i s \right\|_1 \leq \sum_{i=0}^{k-1} \left\| \left(Q_h - Q^2\right) (Q^2)^i s \right\|_1 \leq k\varepsilon$,

using the triangle inequality, the fact that applying a transition matrix never increases the $\ell_1$ norm, and the fact that $\|(Q_h - Q^2)v\|_1 \leq 2 \cdot SD(Q_h, Q^2) \leq \varepsilon$ for any distribution $v$. This means that the behavior of $M_x$ when run using the output of the pseudorandom generator is very close to its behavior when run using a truly random tape: in particular, if $x \notin L$ then $M_x$ in the former case accepts with probability at most

  $\Pr[M_x \text{ accepts} \mid h \text{ is } \varepsilon\text{-good for } Q] + \Pr[h \text{ is not } \varepsilon\text{-good for } Q] \leq (1/4 + k\varepsilon/2) + N^6/\varepsilon^2 2^\ell$;

similarly, if $x \in L$ then $M_x$ in the former case accepts with probability at least $3/4 - k\varepsilon/2 - N^6/\varepsilon^2 2^\ell$. Summarizing (and slightly generalizing):

Corollary 5. Let $H$ be a pairwise-independent function family, let $Q$ be an $N \times N$ transition matrix where transitions correspond to reading $\ell$ random bits, let $k > 0$ be an integer, and let $\varepsilon > 0$. Then except with probability at most $N^6/\varepsilon^2 2^\ell$ over choice of $h \in H$ we have:

  $SD\left(\underbrace{Q_h \cdots Q_h}_{k},\; \underbrace{Q^2 \cdots Q^2}_{k}\right) \leq k\varepsilon/2$.
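The one-level generator of Equation (1) is straightforward to implement block-by-block over $\ell$-bit strings. The sketch below uses the same toy affine family over $GF(2^3)$ as the earlier snippets (the field and parameters are illustrative choices, not part of the construction in the text):

```python
L = 3; MOD = 0b1011            # toy field GF(2^3) for the hash family

def gf_mul(a, b):
    p = 0
    for _ in range(L):
        if b & 1:
            p ^= a
        b >>= 1
        a <<= 1
        if a & (1 << L):
            a ^= MOD
    return p

def G1(blocks, ab):
    """Equation (1): expand r_1..r_k (R/2 bits) plus a 2l-bit h = (a, b)
    into r_1 || h(r_1) || ... || r_k || h(r_k)  (R bits)."""
    a, b = ab
    out = []
    for r in blocks:
        out += [r, gf_mul(a, r) ^ b]
    return out

print(G1([1, 4, 7], (3, 5)))   # [1, 6, 4, 2, 7, 7]: six blocks from three
```

The seed is the $R/2$ bits of `blocks` plus the $2\ell$ bits describing $h$, and the output is $R$ bits, as in the text.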
2.2.2 Recursing

Fixing $h_1 \in H$, note that $Q_{h_1}$ is a transition matrix, and so we can apply Corollary 5 to it as well. Moreover, if $Q$ uses $R$ random bits then $Q_{h_1}$ uses $R/2$ random bits (treating $h_1$ as fixed). Continuing in this way for $I \stackrel{def}{=} \log(R/2\ell) + 1 = \log(R/\ell)$ iterations, we obtain a transition matrix $Q_{h_1,\ldots,h_I}$. Say all the $h_i$ are $\varepsilon$-good if $h_1$ is $\varepsilon$-good for $Q$, and for each $i > 1$ it holds that $h_i$ is $\varepsilon$-good for $Q_{h_1,\ldots,h_{i-1}}$. By Corollary 5 we have:

- All the $h_i$ are $\varepsilon$-good except with probability at most $N^6 I/\varepsilon^2 2^\ell$.
- If all the $h_i$ are $\varepsilon$-good, then

  $SD\left(Q_{h_1,\ldots,h_I},\; \underbrace{Q^2 \cdots Q^2}_{R/2\ell}\right) \leq \frac{\varepsilon}{2}\left(2^{I-1} + \cdots + 2 + 1\right) = \frac{\varepsilon}{2}\left(\frac{R}{\ell} - 1\right)$.

Equivalently, we obtain a pseudorandom generator

  $G_I(r;\, h_1, \ldots, h_I) \stackrel{def}{=} G_{I-1}(r;\, h_1, \ldots, h_{I-1}) \,\|\, G_{I-1}(h_I(r);\, h_1, \ldots, h_{I-1})$,

where $G_1$ is as in Equation (1).

2.2.3 Putting It All Together

We now easily obtain the desired derandomization. Recall $N = 2^{O(S)}$. Set $\varepsilon = 2^{-S}/10$, and set $\ell = \Theta(S)$ so that $\frac{N^6 S}{\varepsilon^2 2^\ell} \leq 1/20$. Then the number of random bits used as input to $G_I$ (from the previous section) is $O(\ell \cdot \log(R/\ell) + \ell) = O(S \log R)$, and the space used is bounded by that as well (using the fact that each $h \in H$ can be evaluated using space $O(\ell) = O(S)$). All the $h_i$ are good except with probability at most $N^6 \log(R/\ell)/\varepsilon^2 2^\ell \leq N^6 S/\varepsilon^2 2^\ell \leq 1/20$; assuming all the $h_i$ are good, the statistical difference between an execution of the original algorithm and the algorithm run with a pseudorandom tape is bounded by $\frac{2^{-S}}{20} \cdot R \leq 1/20$. Theorem 1 follows easily.

3 $BPL \subseteq SC$

A deterministic algorithm using space $O(\log^2 n)$ might potentially run for $2^{O(\log^2 n)}$ steps; in fact, as described, the algorithm from the proof of Corollary 2 uses this much time. For the particular pseudorandom generator we have described, however, it is possible to do better. The key observation is that instead of just choosing the $h_1, \ldots, h_I$ at random and simply hoping that they are all $\varepsilon$-good, we will instead deterministically search for $h_1, \ldots, h_I$ which are each $\varepsilon$-good.
This can be done in polynomial time (when $S = O(\log n)$) because: (1) for a given transition matrix $Q_{h_1,\ldots,h_{i-1}}$ and candidate $h_i$, it is possible to determine in polynomial time and polylogarithmic space whether $h_i$ is $\varepsilon$-good for $Q_{h_1,\ldots,h_{i-1}}$ (this relies on the fact that the number of configurations $N$ is polynomial in $n$); and (2) there are only a polynomial number of possibilities for each $h_i$ (since $\ell = \Theta(S) = O(\log n)$). Once we have found the good $\{h_i\}$, we then cycle through all possible choices of the seed $r \in \{0,1\}^\ell$ and take majority (as before). Since there are a polynomial number of possible seeds, the algorithm as a whole runs in polynomial time.
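A minimal sketch of this derandomized search, on the same toy machine and toy field as the earlier snippets (all parameters are mine; a real implementation would take the machine and $\varepsilon$ as inputs): at each level it exhaustively tries all $2^{2\ell}$ candidate hash functions and keeps the one making $Q_{h_1,\ldots,h_i}$ closest to the square of the previous matrix, then compares the final matrix against the true $R/\ell$-block walk:

```python
import numpy as np

L = 3; MOD = 0b1011            # toy field GF(2^3); h_{a,b}(x) = a*x + b

def gf_mul(a, b):
    p = 0
    for _ in range(L):
        if b & 1:
            p ^= a
        b >>= 1
        a <<= 1
        if a & (1 << L):
            a ^= MOD
    return p

n = 8                          # toy configuration graph
def step(i, r):
    return (3 * i + r + bin(r).count("1")) % n

def sd(M1, M2):
    return 0.5 * np.abs(M1 - M2).sum(axis=0).max()

def gen(r, hs):                # the recursive generator, block-by-block
    if not hs:
        return [r]
    *rest, (a, b) = hs
    return gen(r, rest) + gen(gf_mul(a, r) ^ b, rest)

def walk_matrix(hs):
    """M[j, i] = Pr over the single l-bit seed r that the machine,
    started at i and fed the blocks gen(r, hs), ends at j."""
    M = np.zeros((n, n))
    for i in range(n):
        for r in range(2 ** L):
            c = i
            for block in gen(r, hs):
                c = step(c, block)
            M[c, i] += 2.0 ** (-L)
    return M

hs = []                        # deterministic search, I = 3 levels
for _ in range(3):
    target = walk_matrix(hs) @ walk_matrix(hs)
    hs.append(min(((a, b) for a in range(2 ** L) for b in range(2 ** L)),
                  key=lambda ab: sd(walk_matrix(hs + [ab]), target)))

truth = np.linalg.matrix_power(walk_matrix([]), 8)   # true 8-block walk
dist = sd(walk_matrix(hs), truth)
print(hs, dist)   # chosen hash functions, and how far we are from truth
```

Enumerating the $2^\ell$ seeds of `walk_matrix(hs)` and taking majority is then the deterministic simulation.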
For completeness, we discuss the case of general $S = \Omega(\log n)$, assuming $R \leq 2^S$. Checking whether a particular $h_i$ is $\varepsilon$-good requires time $2^{O(S)}$. There are $2^{O(S)}$ functions to search through at each stage, and $O(S)$ stages altogether. Finally, once we obtain the good $\{h_i\}$ we must then enumerate through $2^{O(S)}$ seeds. (The end result is that $BPSPACE(S) \subseteq TimeSpc(2^{O(S)}, S^2)$.)

Bibliographic Notes

The results described here are due to [2, 3], both of which are very readable. See also [1, Lecture 16] for a slightly different presentation.

References

[1] O. Goldreich. Introduction to Complexity Theory (July 31, 1999).
[2] N. Nisan. Pseudorandom Generators for Space-Bounded Computation. STOC '90.
[3] N. Nisan. RL ⊆ SC. Computational Complexity 4: 1-11, 1994. (Preliminary version in STOC '92.)
More informationRandomness and Computation
Randomness and Computaton or, Randomzed Algorthms Mary Cryan School of Informatcs Unversty of Ednburgh RC 208/9) Lecture 0 slde Balls n Bns m balls, n bns, and balls thrown unformly at random nto bns usually
More informationTransfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system
Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng
More informationThe Feynman path integral
The Feynman path ntegral Aprl 3, 205 Hesenberg and Schrödnger pctures The Schrödnger wave functon places the tme dependence of a physcal system n the state, ψ, t, where the state s a vector n Hlbert space
More informationHidden Markov Models & The Multivariate Gaussian (10/26/04)
CS281A/Stat241A: Statstcal Learnng Theory Hdden Markov Models & The Multvarate Gaussan (10/26/04) Lecturer: Mchael I. Jordan Scrbes: Jonathan W. Hu 1 Hdden Markov Models As a bref revew, hdden Markov models
More informationMore metrics on cartesian products
More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of
More informationModule 9. Lecture 6. Duality in Assignment Problems
Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept
More informationThe Second Anti-Mathima on Game Theory
The Second Ant-Mathma on Game Theory Ath. Kehagas December 1 2006 1 Introducton In ths note we wll examne the noton of game equlbrum for three types of games 1. 2-player 2-acton zero-sum games 2. 2-player
More informationExpected Value and Variance
MATH 38 Expected Value and Varance Dr. Neal, WKU We now shall dscuss how to fnd the average and standard devaton of a random varable X. Expected Value Defnton. The expected value (or average value, or
More information2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification
E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton
More information9 Characteristic classes
THEODORE VORONOV DIFFERENTIAL GEOMETRY. Sprng 2009 [under constructon] 9 Characterstc classes 9.1 The frst Chern class of a lne bundle Consder a complex vector bundle E B of rank p. We shall construct
More informationx = , so that calculated
Stat 4, secton Sngle Factor ANOVA notes by Tm Plachowsk n chapter 8 we conducted hypothess tests n whch we compared a sngle sample s mean or proporton to some hypotheszed value Chapter 9 expanded ths to
More informationNUMERICAL DIFFERENTIATION
NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the
More informationn α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0
MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector
More informationCOS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture # 15 Scribe: Jieming Mao April 1, 2013
COS 511: heoretcal Machne Learnng Lecturer: Rob Schapre Lecture # 15 Scrbe: Jemng Mao Aprl 1, 013 1 Bref revew 1.1 Learnng wth expert advce Last tme, we started to talk about learnng wth expert advce.
More informationMath 217 Fall 2013 Homework 2 Solutions
Math 17 Fall 013 Homework Solutons Due Thursday Sept. 6, 013 5pm Ths homework conssts of 6 problems of 5 ponts each. The total s 30. You need to fully justfy your answer prove that your functon ndeed has
More informationExample: (13320, 22140) =? Solution #1: The divisors of are 1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 27, 30, 36, 41,
The greatest common dvsor of two ntegers a and b (not both zero) s the largest nteger whch s a common factor of both a and b. We denote ths number by gcd(a, b), or smply (a, b) when there s no confuson
More informationCredit Card Pricing and Impact of Adverse Selection
Credt Card Prcng and Impact of Adverse Selecton Bo Huang and Lyn C. Thomas Unversty of Southampton Contents Background Aucton model of credt card solctaton - Errors n probablty of beng Good - Errors n
More informationChapter 8 Indicator Variables
Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n
More informationOutline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique
Outlne and Readng Dynamc Programmng The General Technque ( 5.3.2) -1 Knapsac Problem ( 5.3.3) Matrx Chan-Product ( 5.3.1) Dynamc Programmng verson 1.4 1 Dynamc Programmng verson 1.4 2 Dynamc Programmng
More informationExercises of Chapter 2
Exercses of Chapter Chuang-Cheh Ln Department of Computer Scence and Informaton Engneerng, Natonal Chung Cheng Unversty, Mng-Hsung, Chay 61, Tawan. Exercse.6. Suppose that we ndependently roll two standard
More informationCSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography
CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve
More informationEconomics 101. Lecture 4 - Equilibrium and Efficiency
Economcs 0 Lecture 4 - Equlbrum and Effcency Intro As dscussed n the prevous lecture, we wll now move from an envronment where we looed at consumers mang decsons n solaton to analyzng economes full of
More informationLecture 4: November 17, Part 1 Single Buffer Management
Lecturer: Ad Rosén Algorthms for the anagement of Networs Fall 2003-2004 Lecture 4: November 7, 2003 Scrbe: Guy Grebla Part Sngle Buffer anagement In the prevous lecture we taled about the Combned Input
More information