Notes on Complexity Theory Last updated: October, 2005 Jonathan Katz Handout 7

1 More on Randomized Complexity Classes

Reminder: so far we have seen RP, coRP, and BPP. We introduce two more time-bounded randomized complexity classes: ZPP and PP.

1.1 ZPP

ZPP may be defined in various ways; we will pick one arbitrarily and state the others (and their equivalence) as claims.

Definition 1 Class ZPP consists of languages L for which there exists a ppt machine M which is allowed to output 1, 0, or "?" and such that, for all x:

Pr[M(x) = ?] ≤ 1/2 and Pr[M(x) = χ_L(x) | M(x) ≠ ?] = 1.

That is, M always outputs either the correct answer or a special symbol "?" denoting "don't know", and it outputs "don't know" with probability at most one half.

Claim 1 ZPP = RP ∩ coRP, and hence ZPP ⊆ BPP.

Claim 2 ZPP consists of languages L for which there exists a machine M running in expected polynomial time and for which: Pr[M(x) = χ_L(x)] = 1. (A machine runs in expected polynomial time if there exists a polynomial p such that for all x, the expected running time of M(x) is at most p(|x|).)

Summarizing what we have so far: P ⊆ ZPP ⊆ RP ⊆ BPP. However (somewhat surprisingly), it is currently believed that P = BPP.

1.2 PP

RP, coRP, BPP, and ZPP represent possibly-useful relaxations of deterministic polynomial-time computation. The next class does not, as we will see.

Definition 2 L ∈ PP if there exists a ppt machine M such that: Pr[M(x) = χ_L(x)] > 1/2.
Note that requiring only Pr[M(x) = χ_L(x)] ≥ 1/2 makes the definition trivial, as it can be achieved by flipping a random coin. Note also that error reduction does not apply to PP, since to increase the gap between acceptance and rejection to an inverse polynomial we might have to run our original algorithm an exponential number of times (and so we would no longer have a ppt algorithm).

Claim 3 BPP ⊆ PP ⊆ PSPACE.

The first inclusion follows immediately from the definitions. The second inclusion follows since, given a ppt Turing machine M, we may enumerate all coins (and take a majority vote) in PSPACE. The following shows that PP is too lax a definition:

Claim 4 NP ⊆ PP.

Proof Let L ∈ NP, and assume that inputs x ∈ L have witnesses of size exactly p(|x|) (by padding, this is w.l.o.g.). Consider the following PP algorithm for L on input x: with probability 1/2 − 2^{−p(|x|)}/4, accept. Otherwise, choose a string w ∈ {0,1}^{p(|x|)} at random and accept iff w is a witness for x. Clearly, if x ∉ L then the probability that this algorithm accepts is less than 1/2. On the other hand, if x ∈ L then there is at least one witness, and so the probability of acceptance is at least:

1/2 − 2^{−p(|x|)}/4 + 2^{−p(|x|)}/2 > 1/2.

2 BPP in Relation to Deterministic Complexity Classes

BPP is currently considered to be the right notion of what is "efficiently computable." It is therefore of interest to see exactly how powerful this class is.

2.1 BPP ⊆ P/poly

We claimed in an earlier lecture that P/poly provides an upper bound on efficient computation. We show that this is true with regard to BPP.

Theorem 5 BPP ⊆ P/poly.

Proof Let L ∈ BPP. We know that there exists a probabilistic polynomial-time Turing machine M and a polynomial p such that M uses a random tape of length p(|x|) and Pr_r[M(x;r) ≠ χ_L(x)] < 2^{−2|x|^2}. An equivalent way of looking at this is that for any n and each x ∈ {0,1}^n there is a set of "bad" coins for x (for which M(x) returns the wrong answer), but the size of this bad set is smaller than 2^{p(n)} · 2^{−2n^2}.
Taking the union of these bad sets over all x ∈ {0,1}^n, we find that the total number of random coins which are bad for some x ∈ {0,1}^n is at most 2^n · 2^{p(n)} · 2^{−2n^2} < 2^{p(n)}. In particular, for each n there exists at least one random tape r_n ∈ {0,1}^{p(n)} which is good for every x ∈ {0,1}^n (in fact, there are many such random tapes). If we let the sequence of advice strings be exactly these {r_n}_{n ∈ N}, we obtain the result of the theorem.
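At toy scale the counting in this proof can be carried out exhaustively. The following sketch (illustrative Python; the low-error "machine" M and its one-bad-tape-per-input error pattern are invented for the example) enumerates all random tapes and confirms that tapes good for every input simultaneously exist:

```python
from itertools import product

n, p = 3, 8  # toy input length and random-tape length

def M(x, r):
    """Toy randomized 'machine': decides the parity of x, but answers
    wrongly whenever the random tape r (read as an integer) equals a
    value determined by x.  Each input thus has exactly one bad tape,
    so its error probability is 2**-p, far below 2**-n."""
    correct = sum(x) % 2
    bad_tape = int(''.join(map(str, x)), 2)   # the single bad tape for x
    if int(''.join(map(str, r)), 2) == bad_tape:
        return 1 - correct                     # wrong answer on the bad tape
    return correct

inputs = list(product([0, 1], repeat=n))
tapes = list(product([0, 1], repeat=p))

# Union bound: at most 2^n tapes (one per input, here 8 of 256) are bad
# for some input, so a universally good advice tape must exist.
good = [r for r in tapes
        if all(M(x, r) == sum(x) % 2 for x in inputs)]
print(len(good))  # -> 248: every tape outside the union of bad sets
```

Any element of `good` can serve as the advice string r_n for this toy n.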
2.2 BPP is in the Polynomial Hierarchy

We noted earlier that it is not known whether BPP ⊆ NP. However, we can place BPP in the polynomial hierarchy. Here, we give two different proofs that BPP ⊆ Σ_2 ∩ Π_2.

2.2.1 Lautemann's Proof

We first prove some easy propositions. For S ⊆ {0,1}^m, say S is large if |S| ≥ (1 − 1/m) · 2^m. Say S is small if |S| < 2^m/m. Finally, for a string z ∈ {0,1}^m define S ⊕ z := {s ⊕ z | s ∈ S}. We first prove:

Proposition 6 Let S ⊆ {0,1}^m be small. Then for all z_1, ..., z_m ∈ {0,1}^m we have ∪_{i=1}^m (S ⊕ z_i) ≠ {0,1}^m.

This follows easily by counting. On the one hand, |{0,1}^m| = 2^m. On the other hand, for any z_1, ..., z_m we have

|∪_{i=1}^m (S ⊕ z_i)| ≤ Σ_{i=1}^m |S ⊕ z_i| = m · |S| < 2^m.

Furthermore:

Proposition 7 If S is large, then:

Pr_{z_1,...,z_m ∈ {0,1}^m}[∪_{i=1}^m (S ⊕ z_i) = {0,1}^m] > 1 − (2/m)^m.

To see this, consider first the probability that some fixed y is not in ∪_i (S ⊕ z_i). This is given by:

Pr_{z_1,...,z_m ∈ {0,1}^m}[y ∉ ∪_i (S ⊕ z_i)] = Π_{i=1}^m Pr_{z_i ∈ {0,1}^m}[y ∉ S ⊕ z_i] ≤ (1/m)^m.

Applying a union bound over all y ∈ {0,1}^m, we see that the probability that there exists a y ∈ {0,1}^m which is not in ∪_i (S ⊕ z_i) is at most 2^m/m^m = (2/m)^m. The proposition follows.

Given L ∈ BPP, there exist a polynomial m and an algorithm M such that M uses m(|x|) random coins and errs with probability less than 1/m(|x|). For any x ∈ {0,1}^n, let S_x ⊆ {0,1}^{m(|x|)} denote the set of random coins for which M(x;r) outputs 1. Thus, if x ∈ L (and letting m = m(|x|)) we have |S_x| ≥ (1 − 1/m) · 2^m, while if x ∉ L then |S_x| < 2^m/m. This leads to the following Σ_2 characterization of L:

x ∈ L ⇔ ∃z_1,...,z_m ∈ {0,1}^m ∀y ∈ {0,1}^m : y ∈ ∪_{i=1}^m (S_x ⊕ z_i).
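For small m both propositions can be checked by brute force. A sketch (illustrative Python; the particular sets S are arbitrary examples, identifying {0,1}^4 with the integers 0..15):

```python
from itertools import product
import random

m = 4
universe = range(2 ** m)  # identify {0,1}^4 with {0, ..., 15}

def covers(S, shifts):
    """Does the union of the shifted copies S xor z_i equal {0,1}^m?"""
    return set(s ^ z for s in S for z in shifts) == set(universe)

# Proposition 6: a small set (|S| < 2^m/m = 4) never covers, for ANY
# choice of m shifts -- check all 16^4 shift tuples exhaustively.
S_small = [0, 3, 5]
assert not any(covers(S_small, zs)
               for zs in product(universe, repeat=m))

# Proposition 7: a large set (|S| >= (1 - 1/m) 2^m = 12) covers for
# most random shift tuples (with probability > 1 - (2/m)^m = 15/16).
random.seed(1)
S_large = list(range(12))
trials = [covers(S_large, [random.randrange(2 ** m) for _ in range(m)])
          for _ in range(1000)]
print(sum(trials) / len(trials))  # fraction of shift tuples that cover
```

The printed fraction should comfortably exceed the 15/16 guaranteed by Proposition 7.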
Note that the desired condition can be efficiently verified by checking whether ∨_{i=1}^m M(x; y ⊕ z_i) = 1. We conclude that BPP ⊆ Σ_2. Using the fact that BPP is closed under complement gives the claimed result.

2.2.2 The Sipser-Gács Proof

The proof in the previous section is a bit easier than the one we present here, but the technique here is quite useful. First, some notation. Let h : {0,1}^m → R be a function, let S ⊆ {0,1}^m, and let s ∈ S. We say that h isolates s if h(s') ≠ h(s) for all s' ∈ S \ {s}. We say a collection of functions H := {h_1, ..., h_l} isolates s if there exists an h_i ∈ H which isolates s. Finally, we say H isolates S if H isolates every element of S.

Proposition 8 Let H be a family of pairwise-independent hash functions mapping {0,1}^m to some set R. Let S ⊆ {0,1}^m. Then:

If |S| > m · |R| then: Pr_{h_1,...,h_m ∈ H}[{h_1,...,h_m} isolates S] = 0.

If |S|^{m+1} ≤ |R|^m then: Pr_{h_1,...,h_m ∈ H}[{h_1,...,h_m} isolates S] > 0.

Proof For the first part of the proposition, note that each hash function h_i can isolate at most |R| elements. So m hash functions can isolate at most m · |R| elements. Since |S| > m · |R|, the claim follows. For the second part, assume S is non-empty (if S is empty then the theorem is trivially true). Consider the probability that a particular s ∈ S is not isolated:

Pr_{h_1,...,h_m ∈ H}[{h_1,...,h_m} does not isolate s] = Π_{i=1}^m Pr_{h_i ∈ H}[h_i does not isolate s] ≤ (Σ_{s' ∈ S \ {s}} Pr_h[h(s') = h(s)])^m < (|S|/|R|)^m.

Summing over all s ∈ S, we see that the probability that S is not isolated is less than |S|^{m+1}/|R|^m ≤ 1. This gives the stated result.

Let L ∈ BPP. Then there exists a machine M using m(|x|) random coins which errs with probability less than 1/4m(|x|). Let x ∈ {0,1}^n, set m = m(|x|), and define S_x as in the previous section. Set |R| = 2^m/2m. Note that if x ∈ L then |S_x| ≥ 3 · 2^m/4 > m · |R|.
On the other hand, if x ∉ L then |S_x| < 2^m/4m and so |S_x|^{m+1}/|R|^m < 1/4. The above proposition thus leads to the following Π_2 characterization of L:

x ∈ L ⇔ ∀h_1,...,h_m ∈ H ∃s, s_1,...,s_m : ∧_{i=1}^m ((s ≠ s_i) ∧ (h_i(s) = h_i(s_i)) ∧ (s, s_i ∈ S_x))

(note that membership in S_x can be verified in polynomial time, as in the previous section). Closure of BPP under complement gives the stated result.

3 Randomized Space Classes

An important note regarding randomized space classes is that we do not allow the machine to store its previous random coin flips for free (it can, if it chooses, write its random choice(s) on its work tape). If we consider the model in which a randomized Turing machine is simply a Turing machine with access to a random tape, this implies that we allow only unidirectional access to the random tape. There is an additional subtlety regarding the definition of randomized space classes: we need to also bound the running time. In particular, we will define rspace(s(n)) as follows:

Definition 3 A language L is in rspace(s(n)) if there exists a randomized Turing machine M using s(n) space and 2^{O(s(n))} time and such that:

x ∈ L ⟹ Pr[M(x) = 1] ≥ 1/2 and x ∉ L ⟹ Pr[M(x) = 0] = 1.

Without the time restriction, we get a class which is too powerful:

Proposition 9 Define rspace' as above, but without the time restriction. Then for any space-constructible s(n) ≥ log n we have rspace'(s(n)) = nspace(s(n)).

Proof (Sketch) Showing that rspace'(s(n)) ⊆ nspace(s(n)) is easy. We turn to the other direction. The basic idea is that, given a language L ∈ nspace(s(n)), we construct a machine which on input x guesses valid witnesses for x (where a witness here is an accepting computation of the non-deterministic machine on input x). Since there may only be a single witness, we guess a doubly-exponential number of times. This is where the absence of a time bound makes a difference. In more detail, given L as above we know that any x ∈ L has a witness (i.e., an accepting computation) of length at most l(n) = 2^{O(s(n))}.
Assuming such a witness exists, we can guess it with probability at least 2^{−l(n)} (and verify whether we have a witness or not using space O(s(n))). Equivalently, the expected number of tries until we guess the witness is at most 2^{l(n)}. The intuition is that if we make 2^{l(n)} guesses, we have a good chance of guessing a correct witness. Looking at it in that way, the problem boils down to implementing a counter that can count up to 2^{l(n)}. The naive idea of using a standard l(n)-bit counter will not work, since l(n) is exponential in s(n)! Instead, we use a randomized counter: each time after guessing a witness, flip l(n) coins and if they are all 0, stop; otherwise, continue. This can be done using a counter of size log l(n) = O(s(n)).
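This guess-then-maybe-stop loop is easy to simulate at toy scale. In the sketch below (illustrative Python; the planted witness stands in for an accepting computation, and the guess is simply stored rather than verified bit by bit in small space), each round halts with probability 2^{-l}, so the expected number of rounds is 2^{l} while only the l coin flips ever need counting:

```python
import random

def randomized_counter_search(l, witness, rng):
    """One run of the guessing loop from the proof sketch: repeatedly
    guess an l-bit string; accept if it equals the planted witness,
    and otherwise flip l coins, giving up if they all come up 0.
    Counting the l coin flips takes only a log(l)-bit counter."""
    while True:
        guess = rng.getrandbits(l)
        if guess == witness:
            return True                   # accept: witness found
        if rng.getrandbits(l) == 0:       # all l coins came up 0
            return False                  # stop without accepting

rng = random.Random(0)
l = 5
hits = sum(randomized_counter_search(l, witness=7, rng=rng)
           for _ in range(2000))
print(hits / 2000)  # close to 1/2: the exact success probability is 32/63
```

Per round, success and give-up are almost equally likely (probabilities 2^{-l} and (1 − 2^{-l}) · 2^{-l}), which is why a single run already succeeds with probability about one half when a witness exists; the machine never accepts when none does.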
Note that the Turing machine thus defined may run for infinite time; however, the probability that it does so is 0. In any case, it never uses more than O(s(n)) space, as required. Furthermore (using the fact that infinite runs occur with probability 0) the machine satisfies:

x ∈ L ⟹ Pr[M(x) = 1] ≥ 1/2 and x ∉ L ⟹ Pr[M(x) = 0] = 1.

3.1 RL

Define RL = rspace(log n). We show that undirected graph connectivity is in RL (here, we are given an undirected graph and vertices s, t and are asked to decide whether there is a path from s to t). This follows easily from the following result:

Theorem 10 Let G be an n-vertex undirected graph, and s an arbitrary vertex in G. A random walk of length 4n^3 beginning at s visits all vertices in the connected component of s with probability at least 1/2.

Proof In the next lecture we will discuss Markov chains and random walks on undirected graphs, and will show that if G is non-bipartite then for any edge (u,v) in the connected component containing s, the expected time to move from vertex u to vertex v is at most 2|E| + n ≤ n^2. (Note that if G is bipartite, we can make it non-bipartite by mentally adding n self-loops; the expected time to move from u to v is then at most 2|E| + n as claimed. But taking a random walk in the actual graph (without the self-loops) can only result in a lower expected time to move from u to v.) Consider any spanning tree of the connected component containing s; this will contain at most n − 1 edges. Considering any traversal of this spanning tree (which traverses fewer than 2n edges), we see that the expected time to reach every vertex in the connected component is at most 2n · n^2 = 2n^3. Taking a random walk for twice this many steps means we will reach the entire component with probability at least half (by Markov's inequality).

We remark that the analogous result does not hold for directed graphs. (If it did, we would have RL = NL, which is considered unlikely. To be clear: it is possible that some other algorithm can solve directed connectivity in RL, but the above algorithm does not.)
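The resulting RL algorithm for undirected connectivity is simply: walk 4n^3 steps from s and accept iff t is visited. A sketch (illustrative Python; the example graph is arbitrary, and independent repetitions are added to drive down the one-sided error):

```python
import random

def st_connected_by_walk(adj, s, t, rng, repeats=20):
    """Randomized test for an s-t path in an undirected graph: take a
    random walk of length 4 n^3 from s and accept iff t is visited.
    By Theorem 10 a single walk errs with probability at most 1/2, and
    the error is one-sided (it never accepts when no path exists), so
    `repeats` independent walks push the error below 2**-repeats."""
    n = len(adj)
    for _ in range(repeats):
        v = s
        for _ in range(4 * n ** 3):
            if v == t:
                return True
            v = rng.choice(adj[v])        # move to a uniform neighbor
        if v == t:
            return True
    return False

# Path graph 0-1-2-3-4-5, plus a separate component {6, 7}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4],
       6: [7], 7: [6]}
rng = random.Random(0)
print(st_connected_by_walk(adj, 0, 5, rng))  # True (same component)
print(st_connected_by_walk(adj, 0, 6, rng))  # False, with certainty
```

Note that only the current vertex and a step counter are stored, which is O(log n) space, matching the RL bound.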
Next time, we will see another application of random walks to solving 2-SAT.

Bibliographic Notes

Sections 1 and 3 are adapted from [2, Lecture 7], and Section 2.2 from [1, Lectures 19-20].

References

[1] J.-Y. Cai. Scribe notes for CS 810: Introduction to Complexity Theory.

[2] O. Goldreich. Introduction to Complexity Theory (July 31, 1999).
More informationFeature Extraction Techniques
Feature Extraction Techniques Unsupervised Learning II Feature Extraction Unsupervised ethods can also be used to find features which can be useful for categorization. There are unsupervised ethods that
More informationCSE200: Computability and complexity Space Complexity
CSE200: Computability and complexity Space Complexity Shachar Lovett January 29, 2018 1 Space complexity We would like to discuss languages that may be determined in sub-linear space. Lets first recall
More informationImproved Guarantees for Agnostic Learning of Disjunctions
Iproved Guarantees for Agnostic Learning of Disjunctions Pranjal Awasthi Carnegie Mellon University pawasthi@cs.cu.edu Avri Blu Carnegie Mellon University avri@cs.cu.edu Or Sheffet Carnegie Mellon University
More informationReducibility and Completeness
Reducibility and Copleteness Chapter 28 of the forthcoing CRC Handbook on Algoriths and Theory of Coputation Eric Allender 1 Rutgers University Michael C. Loui 2 University of Illinois at Urbana-Chapaign
More informationOn Process Complexity
On Process Coplexity Ada R. Day School of Matheatics, Statistics and Coputer Science, Victoria University of Wellington, PO Box 600, Wellington 6140, New Zealand, Eail: ada.day@cs.vuw.ac.nz Abstract Process
More informationIn this chapter, we consider several graph-theoretic and probabilistic models
THREE ONE GRAPH-THEORETIC AND STATISTICAL MODELS 3.1 INTRODUCTION In this chapter, we consider several graph-theoretic and probabilistic odels for a social network, which we do under different assuptions
More information3.8 Three Types of Convergence
3.8 Three Types of Convergence 3.8 Three Types of Convergence 93 Suppose that we are given a sequence functions {f k } k N on a set X and another function f on X. What does it ean for f k to converge to
More informationComputational Complexity Theory
Computational Complexity Theory Marcus Hutter Canberra, ACT, 0200, Australia http://www.hutter1.net/ Assumed Background Preliminaries Turing Machine (TM) Deterministic Turing Machine (DTM) NonDeterministic
More informationComplexity Theory VU , SS The Polynomial Hierarchy. Reinhard Pichler
Complexity Theory Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien 15 May, 2018 Reinhard
More informationMTAT Complexity Theory October 13th-14th, Lecture 6
MTAT.07.004 Complexity Theory October 13th-14th, 2011 Lecturer: Peeter Laud Lecture 6 Scribe(s): Riivo Talviste 1 Logarithmic memory Turing machines working in logarithmic space become interesting when
More informationA An Overview of Complexity Theory for the Algorithm Designer
A An Overview of Complexity Theory for the Algorithm Designer A.1 Certificates and the class NP A decision problem is one whose answer is either yes or no. Two examples are: SAT: Given a Boolean formula
More informationOutline. Complexity Theory EXACT TSP. The Class DP. Definition. Problem EXACT TSP. Complexity of EXACT TSP. Proposition VU 181.
Complexity Theory Complexity Theory Outline Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität
More informationTesting Properties of Collections of Distributions
Testing Properties of Collections of Distributions Reut Levi Dana Ron Ronitt Rubinfeld April 9, 0 Abstract We propose a fraework for studying property testing of collections of distributions, where the
More informationTheoretical Computer Science
Theoretical Coputer Science 411 (2010) 22 43 Contents lists available at ScienceDirect Theoretical Coputer Science journal hoepage: www.elsevier.co/locate/tcs Theory of one-tape linear-tie Turing achines
More informationN-Point. DFTs of Two Length-N Real Sequences
Coputation of the DFT of In ost practical applications, sequences of interest are real In such cases, the syetry properties of the DFT given in Table 5. can be exploited to ake the DFT coputations ore
More informationPrerequisites. We recall: Theorem 2 A subset of a countably innite set is countable.
Prerequisites 1 Set Theory We recall the basic facts about countable and uncountable sets, union and intersection of sets and iages and preiages of functions. 1.1 Countable and uncountable sets We can
More informationA note on the multiplication of sparse matrices
Cent. Eur. J. Cop. Sci. 41) 2014 1-11 DOI: 10.2478/s13537-014-0201-x Central European Journal of Coputer Science A note on the ultiplication of sparse atrices Research Article Keivan Borna 12, Sohrab Aboozarkhani
More information6.841/18.405J: Advanced Complexity Wednesday, February 12, Lecture Lecture 3
6.841/18.405J: Advanced Complexity Wednesday, February 12, 2003 Lecture Lecture 3 Instructor: Madhu Sudan Scribe: Bobby Kleinberg 1 The language MinDNF At the end of the last lecture, we introduced the
More informationlecture 36: Linear Multistep Mehods: Zero Stability
95 lecture 36: Linear Multistep Mehods: Zero Stability 5.6 Linear ultistep ethods: zero stability Does consistency iply convergence for linear ultistep ethods? This is always the case for one-step ethods,
More informationModel Fitting. CURM Background Material, Fall 2014 Dr. Doreen De Leon
Model Fitting CURM Background Material, Fall 014 Dr. Doreen De Leon 1 Introduction Given a set of data points, we often want to fit a selected odel or type to the data (e.g., we suspect an exponential
More information3.3 Variational Characterization of Singular Values
3.3. Variational Characterization of Singular Values 61 3.3 Variational Characterization of Singular Values Since the singular values are square roots of the eigenvalues of the Heritian atrices A A and
More information