# 1 Randomized complexity


Complexity of Computation, Lecture 6
ITCS, Tsinghua University, Fall 2007. October 2007
Instructor: Elad Verbin
Notes by: Zhang Zhiqiang and Yu Wei

So far our notion of a realistic computational device was the deterministic polynomial-time Turing machine. However, this model does not account for the possibility of using randomness, a resource that can arguably be found in nature and used in computations. This gives rise to the notion of a randomized Turing machine which, in addition to its other tapes, has access to a random tape initialized with a sequence of random, uncorrelated bits $r_1, r_2, \ldots \in \{0,1\}$.¹

As soon as we allow the algorithm access to a random tape, its behavior is no longer completely determined by the input $x$. Now for every $x$ we have a distribution over outputs, depending on how the random tape was initialized. Often it will be convenient to speak of this randomness explicitly, and we can indeed think of a randomized algorithm $A$ as a deterministic algorithm that takes both a real input $x$ and a random input $r = (r_1, r_2, \ldots)$ denoting the setting of the random tape.

An issue that arises in defining the running time of randomized computations is that the running time itself may be a random variable. It might even be that for some settings of the randomness the computation never terminates. However, much of the time we will only study computations in which the running time is bounded by $t(n)$ for all inputs of length $n$ and all settings of the random tape. In this case we say that the randomized algorithm runs in time $t(n)$, and without loss of generality we can then model the random tape of the algorithm as a string of length $t(n)$.

For decision problems, we will think of a randomized algorithm as solving the problem as long as it behaves correctly on every input for most choices of the randomness.
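As an illustration of this formalism, here is a toy randomized algorithm written explicitly as a deterministic function $A(x, r)$ of the input and the random tape, together with a routine that computes its acceptance probability exactly by enumerating all tapes. The decision problem and the probe strategy are invented for illustration, not taken from the notes.

```python
import itertools

def L(x):
    """Toy decision problem: does the bit-string x contain a '1'?"""
    return 1 if "1" in x else 0

def A(x, r):
    """A toy randomized algorithm, written as a deterministic function of the
    input x and the random tape r: each k-bit chunk of r selects one position
    of x to probe, and A accepts iff some probed bit is '1'."""
    n = len(x)
    k = n.bit_length()  # bits of randomness consumed per probe
    probes = [r[i:i + k] for i in range(0, len(r) - k + 1, k)]
    return 1 if any(x[int(p, 2) % n] == "1" for p in probes) else 0

def acceptance_prob(x, tape_len):
    """Pr[A(x) = 1], computed exactly by enumerating every random tape."""
    tapes = ["".join(t) for t in itertools.product("01", repeat=tape_len)]
    return sum(A(x, t) for t in tapes) / len(tapes)
```

Note that the error of this toy algorithm is one-sided: an input with no '1' is never accepted, in the spirit of the class RP defined below.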
Depending on what we allow the algorithm to do on the remaining, bad choices of randomness, we obtain three different definitions of randomized complexity classes. The first option is to require the algorithm to be correct for most choices of the randomness, but to allow an arbitrary answer otherwise. This yields the class of decision problems BPP (for bounded-error probabilistic polynomial time).

Definition 1. The class BPP consists of all decision problems $L$ for which there exists a randomized polynomial-time algorithm $A$ such that for every $x \in \{0,1\}^*$,
$$\Pr[A(x) = L(x)] \geq 2/3,$$
where the probability is taken over the setting of the random tape of the algorithm $A$.

¹ Whether or not sufficiently many such uncorrelated and random bits can be found in nature is somewhat questionable; these questions are partially addressed in the study of randomized algorithms, specifically (but from different perspectives) in the areas of derandomization and randomness extraction.

As we will see soon, the choice of the constant 2/3 is irrelevant, as long as it is somewhat bigger than 1/2. Notice that if we replace the constant 2/3 with 1, we obtain exactly the class P. So in particular P ⊆ BPP.

Certain randomized algorithms have a one-sided error property: they never accept no instances of the problem, but may fail on some runs on yes instances.

Definition 2. The class RP consists of all decision problems $L$ for which there exists a randomized polynomial-time algorithm $A$ such that for every $x \in \{0,1\}^*$,
$$x \in L \implies \Pr[A(x) = 1] \geq 1/2,$$
$$x \notin L \implies \Pr[A(x) = 0] = 1.$$

Again, the choice of the constant 1/2 is irrelevant, as long as this quantity is bounded away from zero. Similarly, we can have algorithms that are always correct on the yes instances but may err on some fraction of the no instances. This is the class coRP. Observe that $L \in \mathrm{coRP}$ iff $\overline{L} \in \mathrm{RP}$.

So far the algorithms we have looked at are always efficient and mostly correct. We can also consider algorithms that are mostly efficient, but always correct. Say that a randomized algorithm $A$ runs in expected polynomial time if there is a polynomial $p$ such that for all $x$, the expected running time of $A$ on input $x$ is at most $p(|x|)$.

Definition 3. The class ZPP consists of all decision problems $L$ that are decided by algorithms $A$ such that for all $x$, $\Pr[A(x) = L(x)] = 1$ and $A$ runs in expected polynomial time.

Alternatively, ZPP can be defined as the class of decision problems $L$ for which there exists a randomized algorithm $B$ that always runs in polynomial time and on every input $x$ outputs either $L(x)$ or the special symbol $?$, with the property that for every $x$, $\Pr[B(x) = L(x)] \geq 1/2$.

Here is a sketch of the equivalence of the two definitions. To turn algorithm $A$ into algorithm $B$, run $A(x)$ for at most $2p(|x|)$ steps; if $A$ has finished within this time bound, output its answer (which is guaranteed to be correct), otherwise output $?$. By Markov's inequality, $A$ exceeds twice its expected running time with probability at most 1/2.
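The truncation direction of this equivalence can be sketched concretely. The model below is invented for illustration: a zero-error algorithm whose running time is geometric with mean 2 steps stands in for an expected-polynomial-time computation.

```python
import random

def L(x):
    """Toy decision problem: is the integer x odd?"""
    return x % 2

def B(x, rng, bound):
    """Truncation direction: simulate a zero-error algorithm whose running
    time is geometric with mean 2 steps, cutting it off after `bound` steps.
    If it has not finished, output '?'. With bound set to twice the expected
    running time, Markov's inequality gives Pr[output '?'] <= 1/2."""
    for _ in range(bound):
        if rng.random() < 0.5:   # the simulated algorithm halts at this step
            return L(x)          # zero error: the answer is always correct
    return "?"

def A(x, rng, bound):
    """Converse direction: repeat B until it commits to an answer. Each run
    succeeds with probability >= 1/2, so the expected number of runs is <= 2."""
    while True:
        ans = B(x, rng, bound)
        if ans != "?":
            return ans
```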
Conversely, given algorithm $B$, we can obtain $A$ by running $B(x)$ repeatedly until it outputs an answer different from $?$. Each run succeeds with probability at least 1/2, so the expected number of runs is at most 2.

# 2 Amplifying the success probability

We now show that the constants that determine the success probability of randomized algorithms are indeed arbitrary. We focus on BPP algorithms. Suppose we have an algorithm $A$ for a decision problem $L$ such that
$$x \in L \implies \Pr[A(x) = 1] \geq c(|x|),$$
$$x \notin L \implies \Pr[A(x) = 1] \leq s(|x|).$$

If $c(n)$ and $s(n)$ are somewhat bounded away from one another, we can make $c$ very close to 1 and $s$ very close to 0 by running the algorithm independently $m = \mathrm{poly}(n)$ times and observing what fraction of the runs accept. If this number is closer to $m \cdot c(n)$ than to $m \cdot s(n)$ we accept, otherwise we reject.

Let $X_i$ be an indicator random variable for the event that the $i$th run of the algorithm accepts, and let $X = X_1 + \cdots + X_m$ be the number of accepting runs. Let $\mu = E[X]$. If $x \in L$, then $\mu \geq m \cdot c(n)$, while if $x \notin L$, then $\mu \leq m \cdot s(n)$. Recall the Chernoff bound, which tells us that if $m$ is large, then $X$ is very close to its expectation with extremely high probability:

Theorem 4 (Chernoff bound). Let $X_1, X_2, \ldots, X_m$ be independent random variables with $\Pr[X_i = 1] = p$ and $\Pr[X_i = 0] = 1 - p$. Set $X = X_1 + \cdots + X_m$ and $\mu = E[X] = pm$. Then
$$\Pr[|X - pm| \geq \epsilon m] \leq e^{-\epsilon^2 m / 2p(1-p)}.$$

We set $\epsilon = (c(n) - s(n))/2$ to obtain the following consequence:
$$x \in L \implies \Pr[X \leq (c(n) + s(n))m/2] \leq e^{-\delta(n) m},$$
$$x \notin L \implies \Pr[X \geq (c(n) + s(n))m/2] \leq e^{-\delta(n) m},$$
where $\delta(n) = (c(n) - s(n))^2/2$. So as long as $c(n) - s(n) \geq 1/q(n)$ for some polynomial $q$, that is, $c(n)$ and $s(n)$ are not too close, we can set $m = p(n)/\delta(n)$, where $p$ is an arbitrary polynomial. Then the algorithm that runs $m$ independent copies of $A$ and accepts whenever at least $(c(n) + s(n))m/2$ of those copies accept has failure probability at most $2^{-p(n)}$.

# 3 Relations between randomized complexity classes

- P ⊆ ZPP: Any algorithm that runs in polynomial time also runs in expected polynomial time.
- ZPP ⊆ RP ∩ coRP: We show ZPP ⊆ RP; the argument for coRP is symmetric. Consider the definition of ZPP in which an algorithm $B$ returns either the correct answer or $?$. When $B$ outputs 0 or 1, output this answer; when $B$ outputs $?$, output 0.
- RP ∩ coRP ⊆ ZPP: Given an RP algorithm $A$ and a coRP algorithm $B$ for the same problem, on input $x$ run both $A$ and $B$. If $A(x)$ outputs 1, output 1 (an RP algorithm never accepts a no instance); if $B(x)$ outputs 0, output 0 (a coRP algorithm never rejects a yes instance); otherwise, output $?$.
- RP ⊆ BPP: By our discussion in the previous section, the probability 1/2 in the definition of RP can be replaced by 2/3.
- RP ⊆ NP: Think of the randomness $r$ used by the algorithm as an NP witness.
Indeed, if $x$ is a no instance, then $A(x, r)$ rejects for all $r$; if $x$ is a yes instance, then $A(x, r)$ accepts with nonzero probability, so there is some $r$ for which $A(x, r)$ accepts.

We now state some containments of BPP.
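Before moving on, the majority-vote amplification of the previous section can be checked empirically. The base algorithm below is a hypothetical model, a biased coin with completeness $c = 0.6$ and soundness $s = 0.4$; the constants and trial counts are chosen for illustration only.

```python
import random

def base_algorithm(x_in_L, rng):
    """Hypothetical BPP-type base algorithm, modeled as a biased coin:
    it accepts yes instances with probability c = 0.6 and no instances
    with probability s = 0.4."""
    p = 0.6 if x_in_L else 0.4
    return 1 if rng.random() < p else 0

def amplified(x_in_L, m, rng):
    """Run m independent copies and accept iff at least (c + s)m/2 = m/2
    of them accept, as in the amplification argument."""
    accepts = sum(base_algorithm(x_in_L, rng) for _ in range(m))
    return 1 if accepts >= m / 2 else 0

def error_rate(x_in_L, m, trials, rng):
    """Fraction of trials on which the amplified algorithm errs."""
    want = 1 if x_in_L else 0
    return sum(amplified(x_in_L, m, rng) != want for _ in range(trials)) / trials
```

With this gap, $\delta = (c - s)^2/2 = 0.02$, so the Chernoff bound predicts the error of the $m$-fold majority vote decays roughly like $e^{-0.02\,m}$.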

# 4 Adleman's Theorem

Theorem 5 (Adleman). BPP ⊆ P/poly.

Proof. Let $L \in \mathrm{BPP}$. By our discussion on amplifying the success probability, $L$ has a BPP-type algorithm $A$ such that for every $x$,
$$\Pr_r[A(x, r) = L(x)] \geq 1 - 2^{-(|x|+1)}.$$
We fix the input size $|x| = n$ and show that the same string $r_n$ can be good for all inputs of size $n$ simultaneously:
$$\Pr_{r_n}[\forall x \in \{0,1\}^n : A(x, r_n) = L(x)] = 1 - \Pr_{r_n}[\exists x \in \{0,1\}^n : A(x, r_n) \neq L(x)] \geq 1 - \sum_{x \in \{0,1\}^n} \Pr_{r_n}[A(x, r_n) \neq L(x)] \geq 1 - 2^n \cdot 2^{-(n+1)} = 1/2,$$
so there is an $r_n$ of length $\mathrm{poly}(n)$ that behaves correctly on all length-$n$ inputs. Let the circuit $C_n$ simulate $A(x, r_n)$ on input $x$ when $|x| = n$, with $r_n$ hard-wired. Then $C_n$ computes $L$ on inputs of length $n$, so $L \in \mathrm{P/poly}$.

This result indicates that randomness is not too powerful. In particular, it is unlikely that NP ⊆ BPP, for in that case we would have NP ⊆ P/poly, and by the Karp-Lipton-Sipser theorem, $\Sigma_2 = \Pi_2$.

# 5 Randomized computation and the polynomial hierarchy

The result BPP ⊆ P/poly seems to indicate that randomness does not add much to the power of deterministic computation. So it seems reasonable to conjecture that BPP = P. In a few lectures we will provide some evidence for this claim. However, we cannot even prove that BPP ⊆ NP. The best statement of this kind we can prove is the following:

Theorem 6 (Gacs and Sipser). BPP ⊆ $\Sigma_2$.

We show a proof of this statement due to Lautemann.

Proof. Let $L \in \mathrm{BPP}$ and let $A(x, r)$ be a BPP-type algorithm for $L$ with error probability $1/3m$ that uses $m = m(n)$ bits of randomness on inputs of length $n$. To show that $L \in \Sigma_2$, we establish the following characterization of $L$:
$$x \in L \iff \exists y_1, \ldots, y_m \in \{0,1\}^m\ \forall z \in \{0,1\}^m : \bigvee_{i=1}^m A(x, y_i \oplus z) = 1.$$

This statement immediately yields a $\Sigma_2$-type algorithm for $L$.

First, suppose $x \in L$, and choose $y_1, y_2, \ldots, y_m$ independently and uniformly at random. Taking a union bound over the $2^m$ choices of $z$,
$$\Pr_{y_1, \ldots, y_m}[\exists z : A(x, y_i \oplus z) = 0 \text{ for all } i] \leq 2^m \cdot \Pr_{y_1, \ldots, y_m}[A(x, y_i \oplus z) = 0 \text{ for all } i] \leq 2^m (1/3m)^m < 1,$$
where the middle probability is for any fixed $z$; it factors because the strings $y_i \oplus z$ are independent and uniform. So there must exist $y_1, \ldots, y_m$ such that for every $z$, $A(x, y_i \oplus z) = 1$ for some $i$.

Now, suppose $x \notin L$. We need to prove that
$$\forall y_1, \ldots, y_m \in \{0,1\}^m\ \exists z \in \{0,1\}^m : A(x, y_i \oplus z) = 0 \text{ for all } i.$$
Fix an arbitrary choice of $y_1, \ldots, y_m$. Then for a random $z \in \{0,1\}^m$,
$$\Pr_z[A(x, y_i \oplus z) = 1 \text{ for some } i] \leq \sum_{i=1}^m \Pr_z[A(x, y_i \oplus z) = 1] \leq m \cdot (1/3m) = 1/3 < 1,$$
so we can always choose $z$ such that $A(x, y_i \oplus z) = 0$ for all $i$. $\square$
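At toy scale, the counting argument behind Adleman's theorem can be verified by brute force. The algorithm and parameters below are invented for illustration: a one-sided-error probe algorithm on 2-bit inputs whose per-input error is below $2^{-n}$, so that, as in the proof, more than half of all random tapes are simultaneously correct on every length-$n$ input.

```python
import itertools

def L(x):
    """Toy decision problem on 2-bit inputs: does x contain a '1'?"""
    return 1 if "1" in x else 0

def A(x, r):
    """Toy randomized algorithm: each bit of the random tape r names a
    position of x to probe; accept iff some probed bit is '1'. On a yes
    instance with a single '1', the error is (1/2)^len(r) = 1/8 < 2^{-n}."""
    return 1 if any(x[int(b)] == "1" for b in r) else 0

n, tape_len = 2, 3
inputs = ["".join(p) for p in itertools.product("01", repeat=n)]
tapes = ["".join(p) for p in itertools.product("01", repeat=tape_len)]

# A tape r_n is "good" if A(x, r_n) = L(x) for every input x of length n;
# any such r_n can be hard-wired into a circuit C_n, as in the proof above.
good = [r for r in tapes if all(A(x, r) == L(x) for x in inputs)]
```

Here exactly 6 of the 8 tapes are good (all but "000" and "111"), so a good $r_n$ exists with room to spare.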
