Asymptotic notation: big-O and small-o


Time complexity

Here we consider elements of computational complexity theory: an investigation of the time (or other resources) required for solving computational problems. We introduce a way of measuring the time used to solve a problem. Then we classify problems according to the amount of time required. We will see that certain decidable problems require enormous amounts of time, and how to determine when you are faced with such a problem.

Consider as an example a TM M1 which decides the language A = { 0^k 1^k : k ≥ 0 }.

M1 = On input w:
1. Scan across the tape and reject if a 0 is found to the right of a 1.
2. Repeat the following if both 0s and 1s remain on the tape:
3. Scan across the tape, crossing off a single 0 and a single 1.
4. If 0s still remain after all the 1s have been crossed off, or if 1s still remain after all the 0s have been crossed off, reject. Otherwise, if neither 0s nor 1s remain on the tape, accept.

How much time does a single-tape TM need to decide A? We count the number of steps that the algorithm uses on a particular input, as a function of the length of the string representing the input. We consider worst-case analysis, i.e., the longest running time over all inputs of a particular length.

Asymptotic notation: big-O and small-o

Def. 1: Let M be a TM that halts on all inputs. The running time or time complexity of M is the function f: N → N, where f(n) is the maximum number of steps that M uses on any input of length n. We say M runs in time f(n) and M is an f(n) time Turing machine.

Def. 2: Let f and g be two functions f, g: N → R+. Say that f(n) = O(g(n)) if positive integers c and n' exist so that for every n ≥ n', f(n) ≤ c·g(n). We say that g(n) is an upper bound (or asymptotic upper bound) for f(n). Intuitively, this means that f is less than or equal to g for sufficiently large n if we disregard differences up to a constant factor. O represents that constant; the constant is hidden under the O.

Examples: If f(n) = 5n^3 + 2n^2 + 22n + 6, then f(n) = O(n^3) or f(n) = O(n^4), but f(n) ≠ O(n^2). If f(n) = 3n log₂ n + 5n log₂ log₂ n + 2, then f(n) = O(n log n). If f(n) = O(n^2), then also f(n) = O(n^3). Other examples of run-time bounds: O(1), O(log n), n^O(1) (= 2^O(log n)). Bounds of the form n^c for c > 0 are called polynomial bounds. Bounds of the form 2^(n^δ) for δ > 0 are called exponential bounds.

Def. 3: Let f and g be two functions f, g: N → R+. Say that f(n) = o(g(n)) if for any real c > 0, a number n' exists so that for every n ≥ n', f(n) < c·g(n); equivalently, lim_{n→∞} f(n)/g(n) = 0.

Examples: √n = o(n), n = o(n log log n), n log log n = o(n log n), n log n = o(n^2), n^2 = o(n^3), but f(n) ≠ o(f(n)).
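These definitions are easy to check numerically. The short Python snippet below is a sanity check of our own (not part of the slides); it evaluates the ratios from the examples above: for the big-O example, f(n)/n^3 settles near a constant, while for the small-o example, (n log₂ n)/n^2 tends to 0.

```python
import math

def f(n):
    # the polynomial from the big-O example: f(n) = 5n^3 + 2n^2 + 22n + 6
    return 5 * n**3 + 2 * n**2 + 22 * n + 6

for n in (10, 100, 1000, 10000):
    big_o_ratio = f(n) / n**3                  # approaches 5, so c = 6 and n' = 10 witness f(n) = O(n^3)
    small_o_ratio = (n * math.log2(n)) / n**2  # approaches 0, so n log n = o(n^2)
    print(n, round(big_o_ratio, 4), round(small_o_ratio, 4))
```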

Analyzing algorithms

Let's analyze the algorithm we gave for the language A = { 0^k 1^k : k ≥ 0 }.

M1 = On input w:
1. Scan across the tape and reject if a 0 is found to the right of a 1.
2. Repeat the following if both 0s and 1s remain on the tape:
3. Scan across the tape, crossing off a single 0 and a single 1.
4. If 0s still remain after all the 1s have been crossed off, or if 1s still remain after all the 0s have been crossed off, reject. Otherwise, if neither 0s nor 1s remain on the tape, accept.

Stage 1: verifies that the input is of the form 0*1*, in n steps. Hence O(n) steps.
Stages 2, 3: each scan uses O(n) steps, and there are at most n/2 scans. Hence O(n^2) steps.
Stage 4: at most O(n) steps.
Hence, the total time of M1 on an input of length n is O(n^2).

Def. 4: Let t: N → N be a function. Define the time complexity class TIME(t(n)) to be
TIME(t(n)) = { L : L is a language decided by an O(t(n)) time Turing machine }.

We have A ∈ TIME(n^2). Is there a machine that decides A asymptotically more quickly?

A = { 0^k 1^k : k ≥ 0 } ∈ TIME(n log n)

M2 = On input w:
1. Scan across the tape and reject if a 0 is found to the right of a 1.
2. Repeat the following if some 0s and some 1s remain on the tape:
3. Scan across the tape, checking whether the total number of 0s and 1s remaining is even or odd. If odd, reject.
4. Scan again across the tape, crossing off every other 0 starting with the first 0, and then crossing off every other 1 starting with the first 1.
5. If no 0s and no 1s remain on the tape, accept. Otherwise, reject.

Why does M2 decide A? On every scan performed in stage 4, the total number of 0s (and of 1s) remaining is cut in half and any remainder is discarded. In stage 3 we check whether the parities of the number of 0s and the number of 1s are the same (their sum is even exactly when they are). The sequence of parities recorded over the scans gives the binary representation, from the least significant bit, of the original number of 0s and of the original number of 1s, so every check succeeds and both counts reach 0 together exactly when the two counts are equal.

Running time: Every stage takes O(n) steps. Stages 1 and 5 are executed once. Stages 2, 3, 4 are executed at most (1 + log₂ n) times. Hence, the total time of M2 on an input of length n is O(n + (1 + log₂ n)·n) = O(n log n). So A ∈ TIME(n log n). This result cannot be further improved on a single-tape TM.
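As a rough illustration of the two bounds, here is a small Python sketch of our own (it is not a formal TM; it simply charges n steps for each full scan of the tape) that counts the steps of M1 and M2 on inputs 0^k 1^k. The counts grow roughly like n^2 and n log n respectively.

```python
def steps_m1(k):
    """M1 on 0^k 1^k: each scan crosses off one 0 and one 1."""
    n = 2 * k                 # input length
    steps = n                 # stage 1: one scan to check the form 0*1*
    for _ in range(k):        # stages 2-3: one scan per crossed-off pair
        steps += n
    return steps + n          # stage 4: one final scan

def steps_m2(k):
    """M2 on 0^k 1^k: parity check, then cross off every other symbol."""
    n = 2 * k
    steps = n                 # stage 1
    zeros = ones = k
    while zeros > 0 and ones > 0:
        steps += n            # stage 3: parity scan
        steps += n            # stage 4: cross off every other 0 and every other 1
        zeros //= 2
        ones //= 2
    return steps + n          # stage 5

for k in (8, 64, 512):
    print(k, steps_m1(k), steps_m2(k))   # M1 grows quadratically, M2 like n log n
```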

Linear time two-tape Turing machine for A

M3 = On input w:
1. Scan across tape 1 and reject if a 0 is found to the right of a 1.
2. Scan across the 0s on tape 1 until the first 1. At the same time, copy the 0s onto tape 2.
3. Scan across the 1s on tape 1 until the end of the input. For each 1 read on tape 1, cross off a 0 on tape 2. If all 0s are crossed off before all the 1s are read, reject.
4. If all the 0s have now been crossed off, accept. If any 0s remain, reject.

Clearly, this is a decider for A. The running time is clearly O(n).

Summary: We presented a single-tape TM M2 that decides A in O(n log n) time. We mentioned (without proof) that no single-tape TM can do it more quickly. Then we presented a two-tape TM M3 that decides A in linear time. Hence, the complexity of A depends on the model of computation selected. This shows an important difference between complexity theory and computability theory. In computability theory, the Church-Turing thesis implies that all reasonable models of computation are equivalent, i.e., they decide the same class of languages. In complexity theory, the choice of model affects the time complexity of languages.

Complexity relations among models: multi-tape TM

A k-tape TM has the transition function δ: Q × Γ^k → Q × Γ^k × {L, R}^k, with moves of the form δ(q, a_1, ..., a_k) = (r, b_1, ..., b_k, L, R, ..., L).

(Figure: a single control operating k tapes, each with its own head.)

Theorem 1: Let t(n) be a function with t(n) ≥ n. Then every t(n) time multi-tape TM has an equivalent O(t^2(n)) time single-tape TM.

We have seen before how to convert a multi-tape TM M into an equivalent single-tape TM S that simulates it. Let M be a k-tape TM that runs in time t(n). We will show that simulating each step of the multi-tape TM uses at most O(t(n)) steps of the single-tape TM. Hence the total time used is O(t^2(n)).

S simulates the effect of k tapes by storing their information on its single tape. It uses a new symbol # as a delimiter to separate the contents of the different tapes. S must also keep track of the locations of the heads. It does so by writing a tape symbol with a dot above it to mark the place where the head on that tape would be.

(Figure: the k tapes of M and the corresponding single tape of S, with # delimiters and dotted symbols marking the virtual heads.)
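To make the encoding concrete, here is a minimal Python sketch of our own of the single-tape format S uses. Since plain text cannot place a dot above a symbol, the sketch writes the dot after the marked symbol and uses '_' for the blank; the name encode_tapes and this exact string layout are our own choices, not part of the slides.

```python
def encode_tapes(tapes, heads):
    """Encode the contents of k virtual tapes and their head positions on one tape.

    tapes: list of k strings (one per tape); heads: list of k head positions.
    """
    pieces = []
    for content, head in zip(tapes, heads):
        cells = list(content) if content else ["_"]   # an empty tape shows a single blank cell
        cells[head] = cells[head] + "."               # the dot marks the virtual head
        pieces.append("".join(cells))
    return "#" + "#".join(pieces) + "#"

# Example: a 3-tape machine just after the input w = 0101 has been formatted
# (stage 1 of S, described next):
print(encode_tapes(["0101", "", ""], [0, 0, 0]))      # -> #0.101#_.#_.#
```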

Multi-tape TM vs. single-tape TM

S = On input w = w_1 w_2 ... w_n:
1. First S puts its tape into the format that represents all k tapes of M. The formatted tape contains # w_1 w_2 ... w_n # _ # _ # ... #, where _ denotes the blank symbol and the first symbol of each tape segment carries the head dot.
2. To simulate a single move, S scans its tape from the first #, which marks the left-hand end, to the (k+1)st #, which marks the right-hand end, in order to determine the symbols under the virtual heads. Then S makes a second pass to update the tapes according to the way that M's transition function dictates.
3. If at any point S moves one of the virtual heads to the right onto a #, this action signifies that M has moved the corresponding head onto the previously unread blank portion of that tape. So S writes a blank symbol on this tape cell and shifts the tape contents, from this cell until the rightmost #, one unit to the right. Then it continues the simulation as before.

Running time: Stage 1 takes O(n) steps and is executed once. Stages 2, 3: S simulates each of the t(n) steps of M using O(t(n)) steps. The length of the active portion of S's tape determines how long S takes to scan it, and a scan of the active portion uses O(t(n)) steps. (Why? M runs for at most t(n) steps, so each of its k tapes has an active portion of length at most t(n); the active portion of S's tape therefore has length at most k·t(n) + k + 1 = O(t(n)), since k is a constant.) Hence, the total time of S on an input of length n is O(n) + t(n)·O(t(n)) = O(t^2(n)).

Complexity relations among models: nondeterministic TM

A nondeterministic TM is a decider if all its computation branches halt on all inputs.

Def. 5: Let N be a nondeterministic TM that is a decider. The running time of N is the function f: N → N, where f(n) is the maximum number of steps that N uses on any branch of its computation on any input of length n.

Deterministic: δ: Q × Γ → Q × Γ × {L, R}. Nondeterministic: δ: Q × Γ → P(Q × Γ × {L, R}).

(Figure: a deterministic computation is a single path of length at most f(n) ending in accept or reject; a nondeterministic computation is a tree whose branches end in accept or reject.)

Theorem 2: Let t(n) be a function with t(n) ≥ n. Then every t(n) time nondeterministic TM has an equivalent 2^O(t(n)) time deterministic TM.

We have seen before that any nondeterministic TM N has an equivalent deterministic TM D that simulates it.
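The two transition-function types above can be pictured as Python dictionaries. The sketch below is our own toy encoding (the states, symbols, and moves are invented for illustration); it also shows how the branching factor b used in the next section is obtained.

```python
# Deterministic:      delta : Q x Gamma -> Q x Gamma x {L, R}
delta_det = {
    ("q0", "0"): ("q1", "x", "R"),                       # exactly one move per (state, symbol)
}

# Nondeterministic:   delta : Q x Gamma -> P(Q x Gamma x {L, R})
delta_nondet = {
    ("q0", "0"): {("q1", "x", "R"), ("q0", "0", "R")},   # a set of possible moves
}

# b = size of the largest set of choices; every node of N's computation tree
# has at most b children.
b = max(len(choices) for choices in delta_nondet.values())
print(b)   # -> 2
```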

Nondeterministic TMs vs. ordinary TMs

The simulating deterministic TM D has three tapes. Tape 1 always contains the input string and is never altered. Tape 2 maintains a copy of N's tape on some branch of its nondeterministic computation. Tape 3 keeps track of D's location in N's nondeterministic computation tree.

(Figure: D's three tapes: the input tape, the simulation tape, and the address tape.)

Every node in the tree can have at most b children, where b is the size of the largest set of possible choices given by N's transition function. Tape 3 contains a string over Σ_b = {1, 2, ..., b}. Each symbol in the string tells us which choice to make next when simulating a step on one branch of N's nondeterministic computation; the string as a whole is the address of a node in the tree.

On an input of length n, every branch of N's nondeterministic computation tree has length at most t(n). Hence the total number of leaves in the tree is at most b^t(n). The total number of nodes in the tree is less than twice the maximum number of leaves, i.e., it is bounded by O(b^t(n)). D reaches each node by replaying the corresponding branch from the root, which takes O(t(n)) steps, so the running time of D is O(t(n)·b^t(n)) = 2^O(t(n)). D has three tapes; converting it to a single-tape TM S at most squares the running time. So the running time of S is (2^O(t(n)))^2 = 2^O(2t(n)) = 2^O(t(n)).
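The counting above can be mirrored in a toy breadth-first simulation. The sketch below is entirely our own (simulate_ntm, guess_bit, and the string configurations are illustrative, not Sipser's formal three-tape construction): it replays one branch per address over {1, ..., b}, exactly as tape 3 directs, and for b ≥ 2 the number of addresses it tries is O(b^t), matching the node count above.

```python
from itertools import product

def simulate_ntm(step, start, max_depth, b):
    """Deterministically search N's computation tree breadth-first by address.

    step(config, choice) returns the next configuration, 'accept', 'reject',
    or None if that choice does not exist from the current configuration.
    Accepts if some branch of length <= max_depth accepts.
    """
    for depth in range(max_depth + 1):
        for address in product(range(1, b + 1), repeat=depth):
            config = start
            for choice in address:              # replay one branch from the root
                config = step(config, choice)
                if config in ("accept", "reject") or config is None:
                    break
            if config == "accept":
                return True
    return False

# Toy usage: nondeterministically guess a 3-bit string equal to 101 (b = 2 choices per step).
def guess_bit(bits, choice):
    bits = bits + ("0" if choice == 1 else "1")
    if len(bits) == 3:
        return "accept" if bits == "101" else "reject"
    return bits

print(simulate_ntm(guess_bit, "", 3, 2))        # -> True
```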