SEPARATING NOTIONS OF HIGHER-TYPE POLYNOMIAL-TIME
Robert Irwin (Syracuse University), Bruce Kapron (University of Victoria), Jim Royer (Syracuse University)


FUNCTIONALS & EFFICIENCY

Question: Suppose you have a programming language with a combinator
    C: (N → N) → (N → N) → (N → N)    with    ⟦C f g⟧ = ⟦f⟧ ∘ ⟦g⟧.
How would you say that an implementation of C is efficient?

What does NOT work: You could say simply ⟦f⟧, ⟦g⟧ ∈ Ptime ⟹ ⟦C f g⟧ ∈ Ptime. But this leaves open nasty possibilities such as ⟦f⟧, ⟦g⟧ ∈ Exptime & ⟦C f g⟧ ∉ PrimRec.

What we would like: To say something along the lines of
    the complexity of C f g  ≤  X_C(the complexity of f, the complexity of g)
for some sort of X_C. That is, we want to lift complexity to higher types.
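
As a concrete reference point, here is a minimal Haskell sketch of such a combinator, reading C as composition (the talk fixes no particular programming language, so compC and the little test are illustrative only):

    -- A composition combinator at type (N -> N) -> (N -> N) -> (N -> N),
    -- with N modelled by Integer.  Making "compC is efficient" precise is
    -- exactly what the slide is asking for.
    compC :: (Integer -> Integer) -> (Integer -> Integer) -> (Integer -> Integer)
    compC f g = f . g

    main :: IO ()
    main = print (compC (+ 1) (* 2) 10)   -- (10 * 2) + 1 == 21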

HIGHER-TYPE ANALOGUES OF PTIME

In higher-type computational complexity, one of the first landmarks to identify is a higher-type analogue of polynomial time. Why?

At type-level 2, the type-2 Basic Feasible Functionals (Mehlhorn 1974, Cook-Urquhart 1989) seem a reasonable choice. (More on this later.)

Above type-level 2 (in the simple types), there are alternative notions:
  - the Basic Feasible Functionals (Cook-Urquhart 1989)
  - the Class D (Seth 1995)
  - ...

Seth conjectured that BFF ⊊ D. In this paper, we clarify the nature of D (& lots of other things), and confirm Seth's conjecture.

CHARACTERIZATIONS OF THESE CLASSES

The Type-2 BFFs = the functionals computable through:
  - a type-2 fragment of PV^ω (Cook-Urquhart 1989)
  - Type-2 Bounded Typed Loop Programs (Cook-Kapron 1989)
  - a reasonable type-2 machine model (Kapron-Cook 1991)
  - Type-2 Inflationary Tiered Loop Programs (Irwin-Kapron-Royer 1998)
  - ...

The full BFFs = the functionals computable through:
  - PV^ω (Cook-Urquhart 1989)
  - Bounded Typed Loop Programs (Cook-Kapron 1989)
  - a poorly understood machine model (Seth 1995)
  - Inflationary Tiered Loop Programs (this paper)
  - ...

The Class D = the total functionals computable through:
  - a very poorly understood machine model (Seth 1995)

A BIT OF EXPLICIT COMPUTATIONAL COMPLEXITY

The conventional ingredients for a complexity class:
    a machine/cost model + a size measure for inputs + a family C of bounding functions

Some type-1 examples:
  - Det. TMs + length of a string + polynomials ⟹ Ptime
  - Det. SMMs + # of nodes in a graph + polynomials ⟹ Ptime
  - Nondet. TMs + length of a string + polynomials ⟹ NP
  - Det. TMs + length of a string + 2^polynomials ⟹ Exptime

A type-2 example:
  - OTMs with answer-length cost + lengths of strings & type-1 functions + 2nd-order polynomials ⟹ BFF_2

EXPLICIT COMPLEXITY AT TYPE-LEVEL 2

OTMs / answer-length cost:
  - Oracle Turing Machines.
  - An oracle query costs the length of the answer. Every other twitch of the machine costs 1.

Length of a type-1 function:
  - For f: N → N, define |f| = λn. max{ |f(x)| : |x| ≤ n }.

2nd-order polynomials:
  - P ::= num. const | type-0 var | P + P | P × P | (type-1 var)(P)
  - E.g., |f|(|f|(2·|y|) + 6)

The Type-2 BFFs:
  - The things computable on OTMs in (2nd-order) poly time in the lengths of the type-0 and type-1 inputs (Kapron-Cook 1991).
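
A small Haskell sketch of these two ingredients, under the simplifying assumption that |x| is the binary bit length of x (the slides use dyadic length, which differs only slightly); len, funLen, and poly are illustrative names:

    -- |x| as binary bit length (an assumption; the slides use dyadic length).
    len :: Integer -> Integer
    len 0 = 0
    len x = 1 + len (x `div` 2)

    -- |f|(n) = max { |f(x)| : |x| <= n }, by brute force over the finitely
    -- many x with bit length <= n.
    funLen :: (Integer -> Integer) -> Integer -> Integer
    funLen f n = maximum [ len (f x) | x <- [0 .. 2 ^ n - 1] ]

    -- The slide's sample 2nd-order polynomial |f|(|f|(2|y|) + 6).
    poly :: (Integer -> Integer) -> Integer -> Integer
    poly f y = fL (fL (2 * len y) + 6)  where fL = funLen f

    main :: IO ()
    main = print (poly (\x -> x * x) 3)   -- keep y small: funLen is exponential in n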

EXPLICIT COMPLEXITY ABOVE TYPE-LEVEL 2

SETH'S MACHINE CHARACTERIZATIONS OF BFF AND D
  - These are only partial analogues of the Kapron-Cook characterization. E.g., there is no explicit notion of the length of a higher-type function, no notion of polynomials over lengths, and no mention of the new ingredient that pops up at type-level 3: the underlying domain of computation. (More on this shortly.)
  - There is a lot of cleverness and insight in these characterizations.

WE WORK AT COMPLETING SETH'S PICTURE
  - We introduce higher-type notions of length and polynomials.
  - Rather than start with machines, we use typed programming formalisms. These type systems are based on h.t. polynomials over h.t. lengths.
  - These formalisms are used to treat h.t. machines. (Not in this talk!!!)

STEP 1: A REFERENCE P.L. FOR THE BFFS

Bounded Typed Loop Programs (BTLPs)
  - A simple imperative P.L.
  - Simple types over N, with ⟦N⟧ = the natural numbers, represented over {0, 1}.
  - Our BTLP = C-K's BTLP − global refs to type-N vars + λ-exps in proc calls.
  - No recursive procedures.
  - Iteration via the Loop construct:  Loop x with w do S Endloop  means:
      - S is repeated |x|-many times;
      - each assignment y := E in S is equivalent to: If |E| ≤ |w| then y := E else y := 0;
      - no assignments to x or w are allowed in S.
  - We avoid dealing with h.t. recursion in this paper.
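
A Haskell sketch of this Loop discipline, again assuming binary length for |·|; clip and loopWith are illustrative names, not BTLP syntax:

    -- |x| as binary bit length (an assumption; the slides use dyadic length).
    len :: Integer -> Integer
    len 0 = 0
    len x = 1 + len (x `div` 2)

    -- "y := E" inside "Loop x with w": keep E only if |E| <= |w|, else 0.
    clip :: Integer -> Integer -> Integer
    clip w e = if len e <= len w then e else 0

    -- "Loop x with w do S Endloop": run the body |x| times over a state s,
    -- passing w along so the body's assignments can be clipped.
    loopWith :: Integer -> Integer -> (Integer -> s -> s) -> s -> s
    loopWith x w body s0 = iterate (body w) s0 !! fromIntegral (len x)

    -- Example: repeatedly doubling a counter; the clip keeps it from outgrowing w.
    main :: IO ()
    main = print (loopWith 1000 50 (\w s -> clip w (2 * s)) 1)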

A BTLP PROCEDURE FOR (F, x) ↦ Σ_{i<|x|} F(λz.i)

Procedure SumUp(F: (N → N) → N, x: N)
  var i, maxi, sum;
  maxi := 0; i := 0;
  Loop x with x do                                 (* Find the i for which F(λz:N. i) is maximal *)
    If F(λz:N. maxi) < F(λz:N. i) then maxi := i; Endif;
    i := i + 1;
  Endloop;
  sum := 0; i := 0;
  Loop x with (x + 1) * (F(λz:N. maxi) + 1) do     (* Compute the sum *)
    sum := sum + F(λz:N. i);
    i := i + 1;
  Endloop;
  Return sum
End
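
For reference, a direct Haskell rendering of the functional SumUp computes (ignoring BTLP's bounding discipline); sumUp is an illustrative name, and binary length again stands in for |·|:

    len :: Integer -> Integer
    len 0 = 0
    len x = 1 + len (x `div` 2)

    -- SumUp(F, x) = sum of F(\z -> i) over i < |x|.
    sumUp :: ((Integer -> Integer) -> Integer) -> Integer -> Integer
    sumUp bigF x = sum [ bigF (const i) | i <- [0 .. len x - 1] ]

    main :: IO ()
    main = print (sumUp (\f -> f 0 + 1) 100)   -- here F(\z -> i) = i + 1 and |100| = 7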

STEP 2: HIGHER TYPE LENGTH

A FIRST ATTEMPT
For F: (N → N) → N, let's try:
    |F|(g) = max{ |F(f)| : |f| ≤ g },
where F ∈ Full_{(N→N)→N} and g ∈ TMon_{ω→ω}.

A FIRST PROBLEM
If F is unbounded on { f : f is 0-1 valued }, then |F|(λn.1) = ∞.

OPTION 1: Restrict F to TCont_{(N→N)→N}. Then König's Lemma saves us (for the moment).
OPTION 2: Restrict the max in the definition to better reflect what happens in BTLP proc calls.
We follow up on Option 1 here.

CONVENTIONS
  - ω = the tallies, 0*.
  - x ∈ N, n ∈ ω, f: N → N.
  - len(x) = the length of the dyadic representation of x;  |x| = 0^len(x).
  - |f|: ω → ω and |f|(n) = max{ |f(x)| : |x| ≤ n }.
  - Full_σ = all type-σ functionals.
  - TCont_σ = the total, continuous type-σ functionals.
  - TMon_σ = the hereditarily total, monotone increasing type-σ functionals.

LENGTH AT TYPE-LEVEL 3: A SECOND ATTEMPT

For Ψ: TCont_{((N→N)→N)→N}, let's try:
    |Ψ|(G) = max{ |Ψ(F)| : |F| ≤ G },
where F ranges over TCont_{(N→N)→N} and G ∈ TMon_{(ω→ω)→ω}.

A SECOND PROBLEM
There are Ψ that are unbounded on { F : F is 0-1 valued }, and hence have |Ψ|(λg.1) = ∞.

A (SORT OF) SOLUTION
Further restrict Ψ so that it is bounded on sets of the form { F : |F| ≤ G } for G ∈ TMon_{(ω→ω)→ω}
... and so on up the type hierarchy.

BOUNDED LENGTH AND BOUNDED CONTINUITY

What will have to pass for DEFINITIONS
  - σ = a simple type over N; σ_b = the corresponding simple type over ω.
  - For σ at type-levels 1 or 2: BCont_σ = TCont_σ and TLen_{σ_b} = TMon_{σ_b}.
  - ‖x‖_b = |x| for x ∈ N; BCont_N = N; TLen_ω = ω.
  - For σ = σ_0 → ⋯ → σ_n → N, f ∈ TCont_σ, ℓ_i ∈ TLen_{(σ_i)_b}, and x_i ∈ BCont_{σ_i}, define:
        ‖f‖_b(ℓ_0,...,ℓ_n) = sup{ ‖f(x_0,...,x_n)‖_b : ‖x_i‖_b ≤ ℓ_i }.
    BCont_σ = the f's with ‖f‖_b total.  TLen_{σ_b} = the total ‖f‖_b's.
  - For σ above type-level 2, BCont_σ ⊊ TCont_σ and TLen_{σ_b} has noncontinuous elements.
  - In general we have ‖x_0(x_1,...,x_k)‖_b ≤ ‖x_0‖_b(‖x_1‖_b,...,‖x_k‖_b), but not equality.

STEP 3: HIGHER-TYPE POLYNOMIALS

Example: Consider the bounded-continuous Ψ given by Ψ(F, x) = Σ_{i<|x|} F(λz.i).
We can show that
    |Ψ(F, x)| ≤ (|x| + 1) · ‖F‖_b(λn.|x|).
We want the RHS to be a polynomial bound in |x| and ‖F‖_b.

HIGHER TYPE POLYNOMIALS (HTPs)
  - Syntax: the simply typed λ-calculus + the base type ω + plus and times.
  - Semantics: ⟦·⟧_TLen : syntax → TLen ... the obvious thing.
  - Depth: depth(t) = the max depth of nesting of applications in t's normal form.

MORE ON THE DEPTH OF HTPs

Example
    depth( (k + 1) · G(λn.m) ) = 1.
    depth( G(λn. G(λm. g(m + n))) ) = 3.

LEMMA. Suppose that t, t_1, ..., t_n are HTP terms and s = t[x_1,...,x_n := t_1,...,t_n]. Let
    a = depth(t),   b = max{ depth(t_i) : i = 1,...,n },
    l = the max type-level of the types assigned to the t_i's.
Then depth(s) ≤ e_l(a, b), where:
    e_0(a, b) = a + b,
    e_{l+1}(0, b) = b,
    e_{l+1}(a+1, b) = max( 1 + e_{l+1}(a, b), e_l(b, e_{l+1}(a, b)) ).
The e_l's climb the Grzegorczyk hierarchy.
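
The recurrence is easy to experiment with; here is a direct Haskell transcription of it (the name e and the sample arguments are just for illustration):

    -- e_l(a, b) from the lemma above.
    e :: Integer -> Integer -> Integer -> Integer
    e 0 a b = a + b
    e _ 0 b = b
    e l a b = max (1 + r) (e (l - 1) b r)
      where r = e l (a - 1) b

    main :: IO ()
    main = mapM_ print [ e l 3 3 | l <- [0 .. 3] ]   -- 6, 12, 192, then a huge value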

STEP 4: INFLATIONARY TYPED LOOP PROGRAMS

ITLP is an imperative P.L. similar to BTLP, except that:
  - types have a complexity component,
  - there is a more dynamic iteration construct,
  - ...

ITLP Types
An ITLP type is a pair (σ, d), where:
  - σ is a simple type over N (the shape);
  - d is an element of ω (the depth).
We often write (N, d) as N_d and (σ_0 → ⋯ → σ_n → N, d) as σ_0 → ⋯ → σ_n →_d N.
  - X: (σ, d) means X is a σ-object with a depth-d HTP bound.
  - X: (σ, 0) means X is a parameter.

SOME ITLP TYPING RULES

The Abstraction Rule
    From  v_0: (σ_0, 0), ..., v_n: (σ_n, 0)  and  v(v_0,...,v_m): N_i,
    infer  λv_0,...,v_n. v(v_0,...,v_m): σ_0 → ⋯ → σ_n →_i N.
(We'll have i > 0 in all the actual uses of the above rule.)

Procedure Application Rule
    From  v: σ_0 → ⋯ → σ_n →_{i+1} N  and  a_0: (σ_0, d_0), ..., a_n: (σ_n, d_n),
    infer  v(a_0,...,a_n): N_{e_l(i+1, d)},
    where d = max_{j≤n} d_j and l = max_{j≤n} type-level(σ_j).

Parameter Application Rule
    From  v: σ_0 → ⋯ → σ_n →_0 N  and  a_0: (σ_0, d_0), ..., a_n: (σ_n, d_n),
    infer  v(a_0,...,a_n): N_{d+1},
    where d = max_{j≤n} d_j.
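
A Haskell sketch of the depth bookkeeping in the two application rules (procAppDepth and paramAppDepth are illustrative names; e is the e_l bound from the depth lemma):

    -- The e_l's from the depth lemma.
    e :: Integer -> Integer -> Integer -> Integer
    e 0 a b = a + b
    e _ 0 b = b
    e l a b = max (1 + r) (e (l - 1) b r)  where r = e l (a - 1) b

    -- Procedure application: a depth-(i+1) procedure applied to arguments with
    -- depths ds and type-levels ls gets result depth e_l(i+1, d).
    procAppDepth :: Integer -> [Integer] -> [Integer] -> Integer
    procAppDepth i ds ls = e (maximum (0 : ls)) (i + 1) (maximum (0 : ds))

    -- Parameter application: a depth-0 (parameter) head gets result depth d + 1.
    paramAppDepth :: [Integer] -> Integer
    paramAppDepth ds = maximum (0 : ds) + 1

    main :: IO ()
    main = print (procAppDepth 0 [3, 0] [0, 0], paramAppDepth [1, 4])   -- (4, 5)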

MORE ON ITLP

Type-Level 0 Downward Coercion
ITLP includes a primitive down with the intended semantics
    down(x, y) = y, if y ≤ x;    down(x, y) = 0, if y > x
and the typing rule
    From  v_0: N_i  and  v_1: N_j,  infer  down(v_0, v_1): N_i   (where i < j).
We'll see an important variant of this later.

Semantics
We have a Full-based and a BCont-based semantics:
    ⟦·⟧_Full : syntax → D_Full        ⟦·⟧_BCont : syntax → D_BCont
Elements of D_X are of the form (x, d), where x ∈ X_σ for some σ and d ∈ ω. Also
    (x_0, d_0)((x_1, d_1), ..., (x_n, d_n)) = (x_0(x_1,...,x_n), d),
where d is determined per the ITLP typing rules.
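
The intended semantics of down is small enough to state as a Haskell one-liner (Integer standing in for N):

    -- down(x, y) = y if y <= x, and 0 otherwise.
    down :: Integer -> Integer -> Integer
    down x y = if y <= x then y else 0

    main :: IO ()
    main = print (map (down 5) [3, 5, 9])   -- [3, 5, 0]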

SOME THEOREMS, FINALLY!!

THEOREM (Poly Boundedness). Suppose P is a closed type-(τ, d) ITLP procedure. Then there is a type-τ_b, depth-d HTP p such that ‖⟦P⟧_BCont‖_b ≤ ⟦p⟧_TLen.

THEOREM (BTLP ≤ ITLP). Any BTLP procedure can be translated to an equivalent ITLP procedure, wrt the Full-semantics.

COROLLARY
(a) BTLP is poly-bounded on BCont args.
(b) BTLP has a sensible BCont-based semantics.

DEFINITION. BCBFF = the BTLP-computable functionals over BCont.

STEP 5: FLATTENING ITLP

ITLP⁻ = ITLP − the Procedure Application Rule (with the e_l's) + the ITLP⁻ Procedure Application Rule:
    From  v: σ_0 → ⋯ → σ_n →_{i+1} N  and  a_0: (σ_0, d_0), ..., a_n: (σ_n, d_n),
    infer  v(a_0,...,a_n): N_{i+d+1},
    where d = max_{j≤n} d_j and where, for each j = 0,...,n, d_j is taken to be 0 if σ_j is an arrow type.

ITLP⁻ is reasonably close to Seth's machine model for the BFFs.

THEOREM (ITLP⁻ ≡ ITLP ≡ BTLP). ITLP⁻, ITLP, and BTLP are inter-translatable. But some directions are more pleasant than others.

COROLLARY. ITLP⁻, ITLP, and BTLP each:
(a) determine BFF wrt their Full-based semantics;
(b) determine BCBFF wrt their BCont-based semantics.

A SIDE STEP: ADDING H.T. DOWNWARD COERCIONS

Suppose we extend ITLP to allow terms such as
    (∗)   λy. down(bnd(y), g(x,y))
where, say, g: N → N →_1 N, bnd: N →_1 N, and x: N_3.

How do we type (∗)? Suppose down has the type N → N →_0 N. Then ITLP's rules assign (∗) the type N →_6 N. But we know that
    ‖⟦λy. down(bnd(y), g(x,y))⟧_BCont‖_b ≤ ‖⟦bnd⟧_BCont‖_b,
and this holds independent of the value of x! So, why not assign (∗) the same depth as bnd?

ITLP↓ is such an extension, with such typing rules. So what?
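
A Haskell sketch of the term (∗), showing concretely why its output is bounded by bnd alone no matter what g and x are (coerced and the sample g, bnd, x are illustrative):

    down :: Integer -> Integer -> Integer
    down x y = if y <= x then y else 0

    -- (*) as a function of its free variables bnd, g, and x: the value at
    -- each y is clipped to at most bnd y, however large g x y may be.
    coerced :: (Integer -> Integer)
            -> (Integer -> Integer -> Integer)
            -> Integer
            -> (Integer -> Integer)
    coerced bnd g x = \y -> down (bnd y) (g x y)

    main :: IO ()
    main = print (map (coerced (\y -> y + 1) (\x y -> x * y) (10 ^ 12)) [0 .. 5])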

MORE ON ITLP↓: A KEY EXAMPLE

Procedure H(F: (N → N) →_0 N): N_3
  Procedure bnd(y: N_0): N_1
    Return 0^1
  End
  Procedure g(x: N_0, y: N_0): N_1
    If x = y then Return 1 else Return 0
  End
  (* Note: λy. g(x,y) = C_{x}, the characteristic function of {x}. *)
  var w: N_3, x: N_3;
  x := 0^2;
  For w := 0^1 to |x| do
    (∗)   x := F(λy. down(bnd(y), g(x,y)));
  Endfor;
  Return x
End

Suppose F_0 ∈ Full_{(N→N)→N} is given by:
    F_0(f) = 0^{|x|+1}, if f = C_{x};
    F_0(f) = 0^0,       if f ≠ C_{x} for every x.
Then H(F_0) is undefined. However:

THEOREM (Poly Boundedness). Suppose P is a closed type-(τ, d) ITLP↓ procedure. Then there is a type-τ_b, depth-d HTP p such that ‖⟦P⟧_BCont‖_b ≤ ⟦p⟧_TLen.

What is going on?
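
To see the mechanics, here is a Haskell rendering of H's control structure, under assumed encodings (Integer for N, binary length for |·|, plain numbers standing in for the tallies 0^k) and under the reading that the For loop re-tests w against the current |x|, which is what ITLP's "more dynamic iteration construct" suggests. The F_0 above is not continuous, so no Haskell argument literally reproduces it; the sketch only illustrates why an F that keeps lengthening x would make the loop run forever.

    len :: Integer -> Integer
    len 0 = 0
    len x = 1 + len (x `div` 2)

    down :: Integer -> Integer -> Integer
    down b y = if y <= b then y else 0

    -- H(F): x starts with length 2, and each pass feeds F the (coerced)
    -- characteristic function of {x}; the loop runs while w <= |x|.
    h :: ((Integer -> Integer) -> Integer) -> Integer
    h bigF = go 1 3                        -- 3 stands in for 0^2 (length 2)
      where
        bnd _ = 1                          -- stands in for 0^1
        g x y = if x == y then 1 else 0    -- \y -> g x y  is  C_{x}
        go w x
          | w > len x = x
          | otherwise = go (w + 1) (bigF (\y -> down (bnd y) (g x y)))

    main :: IO ()
    main = print (h (\f -> f 0 + f 1))     -- a harmless F; this terminates quickly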

THE BFF VERSUS D SEPARATIONS

DEFINITIONS
  - D↓ = the class of total ITLP↓-computable functionals over Full.
  - D = the class of total functionals computable by Seth's D-machines over Full.
  - BCD↓ = the class of ITLP↓-computable functionals over BCont.
  - BCD = the class of functionals computable by Seth's D-machines over BCont.

THEOREM
  - BCBFF ⊊ BCD↓, with a witness at type-level 3.
  - BFF ⊊ D↓, with a witness at type-level 3.
  - BCD↓ ⊆ BCD and D↓ ⊆ D. These containments are likely strict.

CONCLUSIONS

When computing over Full:
  - It seems unlikely that D is recursively presentable.
  - Hence, BFF seems the more natural class.

When computing over BCont:
  - BCBFF seems too small.
  - Hence, BCD seems the more natural class.

But:
  - Is either Full or BCont a reasonable thing to compute over? Can we make this question more precise?
  - Are there natural notions of feasibility for other computational domains?

While both the complexity theory and the semantics above are fairly humble, they mesh!