CMPSCI 630: Programming Languages
An Abstract Machine for Control
Spring 2009 (with thanks to Robert Harper)

What is a Computer?

State of the machine: internal registers, memory, etc.; initial and final states.
Instructions for transforming the state: must be effective, usually simple.
Rules for determining the order of execution.
Execution = instruction-by-instruction state transformation.

What is a Computer?

A computer is a transition system:
a set of states S,
initial states I ⊆ S,
final states F ⊆ S,
a transition relation ↦ ⊆ S × S.
Execution = transition sequence.

The L Machines

We can view L{num str}, L{nat⇀}, etc. as implicitly defining abstract machines, which we might call the L Machines.
States: closed expressions (of simple arithmetic, PCF, etc.).
Initial: any well-typed, closed expression.
Final: closed values.
Transitions: ↦ is given by the structural semantics rules.
Evaluation: e ↦* v.

The L Machines

The L Machines are very high-level, in two senses:
Control. Complex search rules specify the order of execution; they rely on a meta-stack to manage the search.
Data. Parameter passing is by substitution, which is complex and generates new code on the fly.
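To make the transition-system reading concrete, here is a minimal OCaml sketch (illustrative only; the names machine and execute are chosen here, not taken from the lecture): a machine is a state type together with a partial step function and a final-state predicate, and execution simply iterates the step function.

  (* A machine over states of type 'state: a partial step function and a
     final-state predicate.  Execution = instruction-by-instruction state
     transformation, i.e. iterating [step] until a final (or stuck) state. *)
  type 'state machine = {
    step  : 'state -> 'state option;   (* at most one successor per state *)
    final : 'state -> bool;
  }

  let rec execute (m : 'state machine) (s : 'state) : 'state =
    if m.final s then s
    else match m.step s with
      | Some s' -> execute m s'
      | None    -> s                   (* stuck: no transition applies *)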

Managing Data

We've already considered alternatives to the L Machines. At least one was aimed at supporting a more realistic treatment of data:
Environment semantics: combine hypothetical judgements and evaluation semantics for a more realistic treatment of variable binding.

Managing Control

The L Machines cheat by relying on implicit storage management.
Search rules have one or more premises that must be applied recursively.
The interpreter is not tail recursive (iterative).
Real machines cannot rely on implicit storage management!

Managing Control

For example,

  e₁ ↦ e₁′
  -------------------------
  ap(e₁; e₂) ↦ ap(e₁′; e₂)

To apply this rule, we must (as the contextual semantics made somewhat more explicit):
1. Save the current state: ap(-; e₂).
2. Execute a step in e₁, obtaining e₁′.
3. Restore the state: ap(e₁′; e₂).

Managing Control

The K{nat⇀} abstract machine for the language L{nat⇀} makes the flow of control explicit in the state of the machine.
An explicit control stack, or continuation, manages the flow of control.
Transition rules will have no premises ⟹ fully iterative (tail recursive) implementation.

The K{nat⇀} Abstract Machine

The state of the K Machine is a pair (k, e), where
k is a control stack, or continuation;
e is a closed expression.
The K{nat⇀} transition relation ↦ is defined inductively by a set of rules.

The K{nat⇀} Abstract Machine

The state of K{nat⇀} takes one of two forms:
an evaluation state k ▷ e, which corresponds to evaluating the closed expression e relative to the control stack k;
a return state k ◁ e, where e val, which corresponds to evaluating the stack k relative to the closed value e.
The separator points to the focal entity of the state.
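For reference in later sketches, here is one way the abstract syntax and the two state forms might be rendered in OCaml. This is a sketch under the conventions of these slides (eager numbers, by-name application); all constructor names are chosen here.

  (* Abstract syntax of L{nat⇀} (PCF-like) expressions. *)
  type var = string
  type typ = Nat | Parr of typ * typ        (* nat and parr(τ₂; τ) *)
  type exp =
    | Var of var
    | Z                                     (* z *)
    | S of exp                              (* s(e) *)
    | Ifz of exp * exp * var * exp          (* ifz(e; e₁; x.e₂) *)
    | Lam of typ * var * exp                (* lam[τ](x.e) *)
    | Ap of exp * exp                       (* ap(e₁; e₂) *)
    | Fix of typ * var * exp                (* fix[τ](x.e) *)

  (* The judgement e val: numerals and λ-abstractions are values. *)
  let rec is_val = function
    | Z -> true
    | S e -> is_val e
    | Lam _ -> true
    | _ -> false

  (* Frames, control stacks (head of the list = topmost frame, ε = []),
     and the two forms of K{nat⇀} state. *)
  type frame =
    | FSucc                                 (* s(-) *)
    | FIfz of exp * var * exp               (* ifz(-; e₁; x.e₂) *)
    | FAp of exp                            (* ap(-; e₂) *)
  type stack = frame list
  type state =
    | Eval of stack * exp                   (* k ▷ e *)
    | Return of stack * exp                 (* k ◁ e, with e val *)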

The K{nat⇀} Abstract Machine: Control Stacks

A control stack, or continuation, represents the context of evaluation, into which the value of the current expression is to be returned.
Formally, a control stack is a list of stack frames:

  ε stack

  k stack    f frame
  ------------------
  k; f stack

Think of pushing a frame onto the control stack.

The K{nat⇀} Abstract Machine

States: as just described.
Initial: ε ▷ e.
Final: ε ◁ e with e val.
Transitions: ↦ is given by structural semantics rules.
Evaluation: ε ▷ e ↦* ε ◁ e′ with e′ val.

The K{nat⇀} Abstract Machine: Stack Frames

A stack frame records one pending computation. For K{nat⇀} the appropriate stack frames are inductively defined as follows:

  s(-) frame    ifz(-; e₁; x.e₂) frame    ap(-; e₂) frame

Think of a frame as an abstract return address; the - marks the return point.
Frames correspond to the rules with transition premises (the search rules) of L{nat⇀}: an explicit record of pending computations replaces reliance on the structure of the transition derivation to record the context.

The K{nat⇀} Abstract Machine: Natural Numbers

To evaluate z we simply return it:

  k ▷ z ↦ k ◁ z

To evaluate s(e) we first evaluate the argument:

  k ▷ s(e) ↦ k; s(-) ▷ e

... and when that completes we return its successor to the stack:

  k; s(-) ◁ e ↦ k ◁ s(e)

The K{nat⇀} Abstract Machine: Case Analysis

First we evaluate the test:

  k ▷ ifz(e; e₁; x.e₂) ↦ k; ifz(-; e₁; x.e₂) ▷ e

... and then decide how to proceed:

  k; ifz(-; e₁; x.e₂) ◁ z ↦ k ▷ e₁
  k; ifz(-; e₁; x.e₂) ◁ s(e) ↦ k ▷ [e/x]e₂

The K{nat⇀} Abstract Machine: Functions

To evaluate lam[τ](x.e) we simply return it:

  k ▷ lam[τ](x.e) ↦ k ◁ lam[τ](x.e)

To evaluate ap(e₁; e₂) we first evaluate the function:

  k ▷ ap(e₁; e₂) ↦ k; ap(-; e₂) ▷ e₁

... and then perform the application:

  k; ap(-; e₂) ◁ lam[τ](x.e) ↦ k ▷ [e₂/x]e
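Building on the datatypes sketched earlier, the transition rules can be transcribed almost literally into a step function. The following is only a sketch (by-name application and naive substitution of closed arguments, as in the slides); the names subst, step, run and eval are chosen here, and the clause for fix transcribes the general-recursion rule that appears on a later slide.

  (* [e'/x]e for closed e'; a naive traversal suffices because e' has no
     free variables that could be captured. *)
  let rec subst e' x e =
    match e with
    | Var y -> if y = x then e' else e
    | Z -> Z
    | S e1 -> S (subst e' x e1)
    | Ifz (e0, e1, y, e2) ->
        Ifz (subst e' x e0, subst e' x e1, y,
             if y = x then e2 else subst e' x e2)
    | Lam (t, y, b) -> if y = x then e else Lam (t, y, subst e' x b)
    | Ap (e1, e2) -> Ap (subst e' x e1, subst e' x e2)
    | Fix (t, y, b) -> if y = x then e else Fix (t, y, subst e' x b)

  exception Stuck

  (* One K{nat⇀} transition; each clause transcribes one machine rule. *)
  let step = function
    | Eval (k, Z)                         -> Return (k, Z)
    | Eval (k, S e)                       -> Eval (FSucc :: k, e)
    | Return (FSucc :: k, e)              -> Return (k, S e)
    | Eval (k, Ifz (e, e1, x, e2))        -> Eval (FIfz (e1, x, e2) :: k, e)
    | Return (FIfz (e1, _, _) :: k, Z)    -> Eval (k, e1)
    | Return (FIfz (_, x, e2) :: k, S e)  -> Eval (k, subst e x e2)
    | Eval (k, (Lam _ as l))              -> Return (k, l)
    | Eval (k, Ap (e1, e2))               -> Eval (FAp e2 :: k, e1)
    | Return (FAp e2 :: k, Lam (_, x, e)) -> Eval (k, subst e2 x e)
    | Eval (k, (Fix (_, x, e) as f))      -> Eval (k, subst f x e)
    | _                                   -> raise Stuck

  (* Iterate ↦ to a final state ε ◁ v and return v. *)
  let rec run s =
    match s with
    | Return ([], v) -> v
    | _ -> run (step s)

  let eval e = run (Eval ([], e))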

The K{nat⇀} Abstract Machine: General Recursion

  k ▷ fix[τ](x.e) ↦ k ▷ [fix[τ](x.e)/x]e

Note that evaluation of general recursion requires no stack space!

Safety for the K{nat⇀} Abstract Machine

Safety is based on a new typing judgement, k : τ, meaning that the stack k is well-formed and expects a value of type τ.
A stack is well-formed if, when passed a value of the appropriate type, it safely transforms that value into another value.
A stack represents the work remaining to complete a computation, so k : τ means that the stack k transforms a value of type τ into a value.
This judgement is inductively defined by:

  ε : τ

  k : τ′    f : τ ⇒ τ′
  --------------------
  k; f : τ

Frame Transformation Judgement

The auxiliary judgement f : τ ⇒ τ′, stating that f transforms a value of type τ to a value of type τ′, is inductively defined by:

  s(-) : nat ⇒ nat

  e₁ : τ    x : nat ⊢ e₂ : τ
  --------------------------
  ifz(-; e₁; x.e₂) : nat ⇒ τ

  e₂ : τ₂
  ---------------------------
  ap(-; e₂) : parr(τ₂; τ) ⇒ τ

Well-Formed States for the K{nat⇀} Abstract Machine

The two forms of state for the K{nat⇀} Machine are well-formed provided that their stack and expression components match. This judgement is inductively defined by:

  k : τ    e : τ
  --------------
  k ▷ e ok

  k : τ    e : τ    e val
  -----------------------
  k ◁ e ok

Type Safety

Theorem 1 (Preservation): If s ok and s ↦ s′, then s′ ok.

Theorem 2 (Progress): If s ok then either
1. s final, or
2. there exists s′ such that s ↦ s′.

Correctness of the K{nat⇀} Abstract Machine

Does the K{nat⇀} abstract machine correctly implement the PCF L Machine? Given a PCF expression (program) e, do we get the same result from evaluating e with the K{nat⇀} Abstract Machine as with the PCF L Machine, and vice versa?
We want the following relationship between the K{nat⇀} and the PCF L Machines:
1. [Completeness] If e ↦* e′, where e′ val, then ε ▷ e ↦* ε ◁ e′.
2. [Soundness] If ε ▷ e ↦* ε ◁ e′, then e ↦* e′ with e′ val.
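Before turning to the correctness proof, note that the stack- and frame-typing judgements above can also be read as a checking algorithm. Here is a sketch in the same OCaml setting, parameterised by a hypothetical typing function typeof for expressions of the underlying language (not shown here); the names frame_type, stack_ok and state_ok are chosen for illustration.

  (* typeof : (var * typ) list -> exp -> typ option  is assumed given.
     frame_type returns Some τ′ exactly when  f : τ ⇒ τ′  holds. *)
  let frame_type typeof f tau =
    match f, tau with
    | FSucc, Nat -> Some Nat
    | FIfz (e1, x, e2), Nat ->
        (match typeof [] e1, typeof [(x, Nat)] e2 with
         | Some t1, Some t2 when t1 = t2 -> Some t1
         | _ -> None)
    | FAp e2, Parr (t2, t) ->
        (match typeof [] e2 with
         | Some t2' when t2' = t2 -> Some t
         | _ -> None)
    | _ -> None

  (* stack_ok checks  k : τ ; the empty stack accepts a value of any type. *)
  let rec stack_ok typeof k tau =
    match k with
    | [] -> true
    | f :: k' ->
        (match frame_type typeof f tau with
         | Some tau' -> stack_ok typeof k' tau'
         | None -> false)

  (* state_ok checks  k ▷ e ok  and  k ◁ e ok ; for return states the
     e val side condition is checked with is_val from the earlier sketch. *)
  let state_ok typeof = function
    | Eval (k, e) ->
        (match typeof [] e with Some tau -> stack_ok typeof k tau | None -> false)
    | Return (k, e) ->
        is_val e &&
        (match typeof [] e with Some tau -> stack_ok typeof k tau | None -> false)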

To prove completeness, proceed by induction on the definition of e ↦* e′. This reduces to two cases:
1. If e val, then ε ▷ e ↦* ε ◁ e.
2. If e ↦ e′, then for every v val, if ε ▷ e′ ↦* ε ◁ v, then ε ▷ e ↦* ε ◁ v.
The first of these is easily proven by induction on the structure of e. Proving the second requires an inductive analysis of the derivation of e ↦ e′. But there is a problem: we can't just consider empty stacks, since, for example, if e is ap(e₁; e₂) the first step of the K{nat⇀} abstract machine is ε ▷ ap(e₁; e₂) ↦ ε; ap(-; e₂) ▷ e₁. So more generally we need to prove:

If e ↦ e′, then for every v val, if k ▷ e′ ↦* k ◁ v then k ▷ e ↦* k ◁ v.

Returning to our example where e is ap(e₁; e₂) and e′ is ap(e₁′; e₂) with e₁ ↦ e₁′: we are given that k ▷ ap(e₁′; e₂) ↦* k ◁ v and need to show that k ▷ ap(e₁; e₂) ↦* k ◁ v. It is easy to show that the first step of the former transition sequence is k ▷ ap(e₁′; e₂) ↦ k; ap(-; e₂) ▷ e₁′. But this leaves us needing to apply induction to the derivation of e₁ ↦ e₁′, which in turn requires having a v₁ such that e₁′ evaluates to v₁, and the structural semantics doesn't give us such a value.

Idea: if e ↦ e′, then any complete execution from e′ in any context determines a corresponding complete execution from e in that context. The evaluation semantics gives us the connection to the ultimate value v of each sub-expression that we require.

Theorem 3: If e ⇓ v, then for every k stack, k ▷ e ↦* k ◁ v.

The proof is by induction on the evaluation semantics for the PCF L Machine. Consider, for example, the rule:

  e₁ ⇓ lam[τ₂](x.e)    [e₂/x]e ⇓ v
  --------------------------------
  ap(e₁; e₂) ⇓ v

For an arbitrary control stack k we are to show that k ▷ ap(e₁; e₂) ↦* k ◁ v. Using the inductive hypotheses in succession, interleaved with steps of the abstract machine, we calculate:

  k ▷ ap(e₁; e₂)
    ↦  k; ap(-; e₂) ▷ e₁
    ↦* k; ap(-; e₂) ◁ lam[τ₂](x.e)
    ↦  k ▷ [e₂/x]e
    ↦* k ◁ v

The other cases are handled similarly.
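As a concrete instance of the calculation in the proof of Theorem 3, here is a trace of the sketch interpreter above on the small program ap(lam[nat](x.s(x)); z), an expression chosen here purely for illustration; the states it visits are listed in the notation of the slides.

  let example = Ap (Lam (Nat, "x", S (Var "x")), Z)

  (* Collect every state visited on the way to the final state. *)
  let rec trace s =
    s :: (match s with Return ([], _) -> [] | _ -> trace (step s))

  (* trace (Eval ([], example)) visits, in order:
       ε ▷ ap(lam[nat](x.s(x)); z)
       ε; ap(-; z) ▷ lam[nat](x.s(x))
       ε; ap(-; z) ◁ lam[nat](x.s(x))
       ε ▷ s(z)
       ε; s(-) ▷ z
       ε; s(-) ◁ z
       ε ◁ s(z)
     so  eval example = S Z,  matching  ap(lam[nat](x.s(x)); z) ⇓ s(z). *)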

To prove soundness, we would like to proceed by induction on the multistep transition ε ▷ e ↦* ε ◁ v. But this is awkward: the intervening steps may alternate between evaluation and return states. Instead, we will view each K{nat⇀} abstract machine state as encoding a PCF expression and show that K{nat⇀} abstract machine transitions are simulated by PCF L Machine transitions under this encoding.

We call the encoding unravel and write s ⇝ e, meaning that state s unravels to expression e. For initial states s = ε ▷ e and final states s = ε ◁ e, we have s ⇝ e. We then show that if s ↦* s′, where s′ final, s ⇝ e, and s′ ⇝ e′, then e′ val and e ↦* e′.

To do this, we show the following two facts:
1. If s ⇝ e and s final, then e val.
2. If s ↦ s′, s ⇝ e, s′ ⇝ e′, and e′ ↦* v where v val, then e ↦* v.
The first of these is obvious, since the unravelling of a final state is a value. The second requires proving: if s ↦ s′, s ⇝ e, and s′ ⇝ e′, then e ↦* e′.

The judgement s ⇝ e, where s is either k ▷ e or k ◁ e, is defined in terms of the auxiliary judgement k ⋈ e = e′ by the following rules:

  k ⋈ e = e′        k ⋈ e = e′
  ----------        ----------
  k ▷ e ⇝ e′        k ◁ e ⇝ e′

That is, to unravel a state we wrap the stack around the expression.

The wrap relation ⋈ is inductively defined as follows:

  ε ⋈ e = e

  k ⋈ s(e) = e′
  ----------------
  k; s(-) ⋈ e = e′

  k ⋈ ifz(e₁; e₂; x.e₃) = e′
  ----------------------------
  k; ifz(-; e₂; x.e₃) ⋈ e₁ = e′

  k ⋈ ap(e₁; e₂) = e′
  ---------------------
  k; ap(-; e₂) ⋈ e₁ = e′

Lemma 4: The judgement s ⇝ e has mode (∀, ∃!) and the judgement k ⋈ e = e′ has mode (∀, ∀, ∃!). That is, each state unravels to a unique expression, and the result of wrapping a stack around an expression is uniquely determined. We are therefore justified in writing k ⋈ e for the unique e′ such that k ⋈ e = e′.
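Continuing the OCaml sketch, the wrap and unravel judgements are functional, which is exactly the content of Lemma 4; the function names wrap and unravel are chosen here.

  (* wrap k e computes the unique e′ with k ⋈ e = e′: plug e into the
     topmost frame and continue outward through the rest of the stack. *)
  let rec wrap k e =
    match k with
    | [] -> e
    | FSucc :: k' -> wrap k' (S e)
    | FIfz (e1, x, e2) :: k' -> wrap k' (Ifz (e, e1, x, e2))
    | FAp e2 :: k' -> wrap k' (Ap (e, e2))

  (* unravel s computes the e with s ⇝ e. *)
  let unravel = function
    | Eval (k, e) | Return (k, e) -> wrap k e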

The next lemma states that unravelling preserves the transition relation.

Lemma 5: If e ↦ e′, k ⋈ e = d, and k ⋈ e′ = d′, then d ↦ d′.

The proof is by induction on the transition e ↦ e′. When the transition is a search rule (i.e., has a premise), the proof is an easy induction. When the transition is an axiom, the proof is by an inductive analysis of the stack k.

Search rule case: for example, suppose e = ap(e₁; e₂) and e′ = ap(e₁′; e₂) with e₁ ↦ e₁′. We have k ⋈ e = d and k ⋈ e′ = d′. By the inductive definition of wrap, k; ap(-; e₂) ⋈ e₁ = d and k; ap(-; e₂) ⋈ e₁′ = d′. So by induction, d ↦ d′ as desired.

Axiom case: for example, suppose e = ap(lam[τ₂](x.e₀); e₂) and e′ = [e₂/x]e₀, with e ↦ e′ directly. Assume that k ⋈ e = d and k ⋈ e′ = d′; we are to show that d ↦ d′. We proceed by inner induction on the structure of k. If k = ε, the result is immediate. Otherwise consider, for instance, the stack k = k′; ap(-; c₂). By the inductive definition of wrap, k′ ⋈ ap(e; c₂) = d and k′ ⋈ ap(e′; c₂) = d′. But by the dynamic semantics rules, ap(e; c₂) ↦ ap(e′; c₂). So by the inner inductive hypothesis we have d ↦ d′ as desired.

We are finally in a position to prove:

Theorem 6: If s ↦ s′, s ⇝ e, and s′ ⇝ e′, then e ↦* e′.

The proof is by case analysis on the transition of K{nat⇀}. In each case, after unravelling, the transition corresponds to zero or one transitions of L{nat⇀}.

Suppose that s = k ▷ s(e₀) and s′ = k; s(-) ▷ e₀. Note that k ⋈ s(e₀) = e iff k; s(-) ⋈ e₀ = e, so both states unravel to the same expression, and the result follows immediately (zero transitions).

Suppose that s = k; ap(-; e₂) ◁ lam[τ](x.e₁) and s′ = k ▷ [e₂/x]e₁. Let e be such that k; ap(-; e₂) ⋈ lam[τ](x.e₁) = e and e′ be such that k ⋈ [e₂/x]e₁ = e′. Observe that k ⋈ ap(lam[τ](x.e₁); e₂) = e. Since ap(lam[τ](x.e₁); e₂) ↦ [e₂/x]e₁, the result follows from Lemma 5.

Summary

Abstract machines come in all shapes and sizes. They differ in how many details of execution are made explicit. The K{nat⇀} abstract machine manages control explicitly.