Superficially Substructural Types (Technical Appendix)
Neelakantan R. Krishnaswami (MPI-SWS)
Derek Dreyer (MPI-SWS)
Aaron Turon (Northeastern University)
Deepak Garg (MPI-SWS)

July 2012

Contents

1 The Language
  1.1 Syntax
  1.2 Typing
  1.3 Sharing Construct
  1.4 Operational Semantics
2 Semantics
  2.1 Semantics of Sorts and Kinds
  2.2 Logical Semantics
  2.3 Worlds
  2.4 Semantics of Types
  2.5 Semantics of Contexts
  2.6 Semantics of Typing Judgments
3 Differences from the Paper
4 Fundamental Theorem and Soundness
1 The Language

1.1 Syntax

Sorts          σ ::= N | 1 | σ × σ | 2 | Loc | σ → σ | σ option | seq σ | P(σ)
Index Terms    t ::= X | n | t + t | t · t | ⟨t, t⟩ | π₁ t | π₂ t | tt | ff | if(t, t, t) | λX : σ. t | t u
                   | case(t, t, X ⇒ t) | ε | t :: t | fold(t, ε ⇒ t, X :: Y ⇒ t) | {X : σ | P} | t ∪ t | t > t | t = t
Propositions   P, Q ::= ⊤ | ⊥ | P ∧ Q | P ∨ Q | P ⊃ Q | ∀X : σ. P | ∃X : σ. P | t = u | t > u | t ∈ u
Kinds          κ ::= ★ | σ → κ
Types          A ::= 1 | A ⊗ B | A ⊸ B | !A | ptr t | cap t A | ∀X : σ :: P. A | ∃X : σ :: P. A | ∀α : κ. A | ∃α : κ. A
                   | bool t | nat t | [A] | if(t, A, B) | α | λX : σ. A | A t
Terms          e ::= x | ⟨⟩ | ⟨e, e⟩ | let ⟨x, y⟩ = e in e | λx. e | e e | !v | let !x = e in e | new(e) | get e e
                   | e := e e | fix f(x). e | share(e, v̄ᵢ) | tt | ff | if(e, e₁, e₂) | n | case(e, 0 ⇒ e₁, s x ⇒ e₂) | [e]
Eval Contexts  E ::= [·] | ⟨E, e⟩ | ⟨v, E⟩ | let ⟨x, y⟩ = E in e | E e | v E | !E | let !x = E in e | if(E, e₁, e₂)
                   | new(E) | get E e | get v E | E := e e | v := E e | v := v E | case(E, 0 ⇒ e₁, s x ⇒ e₂) | share(E, v̄)
Values         v ::= ⟨⟩ | ⟨v, v⟩ | λx. e | λX : σ. e | !v | l | fix f(x). e | x | tt | ff | n
Heaps          h ::= · | h, l : v

Contexts
  Index/Type    Σ ::= · | Σ, α : κ | Σ, X : σ
  Proposition   Π ::= · | Π, P
  Unrestricted  Γ ::= · | Γ, x : A
  Linear        Δ ::= · | Δ, x : A
3 1.2 Typing Σ A : κ α : κ Σ Σ A : σ κ Σ t : σ Σ, X : σ A : κ Σ, α : κ A : Σ α : κ Σ A t : κ Σ λx : σ. A : σ κ Σ α : κ. A : Σ, X : σ A : Σ, X : σ P : prop Q {, } Σ Q X : σ :: P. A : Σ A : Σ B : Σ A B : Σ A : Σ B : Σ A B : Σ t : Loc Σ t : Loc Σ A : Σ A : Σ t : 2 Σ t : N Σ 1 : Σ ptr t : Σ cap t A : Σ!A : Σ bool t : Σ nat t : Σ t : 2 Σ A : κ Σ B : κ Σ if(t, A, B) : κ Σ P : prop Σ P : prop Σ Q : prop {,, } Σ P Q : prop c {, } Σ c : prop Σ, X : σ P : prop Q {, } Σ Q X : σ. P : prop Σ t : σ Σ u : σ Σ t = u : prop Σ t : N Σ u : N Σ t > u : prop Σ t : σ Σ u : P(σ) Σ t u : prop Σ; Π P (rules of first-order logic) Σ Π ok Σ ok Σ Π ok Σ Π, P ok Σ P : prop Σ Γ ok Σ ok Σ Γ ok Σ A : Σ Γ, x : A ok Σ ok Σ ok Σ ok Σ A : Σ, x : A ok 3
4 Σ; Π A B : κ (Standard congruence rules omitted) Σ A : κ Σ; Π A A : κ Σ; Π A B : κ Σ; Π B A : κ Σ; Π A B : κ Σ; Π B C : κ Σ; Π A C : κ Σ; Π A B : κ Σ, α : κ C : κ Σ; Π t = σ u Σ, X : σ A : κ Σ; Π [A/α]C [B/α]C : κ Σ; Π [t/x]a [u/x]a : κ Σ (λx : σ. A) t : κ Σ; Π (λx : σ. A) t [t/x]a : κ Σ, X : σ; Π A X B X : κ Σ; Π A B : σ κ Σ if(tt, A, B) : κ Σ; Π if(tt, A, B) A : κ Σ if(ff, A, B) : κ Σ; Π if(ff, A, B) B : κ Σ t : 2 Σ, X : 2 A : κ Σ; Π if(t, [tt/x]a, [ff/x]a) [t/x]a : κ Σ, X : σ; Π P Q Σ X : σ :: P. A : Σ; Π X : σ :: P. A X : σ :: Q. A : Σ X : σ :: P. (A B) : X FV(B) Σ; Π ( X : σ :: P. A) B X : σ :: P. (A B) : Σ, X : σ; Π P Q Σ X : σ :: P. A : Σ; Π X : σ :: P. A X : σ :: Q. A : Σ X : σ :: P. (A B) : X FV(B) Σ; Π ( X : σ :: P. A) B X : σ :: P. (A B) : Σ A : Σ t : Loc Σ; Π cap t A [cap t A] : Σ A : Σ; Π [[A]] [A] : Σ, X : σ A : Σ, X : σ P : prop Σ; Π X : σ :: P. [A] [ X : σ :: P. A] : Σ A : Σ B : Σ; Π [A B] [B A] : Σ; Π [A] [A ] : Σ; Π [B] [B ] : Σ; Π [A B] [A B ] : 4
5 Σ; Π; Γ; e : A Σ Π ok Σ Γ ok Σ ok x : A Σ; Π; Γ; x : A Σ Π ok Σ Γ ok Σ ok x : A Γ Σ; Π; Γ; x : A Σ Π ok Σ Γ ok Σ ok Σ ok Σ; Π; Γ; : 1 Σ; Π; Γ; 1 e 1 : A Σ; Π; Γ; 2 e 2 : B Σ; Π; Γ; 1, 2 e 1, e 2 : A B Σ; Π; Γ;, x : A e : B Σ; Π; Γ; λx. e : A B Σ; Π; Γ; v : A Σ; Π; Γ;!v :!A Σ; Π; Γ; 1 e : A B Σ; Π; Γ; 2, x : A, y : B e : C Σ; Π; Γ; 1, 2 let x, y = e in e : C Σ; Π; Γ; 1 e : A B Σ; Π; Γ; 2 e : A Σ; Π; Γ; 1, 2 e e : B Σ; Π; Γ; 1 e :!A Σ; Π; Γ, x : A; 2 e : C Σ; Π; Γ; 1, 2 let!x = e in e : C Σ; Π; Γ; e : A Σ; Π; Γ; new(e) : l : Loc ::.!ptr l cap l A Σ; Π; Γ; e : ptr t Σ; Π; Γ; e : cap t A Σ; Π; Γ;, get e e : A cap t 1 Σ; Π; Γ; 1 e : ptr t Σ; Π; Γ; 2 e : A Σ; Π; Γ; 3 e : cap t 1 Σ; Π; Γ; 1, 2, 3 e := e e : cap t A Σ; Π; Γ, f : A B; x : A e : B Σ; Π; Γ; fix f(x). e : A B Σ; Π; Γ; tt : bool tt Σ; Π; Γ; ff : bool ff Σ; Π; Γ; e : bool t Σ; Π, t = tt; Γ; e 1 : C Σ; Π, t = ff; Γ; e 2 : C Σ; Π; Γ;, if(e, e 1, e 2 ) : C Σ; Π; Γ; n : nat n Σ; Π; Γ; 1 e : nat t Σ; Π, t = 0; Γ; 2 e 1 : C Σ, X : N; Π, t = s X; Γ; 2, x : nat X e 2 : C Σ; Π; Γ; 1, 2 case(e, 0 e 1, s x e 2 ) : C 5
6 Σ; Π; Γ; e : A... continued Σ; Π; Γ; v : A Σ; Π; Γ; : [A] Σ, X : σ; Π, P ; Γ; v : A X FV(Π), FV(Γ), FV( ) Σ; Π; Γ; v : X : σ :: P. A Σ; Π; Γ; e : X : σ :: P. A Σ t : σ Σ; Π [t/x]p Σ; Π; Γ; e : [t/x]a Σ; Π; Γ; e : [t/x]a Σ t : σ Σ; Π [t/x]p Σ; Π; Γ; e : X : σ :: P. A Σ, X : σ; Π, P ; Γ;, x : A e : C Σ; Π; Γ; v : X : σ :: P. A X FV(Π), FV(Γ), FV( ), FV( ), FV(C) Σ; Π; Γ;, [v/x]e : C Σ, α : κ; Π; Γ; v : B α FV(Γ), FV( ) Σ; Π; Γ; v : α : κ. B Σ; Π; Γ; e : α : κ. B Σ A : κ Σ; Π; Γ; e : [A/α]B Σ A : κ Σ, α : κ B : Σ; Π; Γ; e : [A/α]B Σ; Π; Γ; e : α : κ. B Σ; Π; Γ; v : α : κ. B Σ, α : κ; Π; Γ;, x : B e : C α FV( ), FV( ), FV(Γ), FV(C) Σ; Π; Γ;, [v/x]e : C Σ; Π; Γ; e : A Σ; Π A B : Σ; Π; Γ; e : B Σ; Π P Q Σ; Π, P ; Γ; e : A Σ; Π, Q; Γ; e : A Σ A : Σ; Π; Γ; e : A Σ; Π X : σ. P Σ, X : σ; Π, P ; Γ; e : A Σ A : Σ; Π; Γ; e : A Σ; Π Σ A : Σ; Π; Γ; e : A 1.3 Sharing Construct where Σ A : σ Σ; Π; Γ; e : [A t] Σ; Π monoid σ (ɛ, ( )) Σ; Π; Γ; v i : [A/α]specT i Σ; Π; Γ; share(e, v i ) : α : σ. [α t]!spect i!splitt!joint!promotet spect i = X : σ. Y : σ i :: P i. B i [α (t i X)] Z : σ i i. C i [α (t i X)] where X, α FV(P i ), FV(Q i ), FV(B i ), FV(C i ), FV(t i ), FV(t i ) splitt = X, Y : σ. [α (X Y )] [α X] [α Y ] joint = X, Y : σ. [α X] [α Y ] [α (X Y )] promotet = X : σ :: X = X X. [α X]![α X] monoid σ (ɛ, ( )) = X : σ. ɛ X = X X, Y : σ. X Y = Y X X, Y, Z : σ. (X Y ) Z = X (Y Z) 6
1.4 Operational Semantics

h; let ⟨⟩ = ⟨⟩ in e ↦ h; e
h; let ⟨x₁, x₂⟩ = ⟨v₁, v₂⟩ in e ↦ h; [v₁/x₁, v₂/x₂]e
h; (λx. e) v ↦ h; [v/x]e
h; let !x = !v in e ↦ h; [v/x]e
h; new(v) ↦ h ⊎ [l : v]; ⟨!l, ⟨⟩⟩
h ⊎ [l : v]; get l ⟨⟩ ↦ h ⊎ [l : ⟨⟩]; ⟨v, ⟨⟩⟩
h ⊎ [l : ⟨⟩]; l := v ⟨⟩ ↦ h ⊎ [l : v]; ⟨⟩
h; (fix f(x). e) v ↦ h; [fix f(x). e/f, v/x]e
h; [e] ↦ h; ⟨⟩
h; if(tt, e, e′) ↦ h; e
h; if(ff, e, e′) ↦ h; e′
h; case(0, 0 ⇒ e, s x ⇒ e′) ↦ h; e
h; case(s v, 0 ⇒ e, s x ⇒ e′) ↦ h; [v/x]e′
h; share(v, v̄ᵢ) ↦ h ⊎ [l : ff]; ⟨⟨⟩, !ōpᵢ, !split, !join, !promote⟩

where

opᵢ = λx. let ⟨flag, _⟩ = get l ⟨⟩ in
          let _ = l := tt ⟨⟩ in
          if flag then (fix f(x). f x) ⟨⟩
          else let y = vᵢ x in let _ = l := ff ⟨⟩ in y
split = λx. ⟨⟨⟩, ⟨⟩⟩
join = λx. ⟨⟩
promote = λx. !⟨⟩

h; e ↦ h′; e′
――――――――――――――――
h; E[e] ↦ h′; E[e′]

2 Semantics

2.1 Semantics of Sorts and Kinds

S : Sort → Set
S⟦N⟧ = ℕ
S⟦1⟧ = {⟨⟩}
S⟦σ₁ × σ₂⟧ = S⟦σ₁⟧ × S⟦σ₂⟧
S⟦σ₁ → σ₂⟧ = S⟦σ₁⟧ → S⟦σ₂⟧
S⟦2⟧ = {tt, ff}
S⟦σ option⟧ = {⋆} + S⟦σ⟧
S⟦seq σ⟧ = {x₁ … x_k | xᵢ ∈ S⟦σ⟧}
S⟦P(σ)⟧ = P(S⟦σ⟧)

K : Kind → Set
K⟦★⟧ = ValPred
K⟦σ → κ⟧ = S⟦σ⟧ → K⟦κ⟧

2.2 Logical Semantics

Entailment Relation

The models relation ⊨_Σ is a subset of {(ρ, P) | ρ ∈ Env(Σ) ∧ Σ ⊢ P : prop}.
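Operationally, the opᵢ wrapper produced by share is a reentrancy guard around vᵢ: it reads the boolean flag at l, diverges if the flag is already set, and otherwise sets the flag for the duration of the call. The following Python sketch models this discipline (the names `make_share_op` and `Reentered` are ours, and divergence is modeled by raising an exception rather than looping):

```python
class Reentered(Exception):
    """Models divergence: op_i loops forever if called reentrantly."""

def make_share_op(v_i):
    flag = [False]  # models the heap cell l, initialized to ff by share

    def op(x):
        was_set = flag[0]      # let <flag, _> = get l in ...
        flag[0] = True         # l := tt
        if was_set:
            raise Reentered()  # (fix f(x). f x) -- divergence
        y = v_i(x)             # run the underlying operation
        flag[0] = False        # l := ff
        return y

    return op

op = make_share_op(lambda x: x + 1)
assert op(1) == 2   # sequential calls succeed: the flag is reset each time
assert op(5) == 6
```

A reentrant call (e.g. if vᵢ itself invokes op) finds the flag set and "diverges"; this is what lets the soundness proof of Lemma 16 assume the island invariant is restored whenever opᵢ returns.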
8 ρ = Σ always ρ = Σ P Q ρ = Σ P and ρ = Σ Q ρ = Σ P Q if ρ = Σ P then ρ = Σ Q ρ = Σ never ρ = Σ P Q ρ = Σ P or ρ = Σ Q ρ = Σ X : σ. P d S σ. ρ[x d] = Σ,X:σ P ρ = Σ X : σ. P d S σ. ρ[x d] = Σ,X:σ P ρ = Σ t = u : σ I Σ t : σ ρ = I Σ u : σ ρ ρ = Σ t > u : σ I Σ t : σ ρ > I Σ u : σ ρ ρ = Σ t u : σ I Σ t : σ ρ I Σ u : σ ρ 2.3 Worlds World n Island n HIsland n = = = { } W = (k, ω) k < n, j. ω Islandj+1 k ω[0] = HIsland k ι = (M,, ɛ, I, E) E EnvPred(M) Heap,,, λh.{(w, ɛ) W World n, h }, Heap (M,, ɛ) commutative monoid, I M ResPred n { } W ResPred n = ϕ ResAtom W. (W, r) ϕ n { = (W, r) ϕ } ResAtom n = (W, r) W World n, i. a i W.ω[i].M, r = (a 0,..., a m 1 ), m = W.ω EnvPred(M) = {{E M a E, a M. a a E} ValPred = V ValAtom W } W. (W, r, v) V r.(w, r r, v) V ValAtom = {(W, r, v) n. (W, r) ResAtom n } (k + 1, ω) = (k, ω k ) (ι 1,..., ι n ) k = ( ι 1 k,..., ι n k ) (M,, ɛ, I, E) k ϕ k = (M,, ɛ, λa. I(a) k, E) = {(W, r) ϕ W.k < k} (ι 1,..., ι n ) (ι 1,..., ι n ) = n n, i n. ι i = ι i (k, ω ) j (k, ω) = k = k j, ω ω k (k, ω ) (k, ω) = j. ((k, ω ) j (k, ω)) (s, r, r F ) : W = s = s 0... s m 1, m = W.ω, i 0... (m 1). ( W, s i ) W.ω[i].I((s r r F )[i]) r F [i] W.ω[i].E r r r h; e j h ; e = r. r = r r = r[0] = D :: ( h; e h ; e ) D j 8
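The k-approximation ⌊·⌋_k used in the world definitions above simply discards everything that is visible only at step index k or above. A toy Python model of this truncation on step-indexed sets (the flat (index, value) representation is ours and elides resources and islands) illustrates its basic algebra:

```python
def approx(phi, k):
    """Model of the k-approximation: keep atoms with step index < k."""
    return {(j, v) for (j, v) in phi if j < k}

phi = {(0, 'a'), (1, 'a'), (2, 'b'), (5, 'c')}

assert approx(phi, 3) == {(0, 'a'), (1, 'a'), (2, 'b')}
assert approx(phi, 2) <= approx(phi, 3)              # monotone in k
assert approx(approx(phi, 5), 2) == approx(phi, 2)   # nested truncation takes the minimum index
```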
2.4 Semantics of Types

V⟦Σ ⊢ A : κ⟧ : Env(Σ) → K⟦κ⟧

V⟦Σ ⊢ 1 : ★⟧ρ = {(W, r, ⟨⟩)}
V⟦Σ ⊢ A₁ ⊗ A₂ : ★⟧ρ = {(W, r₁ ⊗ r₂, ⟨v₁, v₂⟩) | (W, r₁, v₁) ∈ V⟦A₁⟧ρ ∧ (W, r₂, v₂) ∈ V⟦A₂⟧ρ}
V⟦Σ ⊢ A ⊸ B : ★⟧ρ = {(W, r, v) | ∀W′ ⊒ W. ∀(W′, r′, v′) ∈ V⟦Σ ⊢ A : ★⟧ρ. (W′, r ⊗ r′, v v′) ∈ E⟦Σ ⊢ B : ★⟧ρ}
V⟦Σ ⊢ !A : ★⟧ρ = {(W, r, !v) | (W, r, v) ∈ V⟦A⟧ρ ∧ r = r ⊗ r}
V⟦Σ ⊢ ptr t : ★⟧ρ = {(W, r, l) | l = I⟦t⟧ρ}
V⟦Σ ⊢ cap t A : ★⟧ρ = {(W, r ⊗ [l : v], ⟨⟩) | l = I⟦t⟧ρ ∧ (W, r, v) ∈ V⟦Σ ⊢ A : ★⟧ρ}
V⟦Σ ⊢ ∀X : σ :: P. A : ★⟧ρ = {(W, r, v) | ∀d ∈ S⟦σ⟧. ρ[X ↦ d] ⊨ P ⟹ (W, r, v) ∈ V⟦Σ, X : σ ⊢ A : ★⟧ρ[X ↦ d]}
V⟦Σ ⊢ ∃X : σ :: P. A : ★⟧ρ = {(W, r, v) | ∃d ∈ S⟦σ⟧. ρ[X ↦ d] ⊨ P ∧ (W, r, v) ∈ V⟦Σ, X : σ ⊢ A : ★⟧ρ[X ↦ d]}
V⟦Σ ⊢ ∀α : κ. A : ★⟧ρ = {(W, r, v) | ∀V ∈ K⟦κ⟧. (W, r, v) ∈ V⟦Σ, α : κ ⊢ A : ★⟧ρ[α ↦ V]}
V⟦Σ ⊢ ∃α : κ. A : ★⟧ρ = {(W, r, v) | ∃V ∈ K⟦κ⟧. (W, r, v) ∈ V⟦Σ, α : κ ⊢ A : ★⟧ρ[α ↦ V]}
V⟦Σ ⊢ α : κ⟧ρ = ρ(α)
V⟦Σ ⊢ bool t : ★⟧ρ = {(W, r, d) | d = I⟦t⟧ρ}
V⟦Σ ⊢ nat t : ★⟧ρ = {(W, r, d) | d = I⟦t⟧ρ}
V⟦Σ ⊢ [A] : ★⟧ρ = {(W, r, ⟨⟩) | ∃v. (W, r, v) ∈ V⟦Σ ⊢ A : ★⟧ρ}
V⟦Σ ⊢ if(t, A, B) : κ⟧ρ = V⟦Σ ⊢ A : κ⟧ρ if I⟦t⟧ρ = tt
                         = V⟦Σ ⊢ B : κ⟧ρ if I⟦t⟧ρ = ff
V⟦Σ ⊢ λX : σ. A : σ → κ⟧ρ = λd ∈ S⟦σ⟧. V⟦Σ, X : σ ⊢ A : κ⟧ρ[X ↦ d]
V⟦Σ ⊢ A t : κ⟧ρ = (V⟦Σ ⊢ A : σ → κ⟧ρ)(I⟦t⟧ρ)

E⟦A⟧ρ = {(W, r, e) | ∀j < W.k. ∀(s, r, r_F) : W. if s ⊗ r ⊗ r_F; e ↦^j h; e′
         then ∃W′ ⊒_j W. ∃(s′, r′, r′_F) : W′. h = s′ ⊗ r′ ⊗ r′_F ∧ r′_F ⊒ r_F ∧ (W′, r′, e′) ∈ V⟦A⟧ρ}

2.5 Semantics of Contexts

Env(·) = {∅}
Env(Σ, α : κ) = {ρ, α ↦ V | ρ ∈ Env(Σ), V ∈ K⟦κ⟧}
Env(Σ, X : σ) = {ρ, X ↦ d | ρ ∈ Env(Σ), d ∈ S⟦σ⟧}

Substitutions for the unrestricted context (defined by induction over a derivation of Σ ⊢ Γ ok):

U⟦Σ ⊢ · ok⟧ρW = {∅}
U⟦Σ ⊢ Γ, x : A ok⟧ρW = {γ, x ↦ (r, v) | γ ∈ U⟦Σ ⊢ Γ ok⟧ρW ∧ (W, r, v) ∈ V⟦Σ ⊢ A : ★⟧ρ ∧ r = r ⊗ r}

Substitutions for the linear context (defined by induction over a derivation of Σ ⊢ Δ ok):

L⟦Σ ⊢ · ok⟧ρW = {∅}
L⟦Σ ⊢ Δ, x : A ok⟧ρW = {δ, x ↦ (r, v) | δ ∈ L⟦Σ ⊢ Δ ok⟧ρW ∧ (W, r, v) ∈ V⟦Σ ⊢ A : ★⟧ρ}

Functions π(γ) and π(δ): For γ ∈ U⟦Σ ⊢ Γ ok⟧ρW, define π(γ) to be the monoidal product of the first components of all elements in the range of γ. Similarly, define π(δ) for δ ∈ L⟦Σ ⊢ Δ ok⟧ρW.
2.6 Semantics of Typing Judgments

Suppose Σ ⊢ Γ ok and Σ ⊢ Δ ok. We say that Σ; Π; Γ; Δ ⊨ e : A if, for every W, ρ ∈ Env(Σ) with ρ ⊨_Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρW and δ ∈ L⟦Σ ⊢ Δ ok⟧ρW, it is the case that (W, π(γ) ⊗ π(δ), δ(γ(e))) ∈ E⟦A⟧ρ.

Similarly, we say that Σ; Π; Γ; Δ ⊨ᵥ v : A if, for every W, ρ ∈ Env(Σ) with ρ ⊨_Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρW and δ ∈ L⟦Σ ⊢ Δ ok⟧ρW, it is the case that (W, π(γ) ⊗ π(δ), δ(γ(v))) ∈ V⟦Σ ⊢ A : ★⟧ρ.

3 Differences from the Paper

The definition of the model presented in the previous section differs slightly from, but is more general than, the definition of the model presented in the paper that this Appendix accompanies.

First, in this Appendix, each island contains a set E (the environment invariant), which constrains the possible frame resources that are allowed on the island in the definition of E⟦A⟧ρ, but no such set exists in the definition of islands in the paper. This additional set is included here because we expect it to be useful in encoding more advanced forms of state transition systems in our model, which we want to explore in future work. Because the paper omits this environment invariant E, it also conflates r′ and r_F in the satisfaction relation at worlds: instead of defining a quaternary relation (s, r, r_F) : W as in this Appendix, the paper defines a ternary relation (s, r′) : W and instantiates r′ with r ⊗ r_F in the definition of E⟦A⟧ρ.

A second point of difference is that the definition of E⟦A⟧ρ provided in this Appendix existentially quantifies over a new environment resource r′_F, which extends the initial environment resource r_F, whereas in the definition in the paper, r′_F = r_F.

A third difference between the two is notational: this Appendix uses the notation (W, r, v) ∈ V⟦A⟧ρ, whereas the paper writes the same definition as (r, v) ∈ V⟦A⟧ W ρ.

Modulo the notational difference, it is easy to show that the definitions of V⟦A⟧ρ and E⟦A⟧ρ in the paper coincide with the respective definitions in this Appendix if we force the environment invariant of each island to be the entire monoid of the island.
This justifies why the model in this Appendix is more general than the model in the paper. Furthermore, the only construction of E in the soundness proof of the next section sets E equal to the entire monoid, so the soundness proof here also implies soundness w.r.t. the slightly different model in the paper.

4 Fundamental Theorem and Soundness

Lemma 1 (Well-formedness). If Σ; Π; Γ; Δ ⊢ e : A, then Σ ⊢ Γ ok and Σ ⊢ Δ ok.

Proof. By simultaneous induction on typing and kinding derivations.

Futures. Let a ∈ K⟦κ⟧. We define futures(a) ∈ K⟦κ⟧ by induction on κ as follows:

futures(a ∈ K⟦★⟧) = {(W′, r, v) | (W, r, v) ∈ a ∧ W ⊑ W′}
futures(F ∈ K⟦σ → κ⟧) = λx ∈ S⟦σ⟧. futures(F x)

Lemma 2 (World monotonicity). The following hold:
1. a = V⟦Σ ⊢ A : κ⟧ρ implies futures(a) = a.
2. W ⊑ W′ implies U⟦Σ ⊢ Γ ok⟧ρW ⊆ U⟦Σ ⊢ Γ ok⟧ρW′.
3. W ⊑ W′ implies L⟦Σ ⊢ Δ ok⟧ρW ⊆ L⟦Σ ⊢ Δ ok⟧ρW′.

Proof. (1) follows by induction on the given derivation of Σ ⊢ A : κ. The interesting case, that of type variables α, follows from the restriction that every element of K⟦★⟧ is future-closed. (2) and (3) are trivial consequences of (1) at κ = ★, using the definitions of U⟦Σ ⊢ Γ ok⟧ρW and L⟦Σ ⊢ Δ ok⟧ρW.

Lemma 3 (Resource shifting). If (s, r ⊗ r′, r_F) : W, then (s, r, r′ ⊗ r_F) : W.

Proof. Let m = |W.ω|. To show (s, r, r′ ⊗ r_F) : W, we need to show two things:

(G1) s = s₀ ⊗ … ⊗ s_{m−1} and, for every i ∈ {0, …, m−1}, (▷W, sᵢ) ∈ W.ω[i].I((s ⊗ r ⊗ r′ ⊗ r_F)[i]), and
(G2) for every i ∈ {0, …, m−1}, (r′ ⊗ r_F)[i] ∈ W.ω[i].E.

(G1) is exactly the corresponding statement of invariant satisfaction for the given relation (s, r ⊗ r′, r_F) : W. (G2) follows because the given relation (s, r ⊗ r′, r_F) : W implies that r_F[i] ∈ W.ω[i].E and E is closed under extensions.
Resource Futures. Let a ∈ K⟦κ⟧. We define rfutures(a) ∈ K⟦κ⟧ by induction on κ as follows:

rfutures(a ∈ K⟦★⟧) = {(W, r′, v) | (W, r, v) ∈ a ∧ r ⊑ r′}
rfutures(F ∈ K⟦σ → κ⟧) = λx ∈ S⟦σ⟧. rfutures(F x)

Lemma 4 (Resource monotonicity). The following hold:
1. If (W, r, e) ∈ E⟦A⟧ρ and r₁ ⊒ r, then (W, r₁, e) ∈ E⟦A⟧ρ.
2. If a = V⟦Σ ⊢ A : κ⟧ρ, then rfutures(a) = a.

Proof. (1) is proved as follows. Because r₁ ⊒ r, there is some r₂ such that r₁ = r ⊗ r₂. Following the definition of (W, r₁, e) ∈ E⟦A⟧ρ, assume that j < W.k, (s, r ⊗ r₂, r_F) : W and s ⊗ r ⊗ r₂ ⊗ r_F; e ↦^j h; e′. Now observe that (s, r ⊗ r₂, r_F) : W implies (s, r, r_F ⊗ r₂) : W by Lemma 3, so instantiating the definition of (W, r, e) ∈ E⟦A⟧ρ with r = r, r_F = r_F ⊗ r₂, s = s, we get W′ ⊒_j W, (s′, r′, r′_F) : W′, h = s′ ⊗ r′ ⊗ r′_F, r′_F ⊒ r_F ⊗ r₂ and (W′, r′, e′) ∈ V⟦A⟧ρ. It remains only to show that r′_F ⊒ r_F, but this follows trivially from r′_F ⊒ r_F ⊗ r₂.

(2) follows by induction on kinding judgments. For the case of A ⊸ B, we use (1).

Lemma 5 (Value inclusion). If (W, r, v) ∈ V⟦Σ ⊢ A : ★⟧ρ, then (W, r, v) ∈ E⟦A⟧ρ.

Proof. Since s ⊗ r ⊗ r_F; v ↦^0 s ⊗ r ⊗ r_F; v, we can choose W′ = W, s′ = s, r′ = r and r′_F = r_F in the definition of E⟦A⟧ρ to obtain the result.

Lemma 6 (Evaluation cut). If Σ; Π; Γ; Δ₁ ⊨ e : A and Σ; Π; Γ; Δ₂, x : A ⊨ E[x] : B, then Σ; Π; Γ; Δ₁, Δ₂ ⊨ E[e] : B.

Proof. We want to show that Σ; Π; Γ; Δ₁, Δ₂ ⊨ E[e] : B. Following the definition of ⊨, assume a W, ρ ∈ Env(Σ) with ρ ⊨_Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρW and δ ∈ L⟦Σ ⊢ Δ₁, Δ₂ ok⟧ρW. From the last fact, we know that there are δ₁, δ₂ with δ = δ₁, δ₂ such that δ₁ ∈ L⟦Σ ⊢ Δ₁ ok⟧ρW and δ₂ ∈ L⟦Σ ⊢ Δ₂ ok⟧ρW. We want to show that (W, π(γ) ⊗ π(δ₁) ⊗ π(δ₂), δ₂(δ₁(γ(E[e])))) ∈ E⟦B⟧ρ, or, equivalently, (W, π(γ) ⊗ π(δ₁) ⊗ π(δ₂), δ₂(γ(E))[δ₁(γ(e))]) ∈ E⟦B⟧ρ.

For ease of notation, define
1. r₀ = π(γ), r₁ = π(δ₁) and r₂ = π(δ₂).
Then we want to show that (W, r₀ ⊗ r₁ ⊗ r₂, δ₂(γ(E))[δ₁(γ(e))]) ∈ E⟦B⟧ρ. Note that because γ ∈ U⟦Σ ⊢ Γ ok⟧ρW,
2. r₀ ⊗ r₀ = r₀.
Expanding the definition of E⟦B⟧ρ, pick j < W.k, s and r_F such that
3. (s, r₀ ⊗ r₁ ⊗ r₂, r_F) : W
4.
s ⊗ r₀ ⊗ r₁ ⊗ r₂ ⊗ r_F; δ₂(γ(E))[δ₁(γ(e))] ↦^j h′; e′

It follows immediately that there are ê, ĥ, j₁, j₂ such that
5. j₁ + j₂ = j
6. s ⊗ r₀ ⊗ r₁ ⊗ r₂ ⊗ r_F; δ₁(γ(e)) ↦^{j₁} ĥ; ê
7. ĥ; δ₂(γ(E))[ê] ↦^{j₂} h′; e′

From the assumption Σ; Π; Γ; Δ₁ ⊨ e : A, we know that (W, r₀ ⊗ r₁, δ₁(γ(e))) ∈ E⟦A⟧ρ. Instantiating the definition of E⟦A⟧ρ with W = W, r = r₀ ⊗ r₁, j = j₁, s = s, r_F = r_F ⊗ r₂ ⊗ r₀ and fact 6, we obtain Ŵ, ŝ, r̂, r̂_F such that:
8. Ŵ ⊒_{j₁} W
9. (ŝ, r̂, r̂_F) : Ŵ
10. ĥ = ŝ ⊗ r̂ ⊗ r̂_F
11. r̂_F ⊒ r_F ⊗ r₂ ⊗ r₀
12. (Ŵ, r̂, ê) ∈ V⟦Σ ⊢ A : ★⟧ρ (this forces ê to be a value, say v̂)

From fact 11, there must be an r̂_F′ such that
13. r̂_F = r_F ⊗ r₂ ⊗ r₀ ⊗ r̂_F′

From the assumption γ ∈ U⟦Σ ⊢ Γ ok⟧ρW, fact 8 and Lemma 2(2), we get that
14. γ ∈ U⟦Σ ⊢ Γ ok⟧ρŴ
Similarly, from the assumption δ₂ ∈ L⟦Σ ⊢ Δ₂ ok⟧ρW, fact 8 and Lemma 2(3), we get that
15. δ₂ ∈ L⟦Σ ⊢ Δ₂ ok⟧ρŴ
This, together with fact 12, implies that:
16. (δ₂, x ↦ (r̂, v̂)) ∈ L⟦Σ ⊢ Δ₂, x : A ok⟧ρŴ

Using facts 14 and 16 with the definition of Σ; Π; Γ; Δ₂, x : A ⊨ E[x] : B, we derive that (Ŵ, π(γ) ⊗ π(δ₂) ⊗ r̂, δ₂(γ(E))[v̂]) ∈ E⟦B⟧ρ. Equivalently, (Ŵ, r₀ ⊗ r₂ ⊗ r̂, δ₂(γ(E))[v̂]) ∈ E⟦B⟧ρ.

We now wish to instantiate the definition of E⟦B⟧ρ with W = Ŵ, j = j₂, r = r₀ ⊗ r₂ ⊗ r̂, r_F = r_F ⊗ r̂_F′ ⊗ r₀, s = ŝ and fact 7. To do that, we must check the following:
(G1) j₂ < Ŵ.k
(G2) ĥ = ŝ ⊗ r₀ ⊗ r₂ ⊗ r̂ ⊗ r_F ⊗ r̂_F′ ⊗ r₀
(G3) (ŝ, r₀ ⊗ r₂ ⊗ r̂, r_F ⊗ r̂_F′ ⊗ r₀) : Ŵ

We prove all three facts below.

Proof of (G1): From fact 8, we know that W.k = Ŵ.k + j₁. Since j₁ + j₂ = j, we get Ŵ.k − j₂ = W.k − j. Since j < W.k by assumption, Ŵ.k − j₂ > 0, as required.

Proof of (G2): From facts 10 and 13, we get ĥ = ŝ ⊗ r̂ ⊗ r_F ⊗ r₂ ⊗ r₀ ⊗ r̂_F′. The result follows immediately from fact 2 (r₀ ⊗ r₀ = r₀).

Proof of (G3): Following the definition of (ŝ, r₀ ⊗ r₂ ⊗ r̂, r_F ⊗ r̂_F′ ⊗ r₀) : Ŵ, we need to prove two propositions: (a) (▷Ŵ, ŝ) ∈ inv(Ŵ, r₀ ⊗ r₂ ⊗ r̂ ⊗ r_F ⊗ r̂_F′ ⊗ r₀), and (b) r_F ⊗ r̂_F′ ⊗ r₀ ∈ env(Ŵ). We can rewrite (a) as (▷Ŵ, ŝ) ∈ inv(Ŵ, r̂ ⊗ r̂_F), which follows immediately from fact 9. To prove (b), we must show that for every j < |Ŵ.ω|, (r₀ ⊗ r_F ⊗ r̂_F′)[j] ∈ Ŵ.ω[j].E. We now consider two cases: j < |W.ω| and j ≥ |W.ω|. For j < |W.ω|, we know from fact 3 that r_F[j] ∈ W.ω[j].E. Because W.ω[j].E is closed under extensions, (r₀ ⊗ r_F ⊗ r̂_F′)[j] ∈ W.ω[j].E. Finally, because Ŵ ⊒ W, W.ω[j].E = Ŵ.ω[j].E, so we derive the required (r₀ ⊗ r_F ⊗ r̂_F′)[j] ∈ Ŵ.ω[j].E. For j ≥ |W.ω|, (r₀ ⊗ r_F ⊗ r̂_F′)[j] = r̂_F′[j] (because r₀, r_F, r₂ only contribute units at indices beyond |W.ω|), so we only need to show that r̂_F′[j] ∈ Ŵ.ω[j].E. This follows from fact 9.
Having proved (G1)–(G3), we instantiate the definition of E⟦B⟧ρ with W = Ŵ, j = j₂, r = r₀ ⊗ r₂ ⊗ r̂, r_F = r_F ⊗ r̂_F′ ⊗ r₀, s = ŝ and fact 7. This yields W′, s′, r′, r′_F such that:
17. W′ ⊒_{j₂} Ŵ
18. (s′, r′, r′_F) : W′
19. h′ = s′ ⊗ r′ ⊗ r′_F
20. r′_F ⊒ r_F ⊗ r̂_F′ ⊗ r₀
21. (W′, r′, e′) ∈ V⟦Σ ⊢ B : ★⟧ρ (this forces e′ to be a value, say v′)

It remains only to check that (a) W′ ⊒_j W and (b) r′_F ⊒ r_F. (a) follows from facts 8, 17 and j = j₁ + j₂. (b) follows from fact 20.

Lemma 7 (World stepping). For every world W and every n, ▷ⁿW ⊒_n W.

Proof. We only need to show that ▷W ⊒₁ W; the rest follows by induction on n. To prove ▷W ⊒₁ W, suppose that W = (k + 1, ω). Then ▷W = (k, ⌊ω⌋_k). It suffices to show that (k, ⌊ω⌋_k) ⊒₁ (k + 1, ω), which, by the definition of ⊒ at worlds, reduces to proving that ⌊ω⌋_k ⊒ ⌊ω⌋_k, which holds trivially.
Lemma 8. If (W, s) ∈ ω[i].I(r) and W.k < j, then (W, s) ∈ ⌊ω⌋_j[i].I(r).

Proof. Immediate from the definition of ⌊ω⌋_j.

Lemma 9 (World step satisfaction). If (s, r, r_F) : W, then (s, r, r_F) : ▷W.

Proof. Let W = (k + 1, ω). Then ▷W = (k, ⌊ω⌋_k). To prove (s, r, r_F) : ▷W, we need to show two facts for every i ∈ {0, …, |W.ω| − 1}:
1. r_F[i] ∈ ⌊ω⌋_k[i].E
2. (▷▷W, s[i]) ∈ ⌊ω⌋_k[i].I(s[i] ⊗ r[i] ⊗ r_F[i])

(1) follows immediately from the following two facts:
- r_F[i] ∈ ω[i].E (because (s, r, r_F) : W)
- ⌊ω⌋_k[i].E = ω[i].E (by the definition of ⌊·⌋_k)

To prove (2), we note that because (s, r, r_F) : W, we have (▷W, s[i]) ∈ ω[i].I(s[i] ⊗ r[i] ⊗ r_F[i]). By Lemma 7, ▷▷W ⊒ ▷W, and because I(s[i] ⊗ r[i] ⊗ r_F[i]) is closed under world extension, we obtain (▷▷W, s[i]) ∈ ω[i].I(s[i] ⊗ r[i] ⊗ r_F[i]). Finally, (▷▷W).k = k − 1 < k, so by Lemma 8, we have (▷▷W, s[i]) ∈ ⌊ω⌋_k[i].I(s[i] ⊗ r[i] ⊗ r_F[i]), as needed.

Lemma 10 (Administrative closure). If, for every h, h; e ↦ⁿ h; ê, and (W, r, ê) ∈ E⟦A⟧ρ, then (W, r, e) ∈ E⟦A⟧ρ.

Proof. We want to show that (W, r, e) ∈ E⟦A⟧ρ. Following the definition of E⟦A⟧ρ, pick j, s, r_F, h, e′ such that
1. j < W.k
2. (s, r, r_F) : W
3. s ⊗ r ⊗ r_F; e ↦^j h; e′

Because evaluation is deterministic, we must have:
4. s ⊗ r ⊗ r_F; e ↦ⁿ s ⊗ r ⊗ r_F; ê ↦ᵐ h; e′
5. j = m + n

Let Ŵ = ▷ⁿW. By Lemma 7, Ŵ ⊒_n W, so by facts 1 and 5 we get:
6. m < Ŵ.k
By applying Lemma 9 to fact 2,
7. (s, r, r_F) : Ŵ

Instantiating the definition of the given fact (W, r, ê) ∈ E⟦A⟧ρ with j = m, s = s, r = r, r_F = r_F, h = h, e′ = e′, using facts 6, 7 and 4, we get W′, s′, r′, r′_F such that:
8. W′ ⊒_m Ŵ
9. (s′, r′, r′_F) : W′
10. h = s′ ⊗ r′ ⊗ r′_F
11. r′_F ⊒ r_F
12. (W′, r′, e′) ∈ V⟦A⟧ρ

It remains only to check that W′ ⊒_j W. This follows trivially from three previously derived facts: j = m + n, W′ ⊒_m Ŵ and Ŵ ⊒_n W.

Lemma 11 (Index context weakening). Let ρ ∈ Env(Σ), ρ′ ∈ Env(Σ′) and dom(Σ) ∩ dom(Σ′) = ∅. Then the following hold whenever the left-hand sides are defined:
1. ρ ⊨_Σ P implies ρ, ρ′ ⊨_{Σ,Σ′} P.
2. V⟦Σ ⊢ A : κ⟧ρ = V⟦Σ, Σ′ ⊢ A : κ⟧(ρ, ρ′).
3. E⟦A⟧ρ = E⟦A⟧(ρ, ρ′).
4. U⟦Σ ⊢ Γ ok⟧ρW = U⟦Σ, Σ′ ⊢ Γ ok⟧(ρ, ρ′)W.
5. L⟦Σ ⊢ Δ ok⟧ρW = L⟦Σ, Σ′ ⊢ Δ ok⟧(ρ, ρ′)W.

Proof. (1)–(5) follow by induction on the definitions of the various relations. The proof requires the assumption that I⟦Σ ⊢ t : σ⟧ρ = I⟦Σ, Σ′ ⊢ t : σ⟧(ρ, ρ′) whenever the left-hand side is defined.

Lemma 12 (Index term substitution). Let dom(Σ) ∩ dom(Σ′) = ∅, ρ ∈ Env(Σ), Σ ⊢ t : σ and d = I⟦Σ ⊢ t : σ⟧ρ. Then the following hold whenever the left-hand sides are defined:
1. ρ, X ↦ d ⊨_{Σ,X:σ} P iff ρ ⊨_Σ [t/X]P.
2. V⟦Σ, X : σ ⊢ A : κ⟧(ρ, X ↦ d) = V⟦Σ ⊢ [t/X]A : κ⟧ρ.
3. E⟦A⟧(ρ, X ↦ d) = E⟦[t/X]A⟧ρ.
4. U⟦Σ, X : σ ⊢ Γ ok⟧(ρ, X ↦ d)W = U⟦Σ ⊢ [t/X]Γ ok⟧ρW.
5. L⟦Σ, X : σ ⊢ Δ ok⟧(ρ, X ↦ d)W = L⟦Σ ⊢ [t/X]Δ ok⟧ρW.

Proof. By induction on the definitions of the various relations. The proof requires the assumption that I⟦Σ, X : σ ⊢ t′ : σ′⟧(ρ, X ↦ d) = I⟦Σ ⊢ [t/X]t′ : σ′⟧ρ.

Lemma 13 (Type substitution). Let dom(Σ) ∩ dom(Σ′) = ∅, ρ ∈ Env(Σ), Σ ⊢ A : κ and a = V⟦Σ ⊢ A : κ⟧ρ. Then the following hold whenever the left-hand sides are defined:
1. ρ, α ↦ a ⊨_{Σ,α:κ} P iff ρ ⊨_Σ P.
2. V⟦Σ, α : κ ⊢ B : κ′⟧(ρ, α ↦ a) = V⟦Σ ⊢ [A/α]B : κ′⟧ρ.
3. E⟦B⟧(ρ, α ↦ a) = E⟦[A/α]B⟧ρ.
4. U⟦Σ, α : κ ⊢ Γ ok⟧(ρ, α ↦ a)W = U⟦Σ ⊢ [A/α]Γ ok⟧ρW.
5. L⟦Σ, α : κ ⊢ Δ ok⟧(ρ, α ↦ a)W = L⟦Σ ⊢ [A/α]Δ ok⟧ρW.

Proof. By induction on the definitions of the various relations.

Lemma 14 (Propositional soundness). If Σ; Π ⊢ P, then for every ρ ∈ Env(Σ), ρ ⊨_Σ Π implies ρ ⊨_Σ P.

Proof. By soundness of the proof system for first-order logic (i.e., by induction on the derivation of Σ; Π ⊢ P).

Lemma 15. If W′ ⊒_j W, then ▷W′ ⊒_j ▷W.

Proof. Suppose W′ = (k′ + 1, ω′) and W = (k + 1, ω), where ω = ι₀, …, ιₙ and ω′ = ι′₀, …, ι′_{n′}. Then ▷W′ = (k′, ⌊ω′⌋_{k′}) and ▷W = (k, ⌊ω⌋_k). To show that ▷W′ ⊒_j ▷W, we must prove the following three statements:
(G1) k′ + j = k: Because W′ ⊒_j W, k′ + 1 = k + 1 − j, so k′ + j = k.
(G2) n′ ≥ n: This follows immediately from W′ ⊒_j W.
(G3) For i ≤ n, ⌊ι′ᵢ⌋_{k′} = ⌊⌊ιᵢ⌋_k⌋_{k′}: Note that because k′ ≤ k, ⌊⌊ιᵢ⌋_k⌋_{k′} = ⌊ιᵢ⌋_{k′}. So it suffices to prove that ⌊ι′ᵢ⌋_{k′} = ⌊ιᵢ⌋_{k′}. From W′ ⊒_j W we know that ι′ᵢ = ⌊ιᵢ⌋_{k′+1}.
Therefore, ⌊ι′ᵢ⌋_{k′} = ⌊⌊ιᵢ⌋_{k′+1}⌋_{k′} = ⌊ιᵢ⌋_{k′}, as required.
Lemma 16 (Soundness of sharing). If Σ ⊢ A : σ → ★ and Σ; Π ⊢ monoid_σ(ε, (·)), then the following semantic inference rule is sound:

Σ; Π; Γ; Δ ⊨ e : [A t]    Σ; Π; Γ; · ⊨ vˢᵢ : [A/α]specTᵢ
――――――――――――――――――――――――――――――――――――――――――――――
Σ; Π; Γ; Δ ⊨ share(e, v̄ˢᵢ) : ∃α : σ → ★. [α t] ⊗ (⊗ᵢ !specTᵢ) ⊗ !splitT ⊗ !joinT ⊗ !promoteT

where

specTᵢ ≜ ∀X : σ. ∀Y : σᵢ :: Pᵢ. Bᵢ ⊗ [α (tᵢ · X)] ⊸ ∃Z : σ′ᵢ :: Qᵢ. Cᵢ ⊗ [α (t′ᵢ · X)]
         (where X, α ∉ FV(Pᵢ), FV(Qᵢ), FV(Bᵢ), FV(Cᵢ), FV(tᵢ), FV(t′ᵢ))
splitT ≜ ∀X, Y : σ. [α (X · Y)] ⊸ [α X] ⊗ [α Y]
joinT ≜ ∀X, Y : σ. [α X] ⊗ [α Y] ⊸ [α (X · Y)]
promoteT ≜ ∀X : σ :: X = X · X. [α X] ⊸ ![α X]
monoid_σ(ε, (·)) ≜ (∀X : σ. ε · X = X) ∧ (∀X, Y : σ. X · Y = Y · X) ∧ (∀X, Y, Z : σ. (X · Y) · Z = X · (Y · Z))

Proof. Let B = ∃α : σ → ★. [α t] ⊗ (⊗ᵢ !specTᵢ) ⊗ !splitT ⊗ !joinT ⊗ !promoteT. We want to prove that Σ; Π; Γ; Δ ⊨ share(e, v̄ˢᵢ) : B. By applying Lemma 6 using the premise and the evaluation context E = share([·], v̄ˢᵢ), we reduce this obligation to that of proving Σ; Π; Γ; x : [A t] ⊨ share(x, v̄ˢᵢ) : B.

Assume a world W and ρ ∈ Env(Σ) such that ρ ⊨_Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρW, and a (W, r, ⟨⟩) ∈ V⟦Σ ⊢ [A t] : ★⟧ρ. It suffices to prove that (W, π(γ) ⊗ r, share(⟨⟩, γ(v̄ˢᵢ))) ∈ E⟦B⟧ρ. Let r_s = π(γ). We need to prove that:

(W, r_s ⊗ r, share(⟨⟩, γ(v̄ˢᵢ))) ∈ E⟦B⟧ρ

Following the definition of E, pick r_F, s, j < W.k, h, e′ such that:
1. j < W.k
2. (s, r_s ⊗ r, r_F) : W
3. s ⊗ r_s ⊗ r ⊗ r_F; share(⟨⟩, γ(v̄ˢᵢ)) ↦^j h; e′

Call this point (A) in the proof; we will return to it later to complete the proof. From the operational semantics of the construct share(·, ·), we immediately derive that j = 1 and
4. e′ = v₀ = ⟨⟨⟩, !ōpᵢ, !split, !join, !promote⟩
5. h = s ⊗ r_s ⊗ r ⊗ r_F ⊎ [l : ff]

Assume that |W.ω| = |(▷W).ω| = n. Because Σ; Π ⊢ monoid_σ(ε, (·)), there is a monoid on S⟦σ⟧ with operation + = I⟦(·)⟧ρ and unit element I⟦ε⟧ρ, which we also denote by ε. We now define a new world W_s that extends W with one new, (n+1)-th island, as follows.
The new island is (M, ·, ε̄, I, E), where M = {U(x), L(x, y) | x, y ∈ S⟦σ⟧} ∪ {⊥} and the operation · is defined as follows:

U(x) · U(y) = U(x + y)
L(x, y) · U(z) = L(x, y + z)
L(_, _) · L(_, _) = ⊥
⊥ · _ = _ · ⊥ = ⊥

It is easily checked that the element ε̄ = U(ε) is a unit for the monoid. We further define:

I(U(x)) = {(W, r ⊎ [l : ff]) | ∃v. (W, r, v) ∈ (V⟦Σ ⊢ A : σ → ★⟧ρ) x}
I(L(x, y)) = {(W, r ⊎ [l : tt]) | x = y}
I(⊥) = ∅
E = M

We denote by a the element (ε̄, …, ε̄, a) of W_s. Let d = I⟦t⟧ρ. Returning to point (A) of our proof, we choose W′ = W_s, s′ = s ⊗ [l : ff] ⊗ r, r′_F = r_F, r′ = r_s ⊗ U(d) and show the following to complete the proof.
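The island monoid M just introduced can be checked concretely. In the Python sketch below (representation and names are ours; the underlying index monoid (S⟦σ⟧, +, ε) is instantiated to (ℕ, +, 0) purely for illustration), U(x) is the unlocked state, L(x, y) the locked state, and None stands for the undefined element ⊥:

```python
BOT = None  # the undefined element ⊥

def mul(a, b):
    """The island monoid operation on U/L/⊥ elements."""
    if a is BOT or b is BOT:
        return BOT                        # ⊥ · _ = _ · ⊥ = ⊥
    if a[0] == 'U' and b[0] == 'U':
        return ('U', a[1] + b[1])         # U(x) · U(y) = U(x + y)
    if a[0] == 'L' and b[0] == 'U':
        return ('L', a[1], a[2] + b[1])   # L(x, y) · U(z) = L(x, y + z)
    if a[0] == 'U' and b[0] == 'L':
        return mul(b, a)                  # close under commutativity
    return BOT                            # L(_, _) · L(_, _) = ⊥

UNIT = ('U', 0)  # U(ε), with ε = 0 here

assert mul(('U', 1), ('U', 2)) == ('U', 3)
assert mul(UNIT, ('L', 1, 2)) == ('L', 1, 2)      # U(ε) is a unit
assert mul(('L', 1, 2), ('L', 3, 4)) is BOT       # two locked elements clash
a, b, c = ('U', 1), ('L', 2, 3), ('U', 4)
assert mul(mul(a, b), c) == mul(a, mul(b, c))     # associativity spot-check
```

The check that U(ε) is a unit mirrors the "easily checked" claim above; the L·L = ⊥ case is what prevents two simultaneous lock claims from composing.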
16 (G1) W j W. This is immediate because j = 1 and W = W s 0 ( W ) 1 W. (G2) ((s [l : ff] r, r s U(d), r F ) : W s. This is proved below. (G3) h = s [l : ff] r r F r s U(d). This is immediate from fact (5). (G4) r F = r F. This is trivial because r F = r F. (G5) (W s, r s U(d), v 0 ) V Σ B : ρ. This is proved below. Proof of (G2). By Lemma 9 applied to fact (2), we obtain that: 6. (s, r s r, r F ) : W To prove (G2), we must prove the following two facts: (G2.1) i < n + 1. r F [i] W s.ω[i].e Proof: For i < n, W s.ω[i] = ( W ).ω[i], so we only need to show that i < n + 1. r F [i] ( W ).ω[i].e, which follows from fact (6). For i = n, the statement is trivial because, by construction of W s, W s.ω[n].e = M. (G2.2) s [l : ff] r = s 0... s n, where ( W s, s i ) W s.ω[i].i(s[i] [l : ff][i] (r r s )[i] U(d) [i] r F [i]) Proof: From fact (6), we know that: 7. s = s 0... s n where ( W, s i ) ( W ).ω[i].i(s[i] (r rs )[i] r F [i]) We choose: s 0 = s 0,..., s n 1 = s n 1, s n = [l : ff] r and split 3 cases: Case i = 0: We have to show that ( W s, s 0 ) W s.ω[0].i(s[0] [l : ff] (r r s )[0] r F [0]). Because W s.ω[0] = ( W ).ω[0] by construction, it suffices to prove that ( W s, s 0 ) ( W ).ω[0].i(s[0] [l : ff] (r r s )[0] r F [0]). Since invariant of island 0 is independent of the argument, it also suffices to prove that ( W s, s 0 ) ( W ).ω[0].i(s[0] (r r s )[0] r F [0]). By construction, W s W. Therefore, by Lemma 15, W s W. Hence, because I at each island is closed under world extensions, it suffices to prove that ( W, s 0 ) ( W ).ω[0].i(s[0] (r r s )[0] r F [0]). This follows from fact (7). Case 0 < i < n: We have to show that ( W s, s i ) W s.ω[i].i(s[i] (r r s )[i] r F [i]). As in the previous case, it suffices to prove that ( W, s i ) ( W ).ω[i].i(s[i] (r r s )[i] r F [i]), which follows from fact (7). Case i = n: We must show that ( W s, [l : ff] r) W s.ω[n].i(u(d)). By construction, W s.ω[n].i(u(d)) = {(W, r [l : ff]) v. (W, r, v) (V Σ A : σ ρ) d}. 
Therefore, it suffices to prove that ∃v. (W, r, v) ∈ (V⟦Σ ⊢ A : σ → ★⟧ρ) d. Since d = I⟦t⟧ρ, this is equivalent to ∃v. (W, r, v) ∈ V⟦Σ ⊢ A t : ★⟧ρ, which follows from our initial assumption that (W, r, ⟨⟩) ∈ V⟦Σ ⊢ [A t] : ★⟧ρ.

Proof of (G5). We have to show that (W_s, r_s ⊗ U(d), v₀) ∈ V⟦Σ ⊢ B : ★⟧ρ. Since B = ∃α : σ → ★. …, we need a witness for α. We pick the following witness:

T = λx ∈ S⟦σ⟧. {(W, r, ⟨⟩) | r[n] ⊒ U(x)}

It now suffices to prove that (W_s, r_s ⊗ U(d), v₀) ∈ V⟦Σ, α : σ → ★ ⊢ [α t] ⊗ (⊗ᵢ !specTᵢ) ⊗ !splitT ⊗ !joinT ⊗ !promoteT : ★⟧(ρ, α ↦ T). Following the definition of V at ⊗ and !, and observing that r_s = r_s ⊗ r_s because r_s = π(γ), it suffices to prove each of the following:

(G5.1) (W_s, U(d), ⟨⟩) ∈ V⟦Σ, α : σ → ★ ⊢ [α t] : ★⟧(ρ, α ↦ T)
(G5.2) (W_s, ε, split) ∈ V⟦Σ, α : σ → ★ ⊢ splitT : ★⟧(ρ, α ↦ T)
(G5.3) (W_s, ε, join) ∈ V⟦Σ, α : σ → ★ ⊢ joinT : ★⟧(ρ, α ↦ T)
(G5.4) (W_s, ε, promote) ∈ V⟦Σ, α : σ → ★ ⊢ promoteT : ★⟧(ρ, α ↦ T)
(G5.5) (W_s, r_s, opᵢ) ∈ V⟦Σ, α : σ → ★ ⊢ specTᵢ : ★⟧(ρ, α ↦ T)
17 We prove each of these. Proof of (G5.1). It suffices to prove that there is v such that: (W s, U(t), v) V Σ, α : σ α t : (ρ, α T ) = (V Σ, α : σ α : σ (ρ, α T )) (I t (ρ, α T )) = T d = {(W, r, ) r[n] U(d)} Since U(d) [n] = U(d) U(d), we choose v = to complete the proof. Proof of (G5.2). By inition, split = λx., and splitt = X, Y : σ. [α (X Y )] [α X] [α Y ]. Following the inition of V at type., pick arbitrary d 1, d 2 S σ and ine ρ = ρ, α T, X d 1, Y d 2 and Σ = Σ, α : σ, X : σ, Y : σ. It suffices to prove that: (W s, ɛ, λx., ) V Σ [α (X Y )] [α X] [α Y ] : ρ. Following the inition of V, pick W W s and r, v such that 8. (W, r, v ) V Σ [α (X Y )] : ρ It suffices to prove that: (G5.2.1) (W, r,, ) V Σ [α X] [α Y ] : ρ From fact (8), there is v such that: Therefore, 9. r [n] U(d 1 ) U(d 2 ) 10. â: r [n] = U(d 1 ) U(d 2 ) â. (W, r, v ) V Σ α (X Y ) : ρ = (V Σ α : σ ρ ) (I X Y ρ ) = T (d 1 + d 2 ) Define r 1 = U(d 1 ) and r 2 = (r 2 [0],..., r 2 [n]), where: = {(W, r, ) r[n] U(d 1 + d 2 )} = {(W, r, ) r[n] U(d 1 ) U(d 2 )} r 2 [n] = { r [i] i < n U(d 2 ) â i = n Because of fact (10), r 1 r 2 = r. So, our goal (G5.2.1) can be reduced to proving: (G5.2.2) (W, r 1, ) V Σ [α X] : ρ Proof: It suffices to prove that: Since r 1 [n] = U(d 1 ) U(d 1 ), we are done. (G5.2.3) (W, r 2, ) V Σ [α Y ] : ρ Proof: It suffices to prove that: Since r 2 [n] = (U(d 2 ) â) U(d 2 ), we are done. (W, r 1, ) V Σ α X : ρ = (V Σ α : σ ρ ) (I X ρ ) = T d 1 = {(W, r, ) r[n] U(d 1 )} (W, r 2, ) V Σ α Y : ρ = (V Σ α : σ ρ ) (I Y ρ ) = T d 2 = {(W, r, ) r[n] U(d 2 )} 17
18 Proof of (G5.3). By inition, join = λx. and joint = X, Y : σ. [α X] [α Y ] [α (X Y )]. Following the inition of V, pick d 1, d 2 S σ and ine ρ = ρ, α T, X d 1, Y d 2 and Σ = Σ, α : σ, X : σ, Y : σ. It suffices to prove that: (W s, ɛ, λx. ) V Σ [α X] [α Y ] [α (X Y )] : ρ. Again following the inition of V, pick a W W s and r, v such that 11. (W, r, v ) V Σ [α X] [α Y ] : ρ It suffices to prove that (W, r, ) V Σ [α (X Y )] : ρ. It further suffices to prove that Therefore, it suffices to prove that: (G5.3.1) r [n] U(d 1 ) U(d 2 ) (W, r, ) V Σ α (X Y ) : ρ = T (d 1 + d 2 ) = {(W, r, ) r[n] U(d 1 + d 2 )} = {(W, r, ) r[n] U(d 1 ) U(d 2 )} From fact (11), we know that r = r 1 r 2 and v = v 1, v 2 such that: (W, r 1, v 1 ) V Σ [α X] : ρ and (W, r 2, v 2 ) V Σ [α Y ] : ρ. Hence, there are v 1, v 2 such that: 12. (W, r 1, v 1) V Σ α X : ρ 13. (W, r 2, v 2) V Σ α Y : ρ Simplifying the right hand side of fact (12), we get (W, r 1, v 1) T d 1 = {(W, r, ) r[n] U(d 1 )}. Therefore, r 1 [n] U(d 1 ). Similarly, using fact (13), r 2 [n] U(d 2 ). Combining, we get r [n] = r 1 [n] r 2 [n] U(d 1 ) U(d 2 ), which is exactly our required subgoal (G5.3.1). Proof of (G5.4). By inition, promote = λx.! and promotet = X : σ :: X = X X. [α X]![α X]. Following the inition of V, pick d S σ and ine Σ = Σ, α : σ, X : σ and ρ = ρ, α T, X d. Assume that ρ = Σ X = X X. Note that this implies: 14. d = d + d It suffices to prove that (W s, ɛ, λx.! ) V Σ [α X]![α X] : ρ. Again following the inition of V, pick a W W s and r, v such that 15. (W, r, v ) V Σ [α X] : ρ It suffices to prove that: (G5.4.1) (W, r,! ) V Σ![α X] : ρ From fact (15), there is a v such that (W, r, v ) V Σ α X : ρ = T d = {(W, r, ) r[n] U(d )} Therefore, r [n] U(d ). Define ˆr = (ɛ,..., ɛ, U(d )). Clearly, r ˆr, so by Lemma 4, we can reduce our subgoal (G5.4.1) to (W, ˆr,! ) V Σ![α X] : ρ. Because of fact (14), ˆr = ˆr ˆr, so it suffices to prove that (W, ˆr, ) V Σ [α X] : ρ, which is further reduced to (W, ˆr, ) V Σ α X : ρ. 
Note that V⟦Σ′ ⊢ α X : ★⟧ρ′ = T d′ = {(W, r, ⟨⟩) | r[n] ⊒ U(d′)}. So we only need to prove that r̂[n] ⊒ U(d′), which is trivial because r̂[n] = U(d′).

Proof of (G5.5). We want to prove that (W_s, r_s, opᵢ) ∈ V⟦Σ, α : σ → ★ ⊢ specTᵢ : ★⟧(ρ, α ↦ T). By definition, specTᵢ = ∀X : σ. ∀Y : σᵢ :: Pᵢ. Bᵢ ⊗ [α (tᵢ · X)] ⊸ ∃Z : σ′ᵢ :: Qᵢ. Cᵢ ⊗ [α (t′ᵢ · X)]. Following the definition of V, pick a ∈ S⟦σ⟧ and d₁ ∈ S⟦σᵢ⟧ and define ρ′ = ρ, α ↦ T, X ↦ a, Y ↦ d₁ and Σ′ = Σ, α : σ → ★, X : σ, Y : σᵢ. Assume that:
16. ρ′ ⊨_{Σ′} Pᵢ
19 It suffices to prove that (W s, r s, op i ) V Σ B i [α (t i X)] Z : σ i :: Q i. C i [α (t i X)] : ρ. Again following the inition of V, pick a W 0 W, r a, v a such that: 17. (W 0, r a, v a ) V Σ B i [α (t i X)] : ρ It suffices to prove that (W 0, r s r a, op i v a ) E Z : σ i pick j 0, s 0, r F0, h f, e f such that: 18. j 0 < W 0.k 19. (s 0, r s r a, r F0 ) : W s 0 r s r a r F0 ; op i v a j0 h f ; e f :: Q i. C i [α (t i X)] ρ. Expanding the inition of E, Call this point (B) in the proof. We will return to it later to complete the proof. Let n 0 = W 0.ω. Since W 0 W s, n 0 n + 1. Further for i {0,..., n}. W 0.ω[i] = W s.ω[i]. From fact (19), we know that 21. s 0 = s 0,0... s 0,n0 such that (in particular), ( W 0, s 0,n ) W 0.ω[n].I((s 0 r s r a r F0 )[n]) = W s.ω[n].i((s 0 r s r a r F0 )[n]). The inition of I on the (n + 1)th island now forces either [l : ff] s 0,n or [l : tt] s 0,n. Hence, either [l : ff] s 0 or [l : tt] s 0. Further, because ( W 0, s 0,0 ) W 0.ω[0].I((s 0 r s r a r F0 )[0]) = W s.ω[0].i((s 0 r s r a r F0 )[0]), (s 0 r s r a r F0 )[0] cannot equal, so either [l : tt] s 0 r s r a r F0 or [l : ff] s 0 r s r a r F0. If the former, then using the inition of op i, we know that s 0 r s r a r F0 ; op i v a diverges (by taking the then branch in the inition of op i ). But from fact (20) we know that this configuration does not diverge, so it follows that [l : ff] s 0 r s r a r F0 and accordingly, [l : ff] s 0,n. Let: 22. s 0,n = s 0,n [l : ff] and s 0 = s 0,0... s 0,n 1 s 0,n s 0,n+1... s 0,n0. Using the inition of I on the (n + 1)th island, we also get 23. (s 0 r s r a r F0 )[n] = U(g) for some g S σ. Using fact (20) and our analysis above we also deduce that there are j 1 and j 0 such that: 24. s 0 r s r a r F0 ; op i v a j1 s 0 r s r a r F0 [l : tt]; E[(γ(v s i )) v a] j 0 h f ; e f where E[ ] = let y = [ ] in let = l := ff in y 25. j 0 = j 1 + j 0 Analyzing fact (24) further, we deduce that there are j 2 and j 3 such that: 26. 
j' 0 = j 2 + j 3
27. s' 0 ⊗ r s ⊗ r a ⊗ r F0 ⊗ [l : tt]; (γ(v s i)) v a ↦j2 h 1; e 1
28. h 1; E[e 1] ↦j3 h f; e f

Our next objective is to show that (γ(v s i)) v a is semantically well-typed and then apply the definition of E to fact (27). From fact (17), we know that there are r B a, r α a, v B a, v α a such that:

29. r a = r B a ⊗ r α a
30. v a = ⟨v B a, v α a⟩
31. (W 0, r B a, v B a) ∈ V⟦Σ' ⊢ B i⟧ρ'
32. (W 0, r α a, v α a) ∈ V⟦Σ' ⊢ [α (t i X)]⟧ρ'
Let m = I⟦t i⟧ρ'. Fact (32) forces that v α a = ⟨⟩ and implies that there is a v' α a such that:

(W 0, r α a, v' α a) ∈ V⟦Σ' ⊢ α (t i X)⟧ρ' = T(m + a) = {(W, r, ⟨⟩) | r[n] ⊒ U(m) ⊗ U(a)}

Therefore, r α a[n] ⊒ U(m) ⊗ U(a). Further, from fact (23), U(g) = (s' 0 ⊗ r s ⊗ r a ⊗ r F0)[n] ⊒ r α a[n], so by the definition of the monoid at the (n + 1)th island, there is some a' such that:

33. r α a[n] = U(m + a + a').
34. Choose f such that U(f) = (s' 0 ⊗ r s ⊗ r B a ⊗ r F0)[n]. Then, g = m + a + a' + f. Define g' = a + a' + f.

From facts (21), (22) and (23), we know that (▹W 0, s' 0,n ⊗ [l : ff]) ∈ W s.ω[n].I(U(g)). Hence, using the definition of I on the (n + 1)th island, we deduce that there is v such that (▹W 0, s' 0,n, v) ∈ (V⟦Σ ⊢ A : σ⟧ρ)(g) = (V⟦Σ ⊢ A : σ⟧ρ)(g' + m). Pick a fresh variable X'. Then, by Lemmas 12 and 11, we get (▹W 0, s' 0,n, v) ∈ V⟦Σ, X' : σ ⊢ A (t i X')⟧(ρ, X' ↦ g'). This immediately implies:

35. (▹W 0, s' 0,n, ⟨⟩) ∈ V⟦Σ, X' : σ ⊢ [A (t i X')]⟧(ρ, X' ↦ g')

Choose:

36. W 1 = ▹j1 W 0. Note that because of fact (24), j 1 > 1.

By Lemma 2 applied to facts (35) and (36), we derive that:

37. (W 1, s' 0,n, ⟨⟩) ∈ V⟦Σ, X' : σ ⊢ [A (t i X')]⟧(ρ, X' ↦ g')

By Lemmas 2 and 11 applied to fact (31), we deduce:

38. (W 1, r B a, v B a) ∈ V⟦Σ, X' : σ ⊢ B i⟧(ρ, X' ↦ g')

Noting that v a = ⟨v B a, v α a⟩ = ⟨v B a, ⟨⟩⟩, we deduce from facts (37) and (38) that:

39. (W 1, s' 0,n ⊗ r B a, v a) ∈ V⟦Σ, X' : σ ⊢ B i ⊗ [A (t i X')]⟧(ρ, X' ↦ g')

Using the second premise of the given inference rule, we derive that (W, r s, γ(v s i)) ∈ V⟦Σ ⊢ [A/α]specT i⟧ρ. Expanding the definition of specT i, noting that α ∉ B i, C i, we get (W, r s, γ(v s i)) ∈ V⟦Σ ⊢ ∀X : σ. ∀Y : σ i :: P i. B i ⊗ [A (t i X)] ⊸ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X)]⟧ρ. α-renaming X to X' and noting that, by the side condition on specT i, X' ∉ P i, Q i, B i, t i, t' i, we get: (W, r s, γ(v s i)) ∈ V⟦Σ ⊢ ∀X' : σ. ∀Y : σ i :: P i. B i ⊗ [A (t i X')] ⊸ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X')]⟧ρ. Using Lemma 2, observing that W 1 = ▹j1 W 0 ⊒ W 0 ⊒ W s ⊒ W, we obtain: (W s, r s, γ(v s i)) ∈ V⟦Σ ⊢ ∀X' : σ. ∀Y : σ i :: P i. B i ⊗ [A (t i X')] ⊸ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X')]⟧ρ.
We now instantiate the definition of V at the quantifiers ∀X' : σ and ∀Y : σ i :: P i, choosing the substitution ρ, X' ↦ g' (recall that ρ' contains Y ↦ d 1). From fact (16) and Lemma 11, we obtain that ρ, X' ↦ g' ⊨Σ,X':σ P i. Hence, we get:

40. (W s, r s, γ(v s i)) ∈ V⟦Σ, X' : σ ⊢ B i ⊗ [A (t i X')] ⊸ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X')]⟧(ρ, X' ↦ g')

From facts (39) and (40), following the definition of V at ⊸, we get:

41. (W s, r s ⊗ r B a ⊗ s' 0,n, (γ(v s i)) v a) ∈ E⟦∃Z : σ i :: Q i. C i ⊗ [A (t' i X')]⟧(ρ, X' ↦ g')

Now we wish to instantiate the definition of E in fact (41) using the reduction in fact (27). We choose:
42. W = W 1, j = j 2, r = r s ⊗ r B a ⊗ s' 0,n, s = s 0,0 ⊗ ... ⊗ s 0,n-1 ⊗ [l : tt] ⊗ s 0,n+1 ⊗ ... ⊗ s 0,n0, r F = L(g, a + a') ⊗ r F0 ⊗ (r α a\{n}),

where (r α a\{n}) is the same as r α a except in the (n + 1)th island, where the value is ɛ (or U(ɛ)). Call this point (C) in the proof, as we will return to it later. To apply the definition of E, we must show that:

(G5.5.1) j 2 < W 1.k.
Proof: W 1.k = (▹j1 W 0).k = W 0.k - j 1. Therefore, W 1.k - j 2 = W 0.k - (j 1 + j 2). From facts (25) and (26), j 0 ≥ j 1 + j 2. Therefore, W 1.k - j 2 ≥ W 0.k - j 0 > 0 (by fact (18)).

(G5.5.2) Following fact (27): s' 0 ⊗ r s ⊗ r a ⊗ r F0 ⊗ [l : tt] = s ⊗ r ⊗ r F.
Proof: Immediate because on all islands except the (n + 1)th, s, r, r F is a repartitioning of s' 0 ⊗ r s ⊗ r a ⊗ r F0.

(G5.5.3) (s, r, r F) : W 1.
Proof: Applying Lemma 9 to facts (19) and (36), we get:

43. (s 0, r s ⊗ r a, r F0) : W 1

It suffices to prove three facts:

(G5.5.3.1) For every j < n 0, r F[j] ∈ W 1.ω[j].E.
Proof: For j ≠ n, r F[j] = r α a[j] ⊗ r F0[j] ⊒ r F0[j]. Since W 1.ω[j].E is extension closed, the subgoal is immediate from fact (43). For j = n, the subgoal is trivial because W 1.ω[n].E = G.

(G5.5.3.2) For j ≠ n, (▹W 1, s 0,j) ∈ W 1.ω[j].I((s ⊗ r ⊗ r F)[j]).
Proof: By definition of s, r, r F, for j ≠ n, (s ⊗ r ⊗ r F)[j] = (s 0 ⊗ r s ⊗ r a ⊗ r F0)[j]. Therefore, the result is immediate from fact (43).

(G5.5.3.3) (▹W 1, [l : tt]) ∈ W 1.ω[n].I((s ⊗ r ⊗ r F)[n]).
Proof: From facts (42) and (34), we get (s ⊗ r ⊗ r F)[n] = L(g, a + a') ⊗ U(f) = L(g, a + a' + f) = L(g, g'). Therefore, by definition of the (n + 1)th island, W 1.ω[n].I((s ⊗ r ⊗ r F)[n]) = W 1.ω[n].I(L(g, g')) = {(W, [l : tt])}. The subgoal is then obviously true.

We now return to point (C) of our proof. Because of (G5.5.1), (G5.5.2) and (G5.5.3), we can instantiate the definition of E at point (C) of the proof to obtain W 2, s 2, r 2, r F2 such that:

44. W 2 ⊒j2 W 1
45. (s 2, r 2, r F2) : W 2
46. h 1 = s 2 ⊗ r 2 ⊗ r F2
47. r F2 ⊒ L(g, a + a') ⊗ r F0 ⊗ (r α a\{n})
48. (W 2, r 2, e 1) ∈ V⟦Σ, X' : σ ⊢ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X')]⟧(ρ, X' ↦ g').
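The calculations with the island states U(d) ("unlocked") and L(g, g') ("locked"), such as those in the proof of (G5.5.3) and in facts (44) to (48), use only the equations U(a) U(b) = U(a + b) and L(g, x) U(y) = L(g, x + y), together with the fact that two locked states do not compose. As a sanity check, these laws can be modeled executably. The following sketch is our own illustrative encoding, not the formal definition from this appendix; the underlying monoid is instantiated to natural numbers under addition, and the names U, L, compose and extends are invented:

```python
# Illustrative model of the (n+1)th island's state monoid:
# U(d) is an unlocked state carrying d of the underlying monoid;
# L(g, g2) is a locked state with total g and frame portion g2.
# The underlying monoid is (N, +, 0), purely for illustration.

class U:
    def __init__(self, d): self.d = d
    def __eq__(self, other): return isinstance(other, U) and self.d == other.d
    def __repr__(self): return f"U({self.d})"

class L:
    def __init__(self, g, g2): self.g, self.g2 = g, g2
    def __eq__(self, other):
        return isinstance(other, L) and (self.g, self.g2) == (other.g, other.g2)
    def __repr__(self): return f"L({self.g}, {self.g2})"

def compose(x, y):
    """Partial composition: U(a) with U(b) gives U(a+b); L(g, x) with U(y)
    gives L(g, x+y); two locked states do not compose (the lock is exclusive)."""
    if isinstance(x, U) and isinstance(y, U):
        return U(x.d + y.d)
    if isinstance(x, L) and isinstance(y, U):
        return L(x.g, x.g2 + y.d)
    if isinstance(x, U) and isinstance(y, L):
        return compose(y, x)  # composition is commutative
    return None  # L composed with L is undefined

def extends(x, y):
    """x extends y iff x = compose(y, z) for some z (the frame-preservation
    shape of facts like (47))."""
    if isinstance(x, U) and isinstance(y, U):
        return x.d >= y.d
    if isinstance(x, L) and isinstance(y, L):
        return x.g == y.g and x.g2 >= y.g2
    if isinstance(x, L) and isinstance(y, U):
        return x.g2 >= y.d
    return False
```

For instance, compose(L(7, 2), U(3)) yields L(7, 5), mirroring the step L(g, a + a') U(f) = L(g, a + a' + f) used in proving (G5.5.3).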
From fact (48) we know that there is a d 2 ∈ S⟦σ i⟧ such that:

49. ρ, X' ↦ g', Z ↦ d 2 ⊨Σ,X':σ,Z:σ i Q i

Let Σ'' = Σ, X' : σ, Z : σ i and ρ'' = ρ, X' ↦ g', Z ↦ d 2. Then fact (49) also implies:

50. ρ'' ⊨Σ'' Q i

Continuing with fact (48), we further derive v C 2, r C 2, r A 2 such that:

51. e 1 = ⟨v C 2, ⟨⟩⟩
52. r 2 = r C 2 ⊗ r A 2
53. (W 2, r C 2, v C 2) ∈ V⟦Σ'' ⊢ C i⟧ρ''
54. (W 2, r A 2, ⟨⟩) ∈ V⟦Σ'' ⊢ [A (t' i X')]⟧ρ''

Let m' = I⟦t' i⟧ρ''. From fact (54), we obtain a v' 2 such that (W 2, r A 2, v' 2) ∈ V⟦Σ'' ⊢ A (t' i X')⟧ρ''. Therefore,

55. (W 2, r A 2, v' 2) ∈ (V⟦Σ ⊢ A : σ⟧ρ)(m' + g')

From fact (47), either (s 2 ⊗ r 2 ⊗ r F2)[n] is of the form L(g, ·) or it is undefined. By fact (45), (▹W 2, s 2,n) ∈ W 2.ω[n].I((s 2 ⊗ r 2 ⊗ r F2)[n]), so following the definition of the (n + 1)th island's I, we get:

56. (s 2 ⊗ r 2 ⊗ r F2)[n] = L(g, g')
57. s 2,n = [l : tt] ⊗ s' 2,n for some s' 2,n

From facts (57) and (46), we also get:

58. h 1 = h' 1 ⊗ [l : tt] for some h' 1

From facts (28), (51) and (58), using the operational semantics, we get:

59. h 1; E[e 1] ↦j3 h' 1 ⊗ [l : ff]; ⟨v C 2, ⟨⟩⟩. (Therefore, h f in fact (28) equals h' 1 ⊗ [l : ff].)

From fact (47), there is an r' F2 such that:

60. r F2 = L(g, a + a') ⊗ r F0 ⊗ (r α a\{n}) ⊗ r' F2

We now return to point (B) in our proof and construct W', s', r', r' F to satisfy the definition of E and close the proof. Let n 2 = |W 2.ω|. We select:

61. r' F = r F0 ⊗ (r α a\{n}) ⊗ r' F2, r' = r C 2 ⊗ U(m' + a + a'), s' = s' 0 ⊗ ... ⊗ s' n2 where s' j = s 2,j if j ≠ n and s' n = s' 2,n ⊗ [l : ff] ⊗ r A 2, and W' = ▹j3 W 2.

To complete the proof, it suffices to prove each of the following:

(G5.5.4) W' ⊒j0 W 0
(G5.5.5) (s', r', r' F) : W'
(G5.5.6) h' 1 ⊗ [l : ff] = s' ⊗ r' ⊗ r' F
(G5.5.7) r' F ⊒ r F0
(G5.5.8) (W', r', ⟨v C 2, ⟨⟩⟩) ∈ V⟦Σ' ⊢ ∃Z : σ i :: Q i. C i ⊗ [A (t' i X)]⟧ρ'

We prove each of these below.

Proof of (G5.5.4). Using facts (36), (44) and (61), we derive that W' ⊒j1+j2+j3 W 0. From facts (25) and (26), we know that j 0 = j 1 + j 2 + j 3, which closes the subgoal.

Proof of (G5.5.5). Using Lemma 9 on fact (45) and noting that W' = ▹j3 W 2, we obtain:

62. (s 2, r 2, r F2) : W'

It suffices to prove the following:

(G5.5.5.1) For each j < n 2, r' F[j] ∈ W'.ω[j].E.
Proof: Using facts (60) and (61), for j ≠ n, r' F[j] = r F2[j]. Therefore, for j ≠ n, it is enough to prove that r F2[j] ∈ W'.ω[j].E, which follows from fact (62). For j = n, the subgoal holds trivially because W'.ω[n].E = W s.ω[n].E = G.
(G5.5.5.2) For each j < n 2, (▹W', s' j) ∈ W'.ω[j].I((s' ⊗ r' ⊗ r' F)[j]).
Proof: We consider three cases.

Case j = 0. Then, from fact (61), s' 0 = s 2,0. So, we must show that (▹W', s 2,0) ∈ W'.ω[0].I((s' ⊗ r' ⊗ r' F)[0]). From fact (62), we know that (▹W', s 2,0) ∈ W'.ω[0].I((s 2 ⊗ r 2 ⊗ r F2)[0]). But on island 0, I(x) is independent of x, so we are done.

Case j = n. From facts (56), (52) and (60), we derive that (s 2 ⊗ r C 2 ⊗ r A 2 ⊗ r F0 ⊗ r' F2)[n] ⊗ L(g, a + a') = L(g, g'). Therefore, there is a g'' such that:

63. (s 2 ⊗ r C 2 ⊗ r A 2 ⊗ r F0 ⊗ r' F2)[n] = U(g'') and g' = a + a' + g''

Now observe that

(s' ⊗ r' ⊗ r' F)[n] = (s 2 ⊗ r C 2 ⊗ r A 2 ⊗ r F0 ⊗ r' F2)[n] ⊗ U(m' + a + a') (fact (61))
= U(g'') ⊗ U(m' + a + a') (fact (63))
= U(g'' + m' + a + a')
= U(m' + g') (fact (63))

We need to prove that (▹W', s' n) ∈ W'.ω[n].I((s' ⊗ r' ⊗ r' F)[n]) = W'.ω[n].I(U(m' + g')). Because W' ⊒ W s, this happens iff (▹W', s' n) ∈ W s.ω[n].I(U(m' + g')) = {(W, r ⊗ [l : ff]) | ∃v. (W, r, v) ∈ (V⟦Σ ⊢ A : σ⟧ρ)(m' + g')}. From fact (61), s' n = s' 2,n ⊗ [l : ff] ⊗ r A 2. Therefore, it is enough to prove that there is a v such that (▹W', s' 2,n ⊗ r A 2, v) ∈ (V⟦Σ ⊢ A : σ⟧ρ)(m' + g'). By Lemmas 4 and 2, it suffices to prove that (W 2, r A 2, v) ∈ (V⟦Σ ⊢ A : σ⟧ρ)(m' + g'). This follows from fact (55) and Lemma 11 (choosing v = v' 2).

Case j ∉ {0, n}. In this case, (s' ⊗ r' ⊗ r' F)[j] = (s 2 ⊗ r 2 ⊗ r F2)[j]. So the result follows from fact (62).

Proof of (G5.5.6). Follows immediately from facts (46), (52), (58) and (61).

Proof of (G5.5.7). Immediate from fact (61).

Proof of (G5.5.8). Let Σ''' = Σ', Z : σ i and ρ''' = ρ', Z ↦ d 2. Observing that, from fact (61), r' = r C 2 ⊗ U(m' + a + a'), it suffices to prove the following:

(G5.5.8.1) ρ''' ⊨Σ''' Q i.
Proof: This follows from fact (50) and Lemma 11.

(G5.5.8.2) (W', r C 2, v C 2) ∈ V⟦Σ''' ⊢ C i⟧ρ'''.
Proof: Follows from fact (53) using Lemmas 2 and 11.
(G5.5.8.3) (W', U(m' + a + a'), ⟨⟩) ∈ V⟦Σ''' ⊢ [A (t' i X)]⟧ρ'''.
Proof: It suffices to prove that (W', U(m' + a + a'), ⟨⟩) ∈ V⟦Σ''' ⊢ A (t' i X)⟧ρ''' = (V⟦Σ ⊢ A : σ⟧ρ)(m' + a) = T(m' + a) = {(W, r, ⟨⟩) | r[n] ⊒ U(m' + a)}. Therefore, it suffices to show that U(m' + a + a') ⊒ U(m' + a), i.e., U(m' + a) ⊗ U(a') ⊒ U(m' + a), which is trivial.

Theorem 1 (Fundamental Theorem). The following hold:

1. If Σ; Π; Γ; Δ ⊢ e : A, then Σ; Π; Γ; Δ ⊨ e : A.
2. If Σ; Π; Γ; Δ ⊢ v : A, then Σ; Π; Γ; Δ ⊨v v : A.
Proof. By simultaneous induction on the given typing derivations. Due to Lemma 5, (2) is stronger than (1) for values, so in cases where the term in the conclusion must be a value, we prove only (2). We introduce some notation. For an arbitrarily chosen world W, let ρ ∈ Env(Σ) be such that ρ ⊨Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρ W and δ ∈ L⟦Σ ⊢ Δ ok⟧ρ W.

Case (linear variable). Σ; Π; Γ; x : A ⊢ x : A, from the premises Σ ⊢ Π ok, Σ ⊢ Γ ok and Σ ⊢ x : A ok.
Here x is a value, so we prove only (2). We need to show that (W, π(γ) ⊗ π(δ), δ(γ(x))) ∈ V⟦Σ ⊢ A⟧ρ. Let (x ↦ (r, v)) ∈ δ. Since δ ∈ L⟦Σ ⊢ x : A ok⟧ρ W, we know that (W, r, v) ∈ V⟦Σ ⊢ A⟧ρ. By Lemma 4, (W, π(γ) ⊗ π(δ), v) ∈ V⟦Σ ⊢ A⟧ρ. The proof closes when we observe that δ(γ(x)) = v.

Case (unrestricted variable). Σ; Π; Γ; · ⊢ x : A, from the premises Σ ⊢ Π ok, Σ ⊢ Γ ok, Σ ⊢ · ok and x : A ∈ Γ.
Here x is a value, so we prove only (2). We need to show that (W, π(γ) ⊗ π(δ), δ(γ(x))) ∈ V⟦Σ ⊢ A⟧ρ. Let (x ↦ (r, v)) ∈ γ. Since γ ∈ U⟦Σ ⊢ Γ ok⟧ρ W, we know that (W, r, v) ∈ V⟦Σ ⊢ A⟧ρ. By Lemma 4, (W, π(γ) ⊗ π(δ), v) ∈ V⟦Σ ⊢ A⟧ρ. The proof closes when we observe that δ(γ(x)) = v.

Case (unit). Σ; Π; Γ; · ⊢ ⟨⟩ : 1, from the premises Σ ⊢ Π ok, Σ ⊢ Γ ok and Σ ⊢ · ok.
Here ⟨⟩ is a value, so we prove only (2). Since γ(δ(⟨⟩)) = ⟨⟩, it suffices to show that (W, π(γ) ⊗ π(δ), ⟨⟩) ∈ V⟦Σ ⊢ 1⟧ρ. This follows from the definition of V.

Case (pair). Σ; Π; Γ; Δ 1, Δ 2 ⊢ ⟨e 1, e 2⟩ : A ⊗ B, from the premises Σ; Π; Γ; Δ 1 ⊢ e 1 : A and Σ; Π; Γ; Δ 2 ⊢ e 2 : B.

Proof of (1): Define E 1 = ⟨[ ], e 2⟩. From i.h.(1) on the first premise, we know that Σ; Π; Γ; Δ 1 ⊨ e 1 : A, so by Lemma 6, it suffices to prove that Σ; Π; Γ; Δ 2, x : A ⊨ ⟨x, e 2⟩ : A ⊗ B. Define E 2 = ⟨x, [ ]⟩. Then, our subgoal can be written as Σ; Π; Γ; Δ 2, x : A ⊨ E 2[e 2] : A ⊗ B. From i.h.(1) on the second premise, we know that Σ; Π; Γ; Δ 2 ⊨ e 2 : B. So, by Lemma 6, it suffices to show that Σ; Π; Γ; x : A, y : B ⊨ E 2[y] : A ⊗ B, i.e., Σ; Π; Γ; x : A, y : B ⊨ ⟨x, y⟩ : A ⊗ B. To prove this, assume that δ ∈ L⟦Σ ⊢ x : A, y : B ok⟧ρ W. It suffices to prove that (W, π(δ), ⟨δ(x), δ(y)⟩) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ. By definition, δ ∈ L⟦Σ ⊢ x : A, y : B ok⟧ρ W implies that δ = (x ↦ (r 1, v 1), y ↦ (r 2, v 2)), where (W, r 1, v 1) ∈ V⟦Σ ⊢ A⟧ρ and (W, r 2, v 2) ∈ V⟦Σ ⊢ B⟧ρ.
The last two facts immediately imply (W, r 1 ⊗ r 2, ⟨v 1, v 2⟩) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ, i.e., (W, π(δ), ⟨δ(x), δ(y)⟩) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ, as required.

Proof of (2): If ⟨e 1, e 2⟩ is a value, then both e 1 and e 2 are values. Pick a W, ρ ∈ Env(Σ) such that ρ ⊨Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρ W and, for i = 1, 2, δ i ∈ L⟦Σ ⊢ Δ i ok⟧ρ W. We want to show that (W, π(γ) ⊗ π(δ 1) ⊗ π(δ 2), δ 1(δ 2(γ(⟨e 1, e 2⟩)))) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ. By i.h.(2) on the first premise, we derive that (W, π(γ) ⊗ π(δ 1), δ 1(γ(e 1))) ∈ V⟦Σ ⊢ A⟧ρ and, similarly, from the second premise we obtain (W, π(γ) ⊗ π(δ 2), δ 2(γ(e 2))) ∈ V⟦Σ ⊢ B⟧ρ. By definition of V, we now get (W, π(γ) ⊗ π(δ 1) ⊗ π(γ) ⊗ π(δ 2), ⟨δ 1(γ(e 1)), δ 2(γ(e 2))⟩) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ. Observing that π(γ) ⊗ π(γ) = π(γ), we immediately derive (W, π(γ) ⊗ π(δ 1) ⊗ π(δ 2), δ 1(δ 2(γ(⟨e 1, e 2⟩)))) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ, as needed.

Case (let-pair). Σ; Π; Γ; Δ 1, Δ 2 ⊢ let ⟨x, y⟩ = e in e' : C, from the premises Σ; Π; Γ; Δ 1 ⊢ e : A ⊗ B and Σ; Π; Γ; Δ 2, x : A, y : B ⊢ e' : C.

Here, statement (2) is vacuous because the term in the conclusion is not a value. To prove (1), we want to show that Σ; Π; Γ; Δ 1, Δ 2 ⊨ let ⟨x, y⟩ = e in e' : C. Define the evaluation context E = (let ⟨x, y⟩ = [ ] in e'). Then, we want to show that Σ; Π; Γ; Δ 1, Δ 2 ⊨ E[e] : C. By i.h.(1) on the first premise, we know that Σ; Π; Γ; Δ 1 ⊨ e : A ⊗ B. So, by Lemma 6, it is enough to show that Σ; Π; Γ; Δ 2, z : A ⊗ B ⊨ E[z] : C or, equivalently, Σ; Π; Γ; Δ 2, z : A ⊗ B ⊨ let ⟨x, y⟩ = z in e' : C. To prove this, pick W, ρ ∈ Env(Σ), ρ ⊨Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρ W and (δ, z ↦ (r, v)) ∈ L⟦Σ ⊢ Δ 2, z : A ⊗ B ok⟧ρ W. We need to show that (W, π(γ) ⊗ π(δ) ⊗ r, let ⟨x, y⟩ = v in δ(γ(e'))) ∈ E⟦C⟧ρ. From (δ, z ↦ (r, v)) ∈ L⟦Σ ⊢ Δ 2, z : A ⊗ B ok⟧ρ W, we get (W, r, v) ∈ V⟦Σ ⊢ A ⊗ B⟧ρ. Expanding the definition of the latter, we know that there are r 1, r 2, v 1, v 2 such that v = ⟨v 1, v 2⟩, r = r 1 ⊗ r 2, (W, r 1, v 1) ∈ V⟦Σ ⊢ A⟧ρ and (W, r 2, v 2) ∈ V⟦Σ ⊢ B⟧ρ. By i.h.(1) on the second premise, it follows that (W, π(γ) ⊗ π(δ) ⊗ r 1 ⊗ r 2, [v 2/y][v 1/x](δ(γ(e')))) ∈ E⟦C⟧ρ.
Since for every h we have h; let ⟨x, y⟩ = v in δ(γ(e')) ↦ h; [v 2/y][v 1/x](δ(γ(e'))), Lemma 10 implies that (W, π(γ) ⊗ π(δ) ⊗ r, let ⟨x, y⟩ = v in δ(γ(e'))) ∈ E⟦C⟧ρ, as required.

Case (∀-introduction). Σ; Π; Γ; Δ ⊢ v : ∀X : σ :: P. A, from the premise Σ, X : σ; Π, P; Γ; Δ ⊢ v : A, with the side condition X ∉ FV(Π), FV(Γ), FV(Δ).
Since the term in the conclusion is a value, we prove only (2). We want to show that (W, π(γ) ⊗ π(δ), δ(γ(v))) ∈ V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ. Following the definition of V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ, assume that there is a term t ∈ S⟦σ⟧ such that ρ[X ↦ t] ⊨Σ,X:σ P. We need to show that (W, π(γ) ⊗ π(δ), δ(γ(v))) ∈ V⟦Σ, X : σ ⊢ A⟧ρ[X ↦ t]. Define ρ' = ρ[X ↦ t]. It suffices to show that (W, π(γ) ⊗ π(δ), δ(γ(v))) ∈ V⟦Σ, X : σ ⊢ A⟧ρ'. From Lemma 11, we obtain that γ ∈ U⟦Σ, X : σ ⊢ Γ ok⟧ρ' W, δ ∈ L⟦Σ, X : σ ⊢ Δ ok⟧ρ' W and ρ' ⊨Σ,X:σ Π, P. Hence, we can instantiate the i.h.(2) on the premise with ρ', γ and δ to obtain (W, π(γ) ⊗ π(δ), δ(γ(v))) ∈ V⟦Σ, X : σ ⊢ A⟧ρ', as required.

Case (∀-elimination). Σ; Π; Γ; Δ ⊢ e : [t/X]A, from the premises Σ; Π; Γ; Δ ⊢ e : ∀X : σ :: P. A, Σ ⊢ t : σ and Σ; Π ⊢ [t/X]P.

Proof of (1): We want to show that Σ; Π; Γ; Δ ⊨ e : [t/X]A. Define E = [ ]. By i.h.(1) on the first premise, we know that Σ; Π; Γ; Δ ⊨ e : ∀X : σ :: P. A, so by Lemma 6, it suffices to show that Σ; Π; Γ; x : (∀X : σ :: P. A) ⊨ E[x] : [t/X]A or, equivalently, Σ; Π; Γ; x : (∀X : σ :: P. A) ⊨ x : [t/X]A. Following the definition, choose W, ρ ∈ Env(Σ), ρ ⊨Σ Π, γ ∈ U⟦Σ ⊢ Γ ok⟧ρ W and (x ↦ (r, v)) ∈ L⟦Σ ⊢ x : (∀X : σ :: P. A) ok⟧ρ W. It suffices to prove that (W, π(γ) ⊗ r, v) ∈ E⟦[t/X]A⟧ρ. By Lemma 4, we reduce this to proving (W, r, v) ∈ E⟦[t/X]A⟧ρ, and by Lemma 5, to (W, r, v) ∈ V⟦Σ ⊢ [t/X]A⟧ρ. Further, from the second premise we know that I⟦Σ ⊢ t : σ⟧ρ is defined. Let this value be d. From the third premise and Lemma 14, we know that ρ ⊨Σ [t/X]P, so by Lemma 12, we get ρ, X ↦ d ⊨Σ,X:σ P. From (x ↦ (r, v)) ∈ L⟦Σ ⊢ x : (∀X : σ :: P. A) ok⟧ρ W, we know that (W, r, v) ∈ V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ. Expanding the definition of V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ at X = d, we get (W, r, v) ∈ V⟦Σ, X : σ ⊢ A⟧(ρ, X ↦ d). By Lemma 12, (W, r, v) ∈ V⟦Σ ⊢ [t/X]A⟧ρ, which is what we wanted to prove.

Proof of (2): Assume that e is a value. We want to prove that (W, π(γ) ⊗ π(δ), δ(γ(e))) ∈ V⟦Σ ⊢ [t/X]A⟧ρ. From i.h.(2) on the first premise, we know that (W, π(γ) ⊗ π(δ), δ(γ(e))) ∈ V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ. Let d = I⟦Σ ⊢ t : σ⟧ρ (which is defined by the second premise).
From the third premise and Lemma 14, we know that ρ, X ↦ d ⊨Σ,X:σ P. Expanding the definition of V⟦Σ ⊢ ∀X : σ :: P. A⟧ρ at X = d, we get (W, π(γ) ⊗ π(δ), δ(γ(e))) ∈ V⟦Σ, X : σ ⊢ A⟧(ρ, X ↦ d). By Lemma 12, (W, π(γ) ⊗ π(δ), δ(γ(e))) ∈ V⟦Σ ⊢ [t/X]A⟧ρ, which is what we wanted to prove.

Case (new). Σ; Π; Γ; Δ ⊢ new(e) : ∃l : Loc :: ⊤. !ptr l ⊗ cap l A, from the premise Σ; Π; Γ; Δ ⊢ e : A.

Here, the term in the conclusion is never a value, so it suffices to prove (1). We want to show that (W, π(γ) ⊗ π(δ), δ(γ(new(e)))) ∈ E⟦∃l : Loc :: ⊤. !ptr l ⊗ cap l A⟧ρ. Defining E = new([ ]) and using Lemma 6 with the i.h.(1) on the premise, we reduce this to proving that for any (W, r, v) ∈ V⟦Σ ⊢ A⟧ρ, we have (W, π(γ) ⊗ r, new(v)) ∈ E⟦∃l : Loc :: ⊤. !ptr l ⊗ cap l A⟧ρ. Following the definition of E, pick j, s, r F, h', e' such that:

1. j < W.k
2. (s, π(γ) ⊗ r, r F) : W
3. s ⊗ π(γ) ⊗ r ⊗ r F; new(v) ↦j h'; e'

The operational semantics force:

4. j = 1
5. e' = ⟨!l, ⟨⟩⟩, for some l
6. h' = s ⊗ π(γ) ⊗ r ⊗ r F ⊗ [l : v]

We choose W' = W, s' = s, r' = r ⊗ [l : v], r' F = π(γ) ⊗ r F. It suffices to prove each of the following:

(G1) W' ⊒j W, i.e., W ⊒1 W
(G2) (s', r', r' F) : W'
(G3) h' = s' ⊗ r' ⊗ r' F
(G4) r' F ⊒ r F
(G5) (W', r', e') ∈ V⟦Σ ⊢ ∃l : Loc :: ⊤. !ptr l ⊗ cap l A⟧ρ
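The proof of (G5.5) relies on the operational behavior of the shared operation op i: if the lock location l already holds tt, the configuration diverges (the then branch); otherwise op i sets l to tt, runs the underlying sequential operation, and the context E[ ] = let y = [ ] in let _ = l := ff in y releases the lock afterwards. The following is a small executable model of this protocol. It is our own sketch rather than the formal definition of op i: divergence is modeled by raising an exception instead of looping, and the names heap, make_shared_op and WouldDiverge are invented for illustration:

```python
# Executable model of the lock protocol of a shared operation op_i:
# heap location l holds the lock flag, and the underlying sequential
# operation runs only while the flag is set. Divergence on a taken
# lock is modeled by raising WouldDiverge instead of looping forever.

class WouldDiverge(Exception):
    """Stands in for the diverging then-branch of op_i."""
    pass

def make_shared_op(heap, l, underlying):
    """Wrap `underlying` so that it follows the protocol from the proof:
    if heap[l] is tt (True), diverge; else lock, run, unlock, return."""
    def op(arg):
        if heap[l]:                # then branch: lock already taken
            raise WouldDiverge()   # the configuration diverges
        heap[l] = True             # l := tt (enter the locked island state)
        y = underlying(arg)        # run the sequential operation
        heap[l] = False            # l := ff (the context E[.] releases the lock)
        return y
    return op
```

In a run starting from an unlocked heap, the wrapper behaves like the underlying operation and restores l to ff, matching the end-to-end reduction from fact (24) to fact (59); starting from a locked heap, it "diverges", matching the case analysis on the lock flag in the proof of (G5.5).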
More informationA Goal-Oriented Algorithm for Unification in EL w.r.t. Cycle-Restricted TBoxes
A Goal-Oriented Algorithm for Unification in EL w.r.t. Cycle-Restricted TBoxes Franz Baader, Stefan Borgwardt, and Barbara Morawska {baader,stefborg,morawska}@tcs.inf.tu-dresden.de Theoretical Computer
More informationChapter 2. Assertions. An Introduction to Separation Logic c 2011 John C. Reynolds February 3, 2011
Chapter 2 An Introduction to Separation Logic c 2011 John C. Reynolds February 3, 2011 Assertions In this chapter, we give a more detailed exposition of the assertions of separation logic: their meaning,
More informationA PROOF SYSTEM FOR SEPARATION LOGIC WITH MAGIC WAND
A PROOF SYSTEM FOR SEPARATION LOGIC WITH MAGIC WAND WONYEOL LEE, JINEON BAEK, AND SUNGWOO PARK Stanford, USA e-mail address: wonyeol@stanfordedu POSTECH, Republic of Korea e-mail address: gok01172@postechackr
More informationGeneral Patterns for Nonmonotonic Reasoning: From Basic Entailments to Plausible Relations
General Patterns for Nonmonotonic Reasoning: From Basic Entailments to Plausible Relations OFER ARIELI AND ARNON AVRON, Department of Computer Science, School of Mathematical Sciences, Tel-Aviv University,
More informationLogic, Sets, and Proofs
Logic, Sets, and Proofs David A. Cox and Catherine C. McGeoch Amherst College 1 Logic Logical Operators. A logical statement is a mathematical statement that can be assigned a value either true or false.
More informationConvex Optimization Notes
Convex Optimization Notes Jonathan Siegel January 2017 1 Convex Analysis This section is devoted to the study of convex functions f : B R {+ } and convex sets U B, for B a Banach space. The case of B =
More informationLecture Notes: Axiomatic Semantics and Hoare-style Verification
Lecture Notes: Axiomatic Semantics and Hoare-style Verification 17-355/17-665/17-819O: Program Analysis (Spring 2018) Claire Le Goues and Jonathan Aldrich clegoues@cs.cmu.edu, aldrich@cs.cmu.edu It has
More informationRelating State-Based and Process-Based Concurrency through Linear Logic
École Polytechnique 17 September 2009 Relating State-Based and Process-Based oncurrency through Linear Logic Iliano ervesato arnegie Mellon University - Qatar iliano@cmu.edu Specifying oncurrent Systems
More informationAlgebraic theories in the presence of binding operators, substitution, etc.
Algebraic theories in the presence of binding operators, substitution, etc. Chung Kil Hur Joint work with Marcelo Fiore Computer Laboratory University of Cambridge 20th March 2006 Overview First order
More informationTABLEAU SYSTEM FOR LOGIC OF CATEGORIAL PROPOSITIONS AND DECIDABILITY
Bulletin of the Section of Logic Volume 37:3/4 (2008), pp. 223 231 Tomasz Jarmużek TABLEAU SYSTEM FOR LOGIC OF CATEGORIAL PROPOSITIONS AND DECIDABILITY Abstract In the article we present an application
More informationFunctional Dependencies
Chapter 7 Functional Dependencies 7.1 Introduction 7.2 Proofs and Functional Dependencies 7.3 Keys and Functional Dependencies 7.4 Covers 7.5 Tableaux 7.6 Exercises 7.7 Bibliographical Comments 7.1 Introduction
More informationDynamic Noninterference Analysis Using Context Sensitive Static Analyses. Gurvan Le Guernic July 14, 2007
Dynamic Noninterference Analysis Using Context Sensitive Static Analyses Gurvan Le Guernic July 14, 2007 1 Abstract This report proposes a dynamic noninterference analysis for sequential programs. This
More informationSystems of modal logic
499 Modal and Temporal Logic Systems of modal logic Marek Sergot Department of Computing Imperial College, London utumn 2008 Further reading: B.F. Chellas, Modal logic: an introduction. Cambridge University
More informationCVO103: Programming Languages. Lecture 2 Inductive Definitions (2)
CVO103: Programming Languages Lecture 2 Inductive Definitions (2) Hakjoo Oh 2018 Spring Hakjoo Oh CVO103 2018 Spring, Lecture 2 March 13, 2018 1 / 20 Contents More examples of inductive definitions natural
More informationHow and when axioms can be transformed into good structural rules
How and when axioms can be transformed into good structural rules Kazushige Terui National Institute of Informatics, Tokyo Laboratoire d Informatique de Paris Nord (Joint work with Agata Ciabattoni and
More informationState-Dependent Representation Independence
State-Dependent Representation Independence Amal Ahmed TTI-C amal@tti-c.org Derek Dreyer MPI-SWS dreyer@mpi-sws.mpg.de Andreas Rossberg MPI-SWS rossberg@mpi-sws.mpg.de Abstract Mitchell s notion of representation
More informationLecture Notes: Program Analysis Correctness
Lecture Notes: Program Analysis Correctness 15-819O: Program Analysis Jonathan Aldrich jonathan.aldrich@cs.cmu.edu Lecture 5 1 Termination As we think about the correctness of program analysis, let us
More informationGS03/4023: Validation and Verification Predicate Logic Jonathan P. Bowen Anthony Hall
GS03/4023: Validation and Verification Predicate Logic Jonathan P. Bowen www.cs.ucl.ac.uk/staff/j.bowen/gs03 Anthony Hall GS03 W1 L3 Predicate Logic 12 January 2007 1 Overview The need for extra structure
More informationTopics in linear algebra
Chapter 6 Topics in linear algebra 6.1 Change of basis I want to remind you of one of the basic ideas in linear algebra: change of basis. Let F be a field, V and W be finite dimensional vector spaces over
More informationThe predicate calculus is complete
The predicate calculus is complete Hans Halvorson The first thing we need to do is to precisify the inference rules UI and EE. To this end, we will use A(c) to denote a sentence containing the name c,
More informationFirst-Order Logic. 1 Syntax. Domain of Discourse. FO Vocabulary. Terms
First-Order Logic 1 Syntax Domain of Discourse The domain of discourse for first order logic is FO structures or models. A FO structure contains Relations Functions Constants (functions of arity 0) FO
More informationMASTERS EXAMINATION IN MATHEMATICS
MASTERS EXAMINATION IN MATHEMATICS PURE MATHEMATICS OPTION FALL 2007 Full points can be obtained for correct answers to 8 questions. Each numbered question (which may have several parts) is worth the same
More informationThe non-logical symbols determine a specific F OL language and consists of the following sets. Σ = {Σ n } n<ω
1 Preliminaries In this chapter we first give a summary of the basic notations, terminology and results which will be used in this thesis. The treatment here is reduced to a list of definitions. For the
More informationModel Evolution with Equality Revised and Implemented
Model Evolution with Equality Revised and Implemented Peter Baumgartner 1 NICTA and The Australian National University, Canberra, Australia Björn Pelzer Institute for Computer Science, Universität Koblenz-Landau,
More informationTheorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u.
5. Fields 5.1. Field extensions. Let F E be a subfield of the field E. We also describe this situation by saying that E is an extension field of F, and we write E/F to express this fact. If E/F is a field
More informationQualifying Exam Logic August 2005
Instructions: Qualifying Exam Logic August 2005 If you signed up for Computability Theory, do two E and two C problems. If you signed up for Model Theory, do two E and two M problems. If you signed up
More informationParametric Completeness for Separation Theories
Parametric Completeness for Separation Theories James Brotherston Jules Villard Dept. of Computer Science, University College London, UK Abstract In this paper, we close the logical gap between provability
More informationA should be R-closed: R(A) A. We would like this to hold because we would like every element that the rules say should be in A to actually be in A.
CS 6110 S18 Lecture 7 Inductive Definitions and Least Fixed Points 1 Set Operators Last time we considered a set of rule instances of the form X 1 X 2... X n, (1) X where X and the X i are members of some
More informationPropositional and Predicate Logic. jean/gbooks/logic.html
CMSC 630 February 10, 2009 1 Propositional and Predicate Logic Sources J. Gallier. Logic for Computer Science, John Wiley and Sons, Hoboken NJ, 1986. 2003 revised edition available on line at http://www.cis.upenn.edu/
More informationCylindrical Algebraic Decomposition in Coq
Cylindrical Algebraic Decomposition in Coq MAP 2010 - Logroño 13-16 November 2010 Assia Mahboubi INRIA Microsoft Research Joint Centre (France) INRIA Saclay Île-de-France École Polytechnique, Palaiseau
More informationState-Dependent Representation Independence
State-Dependent Representation Independence Amal Ahmed TTI-C amal@tti-c.org Derek Dreyer MPI-SWS dreyer@mpi-sws.mpg.de Andreas Rossberg MPI-SWS rossberg@mpi-sws.mpg.de Abstract Mitchell s notion of representation
More informationA Logical Characterization for Weighted Event-Recording Automata
A Logical Characterization for Weighted Event-Recording Automata June 23, 2009 Karin Quaas Institut für Informatik, Universität Leipzig 04009 Leipzig, Germany quaas@informatik.uni-leipzig.de Abstract.
More informationMathematics 114L Spring 2018 D.A. Martin. Mathematical Logic
Mathematics 114L Spring 2018 D.A. Martin Mathematical Logic 1 First-Order Languages. Symbols. All first-order languages we consider will have the following symbols: (i) variables v 1, v 2, v 3,... ; (ii)
More informationLecture Notes on Proofs & Arithmetic
15-424: Foundations of Cyber-Physical Systems Lecture Notes on Proofs & Arithmetic André Platzer Carnegie Mellon University Lecture 9 1 Introduction Lecture 8 on Events & Delays discussed and developed
More information7 RC Simulates RA. Lemma: For every RA expression E(A 1... A k ) there exists a DRC formula F with F V (F ) = {A 1,..., A k } and
7 RC Simulates RA. We now show that DRC (and hence TRC) is at least as expressive as RA. That is, given an RA expression E that mentions at most C, there is an equivalent DRC expression E that mentions
More informationComputation Tree Logic
Computation Tree Logic Computation tree logic (CTL) is a branching-time logic that includes the propositional connectives as well as temporal connectives AX, EX, AU, EU, AG, EG, AF, and EF. The syntax
More informationKleene realizability and negative translations
Q E I U G I C Kleene realizability and negative translations Alexandre Miquel O P. D E. L Ō A U D E L A R April 21th, IMERL Plan 1 Kleene realizability 2 Gödel-Gentzen negative translation 3 Lafont-Reus-Streicher
More information