SAT based Abstraction-Refinement using ILP and Machine Learning Techniques

1 SAT based Abstraction-Refinement using ILP and Machine Learning Techniques. Edmund Clarke, James Kukula, Anubhav Gupta, Ofer Strichman.

2 Abstraction in Model Checking. Set of variables V = {x_1, ..., x_n}. Set of states S = D_x1 × ... × D_xn. Set of initial states I ⊆ S. Set of transitions R ⊆ S × S. Transition system M = (S, I, R).
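
As an aside (not from the slides), a small explicit-state version of these definitions can help fix the notation. The sketch below represents a toy two-variable transition system in Python; the particular system (a 2-bit counter) is invented purely for illustration and is reused by the later sketches.

```python
from itertools import product

# Variables x1, x2 over the Boolean domain D = {0, 1}; a state is a value tuple.
VARIABLES = ("x1", "x2")
S = set(product((0, 1), repeat=len(VARIABLES)))   # S = D_x1 x D_x2

# Initial states I ⊆ S and transition relation R ⊆ S x S (a toy 2-bit counter).
I = {(0, 0)}
R = {(s, (s[0] ^ s[1], 1 - s[1])) for s in S}     # x1' = x1 xor x2, x2' = not x2

M = (S, I, R)                                     # transition system M = (S, I, R)
print(sorted(R))
```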

3 Abstract Model. Abstraction function h : S → Ŝ, with ˆM = (Ŝ, Î, ˆR). Ŝ = {ŝ | ∃s ∈ S. h(s) = ŝ}. (Figure: h maps the concrete states of M to abstract states.)

4 Abstract Model. Abstraction function h : S → Ŝ, with ˆM = (Ŝ, Î, ˆR). Î = {ŝ | ∃s. I(s) ∧ h(s) = ŝ}.

5 Abstract Model. Abstraction function h : S → Ŝ, with ˆM = (Ŝ, Î, ˆR). ˆR = {(ŝ_1, ŝ_2) | ∃s_1 ∃s_2. R(s_1, s_2) ∧ h(s_1) = ŝ_1 ∧ h(s_2) = ŝ_2}.
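
The three set definitions above translate directly into code for small explicit-state models. The following sketch (my illustration, not the authors' implementation, which works symbolically) computes Ŝ, Î and ˆR by existential abstraction; the toy model and abstraction function are invented.

```python
from itertools import product

def abstract_model(S, I, R, h):
    """Existential abstraction of M = (S, I, R) under the abstraction function h."""
    S_hat = {h(s) for s in S}                   # Ŝ = { ŝ | ∃s ∈ S. h(s) = ŝ }
    I_hat = {h(s) for s in I}                   # Î = { ŝ | ∃s. I(s) ∧ h(s) = ŝ }
    R_hat = {(h(s1), h(s2)) for (s1, s2) in R}  # ˆR = { (h(s1), h(s2)) | R(s1, s2) }
    return S_hat, I_hat, R_hat

# Toy model: the 2-bit counter from the earlier sketch; h keeps only the first bit.
S = set(product((0, 1), repeat=2))
I = {(0, 0)}
R = {(s, (s[0] ^ s[1], 1 - s[1])) for s in S}
h = lambda s: (s[0],)

S_hat, I_hat, R_hat = abstract_model(S, I, R, h)
print(S_hat, I_hat, sorted(R_hat))
```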

6 Model Checking AG p, where p is a non-temporal propositional formula. p respects h if for all s ∈ S, h(s) |= p̃ implies s |= p. (Figure: an abstraction in which p̃ respects h.)

7 Model Checking AG p, where p is a non-temporal propositional formula. p respects h if for all s ∈ S, h(s) |= p̃ implies s |= p. (Figure: an abstraction in which p̃ does not respect h.)

8 Preservation Theorem. Let ˆM be an abstraction of M corresponding to the abstraction function h, and let p be a propositional formula that respects h. Then ˆM |= AG p̃ implies M |= AG p.

9 Converse of Preservation Theorem. The converse does not hold: ˆM ⊭ AG p̃ does not imply M ⊭ AG p. The abstract counterexample may be spurious; the abstraction is too coarse.

10 Refinement. h' is a refinement of h if: 1. For all s_1, s_2 ∈ S, h'(s_1) = h'(s_2) implies h(s_1) = h(s_2). 2. There exist s_1, s_2 ∈ S such that h(s_1) = h(s_2) and h'(s_1) ≠ h'(s_2).

11 Refinement. h' is a refinement of h if: 1. For all s_1, s_2 ∈ S, h'(s_1) = h'(s_2) implies h(s_1) = h(s_2). 2. There exist s_1, s_2 ∈ S such that h(s_1) = h(s_2) and h'(s_1) ≠ h'(s_2). (Figure: the refined abstraction on the state-space diagram.)

12 Abstraction-Refinement
1. Generate an initial abstraction function h.
2. Build the abstract machine ˆM based on h and model check it. If ˆM |= ϕ, then M |= ϕ; return TRUE.
3. If ˆM ⊭ ϕ, check the counterexample on the concrete model. If the counterexample is real, then M ⊭ ϕ; return FALSE.
4. Refine h, and go to step 2.

13 Abstraction Function. Partition the variables into visible (V) and invisible (I) variables, V = {v_1, ..., v_k}. The partitioning defines our abstraction function h : S → Ŝ. The set of abstract states is Ŝ = D_v1 × ... × D_vk, and the abstraction function is h(s) = (s(v_1), ..., s(v_k)). Refinement: move variables from I to V.
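
A minimal sketch of this projection-style abstraction (my illustration; the variable names and the sample state are made up): h simply keeps the values of the visible variables, and refinement corresponds to enlarging the visible set.

```python
VARIABLES = ("x1", "x2", "x3", "x4")     # all state variables
VISIBLE   = ("x1", "x2")                 # V; the remaining variables form I

def h(state, visible=VISIBLE):
    """Abstraction: project a concrete state onto the visible variables."""
    index = {v: i for i, v in enumerate(VARIABLES)}
    return tuple(state[index[v]] for v in visible)

print(h((0, 1, 0, 1)))                                 # -> (0, 1)

# Refinement: moving x3 from I to V yields a strictly finer abstraction h'.
print(h((0, 1, 0, 1), visible=("x1", "x2", "x3")))     # -> (0, 1, 0)
```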

14 Building the Abstract Model. ˆM can be computed efficiently if R is in functional form, e.g. for sequential circuits: R(s, s') = ∃i. ⋀_{j=1..m} x_j' = f_{x_j}(s, i), and ˆR(ŝ, ŝ') = ∃s_I ∃i. ⋀_{x_j ∈ V} x̂_j' = f_{x_j}(ŝ, s_I, i). (Figure: a circuit with latches x1..x6 and inputs i1..i3; in the abstract circuit only the visible latches x1, x2 remain, while x3..x6 become free inputs.)
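
For a small circuit this ∃-quantification can be carried out by brute-force enumeration: the invisible state bits and the original inputs are both treated as free inputs of the abstract machine. The sketch below is an explicit-state illustration under that reading; the next-state functions are invented, and the paper's implementation of course builds ˆR symbolically.

```python
from itertools import product

def abstract_transitions(visible_fns, n_invisible, n_inputs):
    """Enumerate ˆR: for every abstract state, every assignment to the invisible
    variables and to the inputs yields one abstract successor."""
    k = len(visible_fns)
    R_hat = set()
    for v_state in product((0, 1), repeat=k):               # abstract state ŝ
        for inv in product((0, 1), repeat=n_invisible):      # s_I, treated as free
            for inp in product((0, 1), repeat=n_inputs):     # i,   treated as free
                succ = tuple(f(v_state, inv, inp) for f in visible_fns)
                R_hat.add((v_state, succ))
    return R_hat

# Toy circuit: visible latches x1, x2, one invisible latch x3, one input i1.
f_x1 = lambda v, inv, inp: v[1] ^ inv[0]     # x1' = x2 xor x3
f_x2 = lambda v, inv, inp: inp[0]            # x2' = i1
print(sorted(abstract_transitions([f_x1, f_x2], n_invisible=1, n_inputs=1)))
```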

15 Checking the Counterexample. Counterexample: ŝ_1, ŝ_2, ..., ŝ_m. The set of concrete paths corresponding to the counterexample is ψ_m = { s_1 ... s_m | I(s_1) ∧ ⋀_{i=1..m-1} R(s_i, s_{i+1}) ∧ ⋀_{i=1..m} h(s_i) = ŝ_i }. The right-most conjunct restricts the visible variables to their values in the counterexample. The counterexample is spurious iff ψ_m is empty. Solve ψ_m with a SAT solver.
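
On an explicit-state toy model the emptiness check for ψ_m can be done by direct path enumeration. The sketch below is my illustration only (the slides use a SAT solver on the symbolic formula): it returns every concrete path consistent with the abstract counterexample, so the counterexample is spurious exactly when the result is empty.

```python
def concretize(I, R, h, abstract_trace):
    """All concrete paths s_1..s_m with I(s_1), R(s_i, s_{i+1}) and h(s_i) = ŝ_i.
    Explicit-state illustration of the SAT query ψ_m from the slides."""
    paths = [[s] for s in I if h(s) == abstract_trace[0]]
    for s_hat in abstract_trace[1:]:
        paths = [p + [t] for p in paths
                 for (s, t) in R if s == p[-1] and h(t) == s_hat]
    return paths

# Toy model (invented): two bits, the abstraction keeps the first bit only.
I = {(0, 0)}
R = {((0, 0), (0, 1)), ((0, 1), (1, 1))}
h = lambda s: (s[0],)
print(concretize(I, R, h, [(0,), (0,), (1,)]))   # one real path -> not spurious
print(concretize(I, R, h, [(0,), (1,)]))         # []            -> spurious
```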

16 Checking the Counterexample. The formula is similar to a BMC formula, except that the path is restricted to the counterexample, and the values of the (original) inputs that are assigned by the counterexample are also restricted. If ψ_m is satisfiable, we have found a real bug. If ψ_m is unsatisfiable, refine.

17 Refinement. Find the largest index f (the failure index), f < m, such that ψ_f is satisfiable. The set D of all states d_f such that there is a concrete path d_1 ... d_f in ψ_f is called the set of deadend states. There is no concrete transition from D to a concrete state in the next abstract state. (Figure: abstract trace vs. concrete trace, with the deadend states marked.)

18 Refinement. Since there is an abstract transition from ŝ_f to ŝ_{f+1}, there is a non-empty set of transitions φ_f from h^-1(ŝ_f) to h^-1(ŝ_{f+1}): φ_f = { ⟨s_f, s_{f+1}⟩ | R(s_f, s_{f+1}) ∧ h(s_f) = ŝ_f ∧ h(s_{f+1}) = ŝ_{f+1} }. The set B of all states b_f such that there is a transition ⟨b_f, b_{f+1}⟩ in φ_f is called the set of bad states. (Figure: abstract trace vs. concrete trace, with the deadend and bad states marked.)
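
The deadend and bad sets also have a direct explicit-state reading, sketched below for illustration (the real implementation extracts them from SAT solutions of ψ_f and φ_f); the toy model is the one used in the earlier sketches.

```python
def deadend_and_bad(paths_f, R, h, s_hat_f, s_hat_f1):
    """D: final states of the concrete paths realising the prefix up to the
    failure index f (the paths in ψ_f).  B: source states of concrete
    transitions from h^-1(ŝ_f) into h^-1(ŝ_{f+1}) (the transitions in φ_f)."""
    D = {p[-1] for p in paths_f}
    B = {s for (s, t) in R if h(s) == s_hat_f and h(t) == s_hat_f1}
    return D, B

# Toy model (invented): the abstraction keeps the first of two bits.
R = {((0, 0), (0, 1)), ((0, 1), (1, 1))}
h = lambda s: (s[0],)
D, B = deadend_and_bad([[(0, 0)]], R, h, (0,), (1,))
print(D)   # {(0, 0)}: deadend, no transition from (0, 0) into h^-1((1,))
print(B)   # {(0, 1)}: bad, it has a transition into h^-1((1,))
```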

19 Refinement. (Figure: the deadend states and the bad states, shown inside the same abstract state.)

20 Refinement. There is a spurious transition from ŝ_f to ŝ_{f+1}. The transition is spurious because D and B lie in the same abstract state. Refinement: put D and B in separate abstract states, i.e. for all d ∈ D and b ∈ B, h'(d) ≠ h'(b).

21 Refinement as Separation. Let S = {s_1, ..., s_m} and T = {t_1, ..., t_n} be two sets of states (binary vectors) of size l, representing assignments to a set of variables W, |W| = l. (The state separation problem) Find a minimal set of variables U = {u_1, ..., u_k}, U ⊆ W, such that for each pair of states (s_i, t_j), 1 ≤ i ≤ m, 1 ≤ j ≤ n, there exists a variable u_r ∈ U such that s_i(u_r) ≠ t_j(u_r). Let H denote the separating set for D and B. The refinement h' is obtained by adding H to V. Proof: since H separates D and B, for all d ∈ D, b ∈ B there exists u ∈ H such that d(u) ≠ b(u). Hence h'(d) ≠ h'(b).
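
A small helper makes the separation condition concrete (my illustration): U separates D from B when every deadend/bad pair differs on at least one variable of U. The example states below are s_1 and t_1 from the slides, written as dictionaries.

```python
def separates(U, D, B):
    """True iff U separates every d in D from every b in B, i.e. each pair of
    states differs on at least one variable of U (states as dicts var -> 0/1)."""
    return all(any(d[u] != b[u] for u in U) for d in D for b in B)

# s_1 and t_1 from the slides, over the variables v1..v4.
D = [{"v1": 0, "v2": 1, "v3": 0, "v4": 1}]
B = [{"v1": 1, "v2": 1, "v3": 1, "v4": 1}]
print(separates({"v1"}, D, B))   # True: they differ on v1
print(separates({"v2"}, D, B))   # False: both assign v2 = 1
```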

22 Refinement as Separation and Learning. For systems of realistic size it is not possible to generate D and B, either explicitly or symbolically, and it is computationally expensive to separate a large D and B. Instead, generate samples of D (denoted S_D) and of B (denoted S_B) and try to infer the separating variables from the samples. State-of-the-art SAT solvers like Chaff can generate many samples in a short amount of time. Our algorithm is complete because a counterexample will eventually be eliminated in subsequent iterations.

23 Separation using Integer Linear Programming. Separating S_D from S_B as an Integer Linear Programming (ILP) problem: minimize Σ_{i=1..|I|} v_i subject to, for all s ∈ S_D and t ∈ S_B, Σ_{1 ≤ i ≤ |I|, s(v_i) ≠ t(v_i)} v_i ≥ 1, with each v_i ∈ {0, 1}. Here v_i = 1 if and only if v_i is in the separating set. There is one constraint per pair of states, stating that at least one of the variables that separate the two states must be selected.

24 Example. s_1 = (0, 1, 0, 1), s_2 = (1, 1, 1, 0), t_1 = (1, 1, 1, 1), t_2 = (0, 0, 0, 1).
Minimize Σ_{i=1..4} v_i subject to:
v_1 + v_3 ≥ 1   /* separating s_1 from t_1 */
v_2 ≥ 1   /* separating s_1 from t_2 */
v_4 ≥ 1   /* separating s_2 from t_1 */
v_1 + v_2 + v_3 + v_4 ≥ 1   /* separating s_2 from t_2 */
The optimal value of the objective function is 3, corresponding to one of the two optimal solutions {v_1, v_2, v_4} and {v_2, v_3, v_4}.
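
The same example can be solved mechanically with an off-the-shelf ILP library. The sketch below uses the open-source PuLP package (the slides only say a public-domain ILP solver was used; PuLP is my choice here, not necessarily the authors') and reproduces the optimum of 3; the four generated constraints are exactly the ones listed above.

```python
import pulp

# Example from the slide: S_D = {s1, s2}, S_B = {t1, t2} over variables v1..v4.
S_D = [(0, 1, 0, 1), (1, 1, 1, 0)]
S_B = [(1, 1, 1, 1), (0, 0, 0, 1)]
n = 4

prob = pulp.LpProblem("state_separation", pulp.LpMinimize)
v = [pulp.LpVariable(f"v{i + 1}", cat="Binary") for i in range(n)]

prob += pulp.lpSum(v)     # objective: minimise the size of the separating set

# One constraint per (s, t) pair: select at least one variable on which they differ.
for s in S_D:
    for t in S_B:
        prob += pulp.lpSum(v[i] for i in range(n) if s[i] != t[i]) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [f"v{i + 1}" for i in range(n) if v[i].value() == 1]
print(len(chosen), chosen)   # 3, e.g. ['v1', 'v2', 'v4'] or ['v2', 'v3', 'v4']
```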

25 Separation using Decision Tree Learning. ILP-based separation gives a minimal separating set but is computationally expensive. Decision-tree-learning-based separation is not optimal but is computationally efficient.

26 Decision Tree Learning. Input: a set of examples with classifications; each example assigns values to a set of attributes. Output: a decision tree, in which each internal node is a test on some attribute and each leaf corresponds to a classification. (Figure: a small decision tree whose internal nodes test variables such as v1 and v2.)

27 Separation using Decision Tree Learning. Separating S_D from S_B as a decision tree learning problem: the attributes correspond to the invisible variables; the classifications are +1 and -1, corresponding to S_D and S_B, respectively; the examples are S_D labelled +1 and S_B labelled -1. Separating set: all the variables present at the internal nodes of the decision tree. Proof: let d ∈ S_D and b ∈ S_B. The decision tree classifies d as +1 and b as -1, so there exists a node n in the decision tree, labelled with a variable v, such that d(v) ≠ b(v). By construction, v lies in the output set.

28 Example. s_1 = (0, 1, 0, 1), s_2 = (1, 1, 1, 0), t_1 = (1, 1, 1, 1), t_2 = (0, 0, 0, 1). E = ((0, 1, 0, 1), +1), ((1, 1, 1, 0), +1), ((1, 1, 1, 1), -1), ((0, 0, 0, 1), -1). (Figure: the resulting decision tree, whose internal nodes test v1, v2 and v4.) Separating set: {v_1, v_2, v_4}.

29 Decision Tree Learning Algorithm. DecTree(Examples, Attributes) (the ID3 algorithm):
1. Create a Root node for the tree.
2. If all Examples have the same classification, return Root with this classification.
3. Let A = BestAttribute(Examples, Attributes). Label Root with attribute A.
4. Let Examples_0 and Examples_1 be the subsets of Examples having value 0 and 1 for A, respectively.
5. Add a 0 branch to Root, pointing to the subtree generated by DecTree(Examples_0, Attributes - {A}).
6. Add a 1 branch to Root, pointing to the subtree generated by DecTree(Examples_1, Attributes - {A}).
7. Return Root.
The BestAttribute procedure returns an attribute (which is a variable in our case) that causes the maximum reduction in entropy when the examples are partitioned according to it.
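
The following sketch is a compact Python rendering of this procedure (written for this transcription, not the authors' code), applied to the example from slide 28. Ties in information gain are broken by attribute order, so the tree shape may differ from the one drawn on the slides, but the extracted separating set is the same.

```python
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    return -sum((c / len(labels)) * math.log2(c / len(labels)) for c in counts.values())

def best_attribute(examples, attributes):
    """Attribute with the maximum information gain (entropy reduction)."""
    labels = [label for _, label in examples]
    def gain(a):
        parts = [[l for x, l in examples if x[a] == val] for val in (0, 1)]
        remainder = sum(len(p) / len(examples) * entropy(p) for p in parts if p)
        return entropy(labels) - remainder
    return max(attributes, key=gain)

def dec_tree(examples, attributes, default=None):
    """ID3: returns a leaf label, or a node (attribute, zero_subtree, one_subtree)."""
    if not examples:
        return default
    labels = [label for _, label in examples]
    if len(set(labels)) == 1 or not attributes:
        return Counter(labels).most_common(1)[0][0]
    a = best_attribute(examples, attributes)
    rest = [b for b in attributes if b != a]
    majority = Counter(labels).most_common(1)[0][0]
    subtrees = [dec_tree([(x, l) for x, l in examples if x[a] == val], rest, majority)
                for val in (0, 1)]
    return (a, subtrees[0], subtrees[1])

def internal_attributes(tree):
    """The separating set: every attribute labelling an internal node."""
    if not isinstance(tree, tuple):
        return set()
    return {tree[0]} | internal_attributes(tree[1]) | internal_attributes(tree[2])

# Example from the slides: S_D labelled +1, S_B labelled -1 (attributes 0..3 are v1..v4).
examples = [((0, 1, 0, 1), +1), ((1, 1, 1, 0), +1),
            ((1, 1, 1, 1), -1), ((0, 0, 0, 1), -1)]
tree = dec_tree(examples, list(range(4)))
print(sorted(f"v{i + 1}" for i in internal_attributes(tree)))   # -> ['v1', 'v2', 'v4']
```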

30 Efficient Sampling. Direct the search towards samples that contain more information. Iterative algorithm: at each iteration, find new samples that are not separated by the current separating set. Let SepSet denote the separating set for the current set of samples. New samples that are not separated by SepSet are computed by solving Φ(SepSet) = ψ_f ∧ φ_f ∧ ⋀_{v_i ∈ SepSet} (v_i^d = v_i^b), where v_i^d and v_i^b denote the copies of v_i in the deadend state (constrained by ψ_f) and in the bad state (constrained by φ_f), respectively.

31 Efficient Sampling (pseudocode):
SepSet = ∅; i = 0;
repeat forever {
    if Φ(SepSet) is satisfiable, derive d_i and b_i from the solution; else exit;
    SepSet = separating set for (∪_{j=0..i} {d_j}, ∪_{j=0..i} {b_j});
    i = i + 1;
}
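
In Python the loop looks roughly as follows. This is only a skeleton: `find_unseparated_pair` stands in for the SAT query on Φ(SepSet) (returning a deadend/bad sample pair that agrees on every variable of the current separating set, or None when Φ is unsatisfiable), and `compute_separating_set` is either of the ILP or decision-tree procedures sketched earlier; both names are placeholders I introduce here, not APIs from the paper.

```python
def efficient_sampling(find_unseparated_pair, compute_separating_set):
    """Skeleton of the iterative sampling loop from the slides.
    find_unseparated_pair(sep_set): placeholder for the SAT query Φ(SepSet);
        returns (deadend, bad) or None if Φ(SepSet) is unsatisfiable.
    compute_separating_set(deadends, bads): placeholder for the separation
        procedure (ILP or decision tree learning) on the accumulated samples."""
    sep_set = set()
    deadends, bads = [], []
    while True:
        pair = find_unseparated_pair(sep_set)
        if pair is None:
            return sep_set                   # sep_set now separates all sampled pairs
        d_i, b_i = pair
        deadends.append(d_i)
        bads.append(b_i)
        sep_set = compute_separating_set(deadends, bads)
```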

32 Experiments. NuSMV frontend, Cadence SMV, a public-domain ILP solver, and Chaff. Experiments were conducted on a 1.5 GHz Athlon with 3 GB RAM running Linux. We used the IU family of circuits, which are various abstractions of an interface control circuit from Synopsys.

Experimental results (tables): for each IU circuit, the tables report Time and BDD nodes (in thousands) for Cadence SMV, and Time, BDD(k), S and L for Sampling with ILP, Sampling with DTL, and Efficient Sampling with DTL. The numeric entries are missing from this transcription.

33 Conclusions and Future Work. Our algorithm outperforms standard model checking in both execution time and memory requirements. Future work: exploit criteria other than the size of the separating set for characterizing a good refinement, and explore other learning techniques.
