Parts 3-6 are EXAMPLES for CSE 634

FINAL TEST
CSE 352 ARTIFICIAL INTELLIGENCE, Fall 2008

There are 6 pages in this exam. Please make sure you have all of them.

INTRODUCTION: Philosophical AI Questions

Q1. Write some pro and contra arguments about the TURING TEST as a test of artificial intelligence.

Q2. Describe some technological science-fiction inventions in your favorite Science Fiction movie that are realistic given today's state of technology, and some others that are not realistic.

PART ONE: Expert Systems

Q1. Write a simple (5-6 rules long) Expert System (in English) for a task (or tasks) chosen by you.

Q2. Conceptualize your system in PROPOSITIONAL LOGIC.

Q3. Pose some questions to your system in order to build a database of FACTS related to your task, and use backward chaining to show whether your task succeeds or fails depending on the answers to the questions.

HINT: Look at the Hmk1part1 solutions and our book.
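As a point of reference for Q3, here is a minimal backward-chaining sketch in Python over propositional Horn rules. The rule base and facts below are a hypothetical toy example, not a prescribed answer; any 5-6 rule system of your own can be run the same way.

```python
# Minimal backward-chaining sketch for a toy propositional expert system.
# RULES maps a conclusion to alternative premise lists (Horn rules).
# This rule base is a hypothetical example; substitute your own task.

RULES = {
    "go_hiking":    [["weather_good", "have_boots"]],
    "weather_good": [["sunny"], ["cloudy", "no_rain"]],
}

def backward_chain(goal, facts, rules=RULES):
    """True iff goal is a known fact or derivable from some rule's premises."""
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(backward_chain(p, facts, rules) for p in premises):
            return True
    return False

# FACTS gathered by posing questions to the user, as Q3 asks:
facts = {"cloudy", "no_rain", "have_boots"}
print(backward_chain("go_hiking", facts))  # True: the second weather rule fires
```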

PART TWO: Resolution and Predicate Logic

Q1. Write all resolvents of
Δ = { {a, ¬a, b, c}, {a, b}, {¬b, c}, {b} }.

Q2. Apply all possible Resolution Strategies (state which are complete) to verify whether the set Δ below is satisfiable or not.
Δ = { {a, b}, {¬a, b}, {b, c}, {a, c}, {¬a}, {a, b, ¬a}, {¬b, a, c, b}, {b}, {c} }

Q3. Use resolution deduction (and the completeness of your resolution strategy) to decide the validity of the following argument. From
A1: (((a => (¬b /\ c)) => (b => a)) \/ ((a => c) \/ b))
and
A2: ((a => b) => a),
we deduce that
B: ((a => (¬b /\ c)) => (a /\ (b /\ c))).

Q4. Prove the following Law of Quantifiers:
(∀x A(x) ∪ ∀x B(x)) => ∀x (A(x) ∪ B(x)).
Give an example of a non-empty domain X and formulas A(x), B(x) for which the inverse implication does not hold.

Q5. (a) Translate the following sentences into Predicate Logic and AI Logic under the intended interpretation. Write down your choice of predicates.
A1: All flowers are blue and pretty.
A2: All blue flowers are pretty.
A3: All flowers are pretty.
(b) Decide whether the ARGUMENT A1 /\ A2 => A3 is VALID or NOT VALID. Show work.

Q6. (a) Translate the following sentences into Predicate Logic and AI Logic under the intended interpretation. Write down your choice of predicates.
A1: All students are clever.
A2: Some clever students have blue eyes.
A3: There is a student with blue eyes.
(b) Decide whether the ARGUMENT A1 /\ A2 => A3 is VALID or NOT VALID. Show work.
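For Q1 and Q2, the resolvent computation can be checked mechanically. Below is a short Python sketch, assuming the usual set-of-literals clause representation with negation written as a leading "~"; the clause set shown is Δ from Q1 as transcribed above.

```python
from itertools import combinations

def neg(lit):
    """Complement of a literal: a <-> ~a."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    """All clauses obtained by resolving c1 and c2 on one complementary pair."""
    out = []
    for lit in c1:
        if neg(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {neg(lit)})))
    return out

# Delta from Q1, with negation written as "~":
delta = [frozenset(c) for c in [
    {"a", "~a", "b", "c"}, {"a", "b"}, {"~b", "c"}, {"b"},
]]
for c1, c2 in combinations(delta, 2):
    for r in resolvents(c1, c2):
        print(sorted(c1), "+", sorted(c2), "->", sorted(r))
```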

PART THREE: Classification Rules

Definition 1. Given a classification dataset DB with a set A = {a1, a2, ..., an} of attributes and a class attribute C with values {c1, c2, ..., ck} (k classes), any expression
a1 = v1 Λ ... Λ am = vm,
where ai ∈ A and the vi are corresponding values of the attributes, is called an attributes DESCRIPTION, for short a DESCRIPTION. An expression C = ck is called a class DESCRIPTION, for short a CLASS.

Definition 2. A CHARACTERISTIC FORMULA is any expression
C = ck → a1 = v1 Λ ... Λ am = vm,
written shortly as CLASS → DESCRIPTION.

Definition 3. A DISCRIMINANT FORMULA is any expression
a1 = v1 Λ ... Λ am = vm → C = ck,
written shortly as DESCRIPTION → CLASS.

Definition 4. A characteristic formula CLASS → DESCRIPTION is called a CHARACTERISTIC RULE of a given dataset DB iff it is TRUE in DB, i.e. when the following condition holds:
{o : DESCRIPTION} ∩ {o : CLASS} ≠ ∅,
where {o : DESCRIPTION} is the set of all records of DB corresponding to the attributes DESCRIPTION, and {o : CLASS} is the set of all records of DB corresponding to the CLASS description.

Definition 5. A discriminant formula DESCRIPTION → CLASS is called a DISCRIMINANT RULE for a given DB iff it is TRUE in the DB, i.e. the following two conditions hold:
1. {o : DESCRIPTION} ≠ ∅
2. {o : DESCRIPTION} ⊆ {o : CLASS}

Q1. Prove that for any database DB and any of its DISCRIMINANT rules DESCRIPTION → CLASS, the formula CLASS → DESCRIPTION is a CHARACTERISTIC RULE for the DB, but not vice versa. See the lecture notes for an example of a characteristic rule CLASS → DESCRIPTION such that the corresponding formula DESCRIPTION → CLASS is not a discriminant rule of DB, and write your own example for the dataset below.

Given the dataset (C is the class attribute):

Record  a1  a2  a3  a4  C
o1      +   1   1   0   +
o2      +   1   2   0   +
o3      -   0   0   0   -
o4      -   0   2   1   -
o5      +   1   1   0   +
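Definitions 4 and 5 can be tested mechanically. The Python sketch below computes {o : DESCRIPTION} and the two truth conditions for the dataset above; the record values come from the table, while the helper names are mine.

```python
# Records of the PART THREE dataset: (a1, a2, a3, a4, C)
DB = {
    "o1": ("+", 1, 1, 0, "+"),
    "o2": ("+", 1, 2, 0, "+"),
    "o3": ("-", 0, 0, 0, "-"),
    "o4": ("-", 0, 2, 1, "-"),
    "o5": ("+", 1, 1, 0, "+"),
}
ATTR = {"a1": 0, "a2": 1, "a3": 2, "a4": 3, "C": 4}

def records(description):
    """{o : DESCRIPTION} -- records satisfying a conjunction of att = v pairs."""
    return {o for o, row in DB.items()
            if all(row[ATTR[a]] == v for a, v in description.items())}

def is_discriminant(description, cls):
    """Definition 5: {o: DESC} is nonempty and contained in {o: CLASS}."""
    d, c = records(description), records({"C": cls})
    return bool(d) and d <= c

def is_characteristic(description, cls):
    """Definition 4: {o: DESC} and {o: CLASS} intersect."""
    return bool(records(description) & records({"C": cls}))

print(records({"a1": "+", "a2": 1}))               # o1, o2, o5
print(is_discriminant({"a1": "+", "a2": 1}, "+"))  # True
```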

Q2. Use the dataset given above to create an example of a characteristic rule CLASS → DESCRIPTION such that the corresponding formula DESCRIPTION → CLASS is not a discriminant rule.

Q3. For the dataset given above, find {o : DESCRIPTION} for the following descriptions:
1) a1 = + Λ a2 = 1
2) a3 = 1 Λ a4 = 0
3) a2 = 0 Λ a1 = +
4) the class sets {o : C = +} and {o : C = -}

Q4. For the following formulas, determine whether they are / are not DISCRIMINANT / CHARACTERISTIC RULES of our dataset:
1) a1 = + Λ a2 = 1 → C = -
2) C = + → a1 = - Λ a2 = 1 Λ a3 = 1
3) C = - → a1 = 1
4) C = + → a1 = + Λ a4 = 0
5) a1 = + Λ a2 = 1 Λ a3 = 1 → C = +
6) a1 = + Λ a3 = 2 → C = -

PART FOUR: Classification by Decision Tree Induction

Given TRAINING and TEST classification data as follows:

TRAINING
Record  a1  a2  a3  C
o1      1   1   0   +
o2      0   0   1   -
o3      0   1   1   +
o4      0   0   1   -
o5      1   1   0   +
o6      1   1   1   -
o7      0   0   0   -
o8      0   0   1   -

TEST
Record  a1  a2  a3  C
o1      1   1   0   +
o2      1   0   1   -
o3      0   0   1   +
o4      0   0   0   -

Q1. Use attribute a2 as the ROOT of the Decision Tree. Construct the tree (show steps) using your own choice of all node attributes.

Q2. Write down all the discriminant rules determined by your tree in two forms:
1. using the att = v expressions;
2. in PREDICATE form, using expressions att(x, v) instead of att = v.
3. Are all of the rules determined by your tree TRUE in the TRAINING SET?
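For item 3 of Q2, checking whether a rule is TRUE in the TRAINING set amounts to the discriminant-rule condition of PART THREE. A minimal sketch follows; the two rules in it are hypothetical examples read off a tree rooted at a2, not the expected answer.

```python
# TRAINING set from PART FOUR: (a1, a2, a3, C) for records o1..o8.
TRAIN = [
    (1, 1, 0, "+"), (0, 0, 1, "-"), (0, 1, 1, "+"), (0, 0, 1, "-"),
    (1, 1, 0, "+"), (1, 1, 1, "-"), (0, 0, 0, "-"), (0, 0, 1, "-"),
]

def rule_true(description, cls, data):
    """TRUE iff at least one record matches the description and every
    matching record has class cls (Definition 5 of PART THREE)."""
    matching = [r for r in data
                if all(r[i] == v for i, v in description.items())]
    return bool(matching) and all(r[3] == cls for r in matching)

# Hypothetical rules from a tree rooted at a2 (column indices: 0=a1, 1=a2, 2=a3):
print(rule_true({1: 0}, "-", TRAIN))        # a2 = 0 -> C = -          : True
print(rule_true({1: 1, 2: 0}, "+", TRAIN))  # a2 = 1 and a3 = 0 -> C = + : True
```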

Q3. Use the TEST data to evaluate the predictive accuracy of the set of rules generated from the given TRAINING data.

Q4. Re-write your TEST data using your own choice of real attributes and values instead of our abstract attribute values. Example: a1 = age (1 = old, 0 = young); a2 = color (1 = blue, 0 = red).

PART FIVE: ID3 Algorithm for DECISION TREE

Q1. For the dataset given below, split it into TRAIN and TEST using non-repeated Holdout (N - N/3 ; N/3).

Q2. Construct the Decision Tree using the ID3 algorithm, with your own choice of attributes as consecutive nodes.

Q3. Construct the Decision Tree using the ID3 algorithm with General Majority Voting (i.e. majority voting at any node of the tree), with your own choice of attributes as consecutive nodes.

Q4. Write down the set of discovered DISCRIMINANT RULES for Q2 and Q3. Evaluate the accuracy of your rules.

Q5. Evaluate the predictive accuracy of both sets of rules from Q4 on your choice of the TEST data.

CLASSIFICATION DATA:

Age    Income  Student  Credit Rating  Buys Computer
<=30   high    yes      fair           no
<=30   high    no       excellent      no
31-40  high    no       fair           yes
>40    medium  yes      fair           yes
>40    low     no       fair           yes
>40    low     yes      excellent      no
31-40  low     yes      excellent      no
<=30   medium  yes      fair           no
>40    high    yes      fair           no
>40    low     yes      fair           no
<=30   medium  yes      excellent      yes
>40    medium  no       excellent      yes
31-40  high    yes      fair           no
>40    medium  no       excellent      no
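ID3 chooses each node attribute by information gain, Gain(A) = Entropy(S) - Σ (|Sv|/|S|) Entropy(Sv). The sketch below computes entropy and per-attribute gain on the CLASSIFICATION DATA above; the function names are mine, the rows are from the table.

```python
from collections import Counter
from math import log2

# CLASSIFICATION DATA from PART FIVE: (Age, Income, Student, Credit, Buys)
DATA = [
    ("<=30",  "high",   "yes", "fair",      "no"),
    ("<=30",  "high",   "no",  "excellent", "no"),
    ("31-40", "high",   "no",  "fair",      "yes"),
    (">40",   "medium", "yes", "fair",      "yes"),
    (">40",   "low",    "no",  "fair",      "yes"),
    (">40",   "low",    "yes", "excellent", "no"),
    ("31-40", "low",    "yes", "excellent", "no"),
    ("<=30",  "medium", "yes", "fair",      "no"),
    (">40",   "high",   "yes", "fair",      "no"),
    (">40",   "low",    "yes", "fair",      "no"),
    ("<=30",  "medium", "yes", "excellent", "yes"),
    (">40",   "medium", "no",  "excellent", "yes"),
    ("31-40", "high",   "yes", "fair",      "no"),
    (">40",   "medium", "no",  "excellent", "no"),
]

def entropy(rows):
    """Shannon entropy of the class column (last field) of the rows."""
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c / total * log2(c / total) for c in counts.values())

def info_gain(rows, col):
    """Information gain of splitting the rows on attribute column col."""
    parts = {}
    for r in rows:
        parts.setdefault(r[col], []).append(r)
    remainder = sum(len(p) / len(rows) * entropy(p) for p in parts.values())
    return entropy(rows) - remainder

for col, name in enumerate(["Age", "Income", "Student", "Credit Rating"]):
    print(name, round(info_gain(DATA, col), 3))  # ID3 roots at the max-gain attribute
```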

PART SIX: Neural Networks Classification

Given the following records (Training Sample):

a1    a2    a3    Class
0.5   0     0.2   1
0.1   -0.1  -0.2  0.3
0.2   0.1   0.2   -0.3
-0.1  1     0     0

1. Construct a Neural Network with one hidden layer (your own topology) and evaluate the passage of TWO EPOCHS. Use Learning Rate l = 0.7. REMEMBER: YOU HAVE TO SET YOUR INITIAL WEIGHTS AND BIASES RANDOMLY.
2. Write down the terminating conditions for your network.
3. Write a condition for success, i.e. how you decide that a record is well classified.
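A minimal sketch of what question 1 involves, assuming a 3-2-1 topology (my choice), sigmoid units, and the standard backpropagation update with learning rate l = 0.7. The seeded random initialization stands in for "set your weights randomly", and training on just the first record of the sample is illustrative only.

```python
import math
import random

random.seed(0)          # fixes the "random" initial weights for reproducibility
L = 0.7                 # learning rate given in the exam
N_IN, N_HID = 3, 2      # topology (my choice): 3 inputs, 2 hidden units, 1 output

# Randomly initialized weights and biases, as the exam requires.
w_h = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
b_h = [random.uniform(-0.5, 0.5) for _ in range(N_HID)]
w_o = [random.uniform(-0.5, 0.5) for _ in range(N_HID)]
b_o = random.uniform(-0.5, 0.5)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_one(x, target):
    """One forward + backward pass for a single record (standard backprop)."""
    global b_o
    h = [sigmoid(sum(w * xi for w, xi in zip(w_h[j], x)) + b_h[j])
         for j in range(N_HID)]
    o = sigmoid(sum(w * hj for w, hj in zip(w_o, h)) + b_o)
    err_o = o * (1 - o) * (target - o)            # output-unit error term
    err_h = [h[j] * (1 - h[j]) * w_o[j] * err_o   # hidden-unit error terms
             for j in range(N_HID)]
    for j in range(N_HID):                        # weight and bias updates
        w_o[j] += L * err_o * h[j]
        for i in range(N_IN):
            w_h[j][i] += L * err_h[j] * x[i]
        b_h[j] += L * err_h[j]
    b_o += L * err_o
    return o

record, target = (0.5, 0.0, 0.2), 1.0   # first record of the Training Sample
for epoch in range(2):                  # TWO EPOCHS, as the exam asks
    print(f"epoch {epoch + 1}: output = {train_one(record, target):.4f}")
```

A natural terminating condition (question 2) is a fixed epoch count or all output errors falling below a threshold; a natural success condition (question 3) is that the rounded output matches the record's class.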