CERTAINTY FACTORS. Expert System Lab Work. Lecture #6


Uncertainty There is uncertainty in the mind of experts when they make decisions based on their expertise.

Representing Uncertainty Basic concepts: Probability, Certainty Factors, Bayesian Reasoning, Fuzzy Logic, Neural Networks, Genetic Algorithms

Causes of Uncertainty Incomplete and uncertain information; unknown data; conflicting viewpoints among experts; imprecise language (always, often, seldom, sometimes)

Causes of Uncertainty: Imprecise Language Mean values that subjects assigned to frequency terms in two studies (Ray Simpson, 1944; Milton Hakel, 1968):

Term                     Simpson (1944)   Hakel (1968)
Always                   99               100
Very often               88               87
Usually                  85               79
Often                    78               74
Rather often             65               74
Generally                78               72
Frequently               73               72
About as often as not    50               50
Now and then             20               34
Sometimes                20               29
Occasionally             20               28
Once in a while          15               22
Not often                13               16
Usually not              10               16
Seldom                   10               9
Hardly ever              7                8
Very seldom              6                7
Rarely                   5                5
Almost never             3                2
Never                    0                0

Certainty Theory A certainty factor (cf) is a value that measures an expert's degree of belief. The maximum value of cf is +1.0 (definitely true) and the minimum is -1.0 (definitely false). Two aspects: certainty in the evidence (the evidence itself can have a certainty factor attached) and certainty in the rule.

Computation of Certainty Factors 1. Net Belief Model (Shortliffe and Buchanan):

CF(Rule) = MB(H,E) - MD(H,E)

MB(H,E) = 1                                          if P(H) = 1
        = (max[P(H|E), P(H)] - P(H)) / (1 - P(H))    otherwise

MD(H,E) = 1                                          if P(H) = 0
        = (min[P(H|E), P(H)] - P(H)) / (0 - P(H))    otherwise
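The net belief model above can be sketched in code. This is an illustrative Python sketch (the lab itself uses CLIPS); the function names are assumptions, not part of the lecture material.

```python
# Sketch of the Shortliffe-Buchanan net belief model:
# CF(Rule) = MB(H,E) - MD(H,E), computed from probabilities.

def measure_of_belief(p_h, p_h_given_e):
    """MB(H,E): increase in belief in hypothesis H given evidence E."""
    if p_h == 1:
        return 1.0
    return (max(p_h_given_e, p_h) - p_h) / (1 - p_h)

def measure_of_disbelief(p_h, p_h_given_e):
    """MD(H,E): increase in disbelief in hypothesis H given evidence E."""
    if p_h == 0:
        return 1.0
    return (min(p_h_given_e, p_h) - p_h) / (0 - p_h)

def certainty_factor(p_h, p_h_given_e):
    """CF = MB - MD."""
    return measure_of_belief(p_h, p_h_given_e) - measure_of_disbelief(p_h, p_h_given_e)

# Evidence that raises P(H) from 0.4 to 0.7 yields a positive CF:
print(certainty_factor(0.4, 0.7))  # ~ 0.5
```

Note that for any single piece of evidence only one of MB or MD is nonzero: evidence that raises P(H) contributes belief, evidence that lowers it contributes disbelief.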

Certainty Factors 2. Direct Interview with the Expert:

Uncertain term          Certainty factor
Definitely not          -1.0
Almost certainly not    -0.8
Probably not            -0.6
Maybe not               -0.4
Unknown                 -0.2 to +0.2
Maybe                   +0.4
Probably                +0.6
Almost certainly        +0.8
Definitely              +1.0

Example Expert: "If headache and a cold and fever, then the most likely diagnosis is influenza." Rule: IF evidence1 = headache AND evidence2 = cold AND evidence3 = fever THEN diagnosis = influenza {cf 0.8}

Expert System with CF In an expert system with CF, the knowledge base is composed of a set of rules with the syntax: IF <evidence> THEN <hypothesis> {cf}. The CF expresses the degree of belief in hypothesis H when evidence E has occurred.

Example The degree of belief in hypothesis H when evidence E has occurred is: cf(H,E) = cf(E) * cf(rule). E.g.: IF the sky is clear THEN the forecast is sunny {cf 0.8}. With cf(sky is clear) = 0.5: cf(H,E) = 0.5 * 0.8 = 0.4
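The propagation rule above is a single multiplication. A minimal Python sketch (illustrative; the function name is an assumption):

```python
# Propagating certainty from uncertain evidence through a rule:
# cf(H,E) = cf(E) * cf(rule)

def cf_conclusion(cf_evidence, cf_rule):
    return cf_evidence * cf_rule

# IF sky is clear {cf(E) = 0.5} THEN the forecast is sunny {rule cf 0.8}
print(cf_conclusion(0.5, 0.8))  # 0.4
```
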

Multiple Antecedents How is the CF computed when a rule has two or more pieces of evidence? With conjunction (AND): use the minimum cf of the evidence. With disjunction (OR): use the maximum cf of the evidence.

Conjunctive Antecedents - Example Conjunctive rules: cf(H, E1 AND E2 AND ... AND Ei) = min{cf(E1), cf(E2), ..., cf(Ei)} * cf(rule). IF there are dark clouds (E1) AND the wind is getting stronger (E2) THEN it will rain {cf 0.8}. If cf(E1) = 0.5 and cf(E2) = 0.9, then cf(H, E) = min{0.5, 0.9} * 0.8 = 0.4

Disjunctive Antecedents - Example Disjunctive rules: cf(H, E1 OR E2 OR ... OR Ei) = max{cf(E1), cf(E2), ..., cf(Ei)} * cf(rule). IF there are dark clouds (E1) OR the wind is getting stronger (E2) THEN it will rain {cf 0.8}. If cf(E1) = 0.5 and cf(E2) = 0.9, then cf(H, E) = max{0.5, 0.9} * 0.8 = 0.72
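The two antecedent-combination rules above can be sketched together. An illustrative Python sketch (helper names are assumptions, not from the lecture):

```python
# Combining multiple antecedents: AND takes the minimum evidence cf,
# OR takes the maximum; the result is then scaled by the rule's cf.

def cf_and(cf_rule, *cf_evidence):
    return min(cf_evidence) * cf_rule

def cf_or(cf_rule, *cf_evidence):
    return max(cf_evidence) * cf_rule

# Rain rule {cf 0.8} with cf(E1) = 0.5 and cf(E2) = 0.9:
print(cf_and(0.8, 0.5, 0.9))  # ~ 0.4  (conjunctive antecedents)
print(cf_or(0.8, 0.5, 0.9))   # ~ 0.72 (disjunctive antecedents)
```
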

Similarly Concluded Rules How do we determine the CF when two rules reach the same conclusion? Rule 1: IF the weatherperson predicts rain (E1) THEN it will rain {cf1 0.7}. Rule 2: IF the farmer predicts rain (E2) THEN it will rain {cf2 0.9}

Similarly Concluded Rules - Example Two rules with the same conclusion: IF the weatherperson predicts rain (E1) THEN it will rain {cf1 0.7}; IF the farmer predicts rain (E2) THEN it will rain {cf2 0.9}. Suppose cf(E1) = 1.0 and cf(E2) = 1.0: cf1(H, E1) = cf(E1) * cf(rule 1) = 1.0 * 0.7 = 0.7; cf2(H, E2) = cf(E2) * cf(rule 2) = 1.0 * 0.9 = 0.9

Similarly Concluded Rules Formula:

cf(cf1, cf2) = cf1 + cf2 * (1 - cf1)                    if cf1 >= 0 and cf2 >= 0
             = cf1 + cf2 * (1 + cf1)                    if cf1 < 0 and cf2 < 0
             = (cf1 + cf2) / (1 - min{|cf1|, |cf2|})    otherwise

Similarly Concluded Rules The new CF value from the facts above can be computed: cf(cf1, cf2) = cf1 + cf2 * (1 - cf1) = 0.7 + 0.9 * (1 - 0.7) = 0.97
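The combination of similarly concluded rules can be sketched in Python, covering the both-positive, both-negative, and mixed-sign cases (the same cases that appear as separate rules in the CLIPS lab). The function name is an assumption:

```python
# Combining two certainty factors that support the same conclusion.

def cf_combine(cf1, cf2):
    if cf1 >= 0 and cf2 >= 0:           # both supporting
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:             # both opposing
        return cf1 + cf2 * (1 + cf1)
    # conflicting evidence: one positive, one negative
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Weatherperson (0.7) and farmer (0.9) both predict rain:
print(cf_combine(0.7, 0.9))  # ~ 0.97
```

Because the formula is commutative and associative on same-sign inputs, more than two rules can be folded in pairwise, which is how a chain of facts (as in the exercise below) is combined.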

EXERCISE Determine the new combined CF value from the facts below: 1. IF fever THEN typhus {cf -0.39} 2. IF the thrombocyte count is low THEN typhus {cf -0.51} 3. IF the body is weak THEN typhus {cf 0.87} 4. IF diarrhea/constipation THEN typhus {cf 0.63}

Combining CF with CLIPS


Perform these steps: Load -> Reset -> (agenda). Observe the order in which the rules are run: rules with a higher salience are executed first.

Modify the salience of the rules:

(defrule both-positif
  (declare (salience 6))
  .........

(defrule both-negatif
  (declare (salience 10))
  .........

Repeat the same steps: Load -> Reset -> Agenda. In the Agenda you will see that rules with a higher salience are executed first.

Combining CF with CLIPS Resulting CF = 0.8390766