Analysis of Classification in Interval-Valued Information Systems


Analysis of Classification in Interval-Valued Information Systems AMAURY CABALLERO, KANG YEN, EDUARDO CABALLERO Department of Electrical & Computer Engineering, Florida International University, W. Flagler Street, Miami, Florida, USA

Abstract. The problem of classification has been studied by many authors, and different methods have been developed. Frequently the data for the different attributes are obtained from sources where the value recorded for each attribute may change due to noise, equipment imprecision, etc. Under this condition it becomes useful to define intervals within which those parameters may vary. Using these intervals and applying rough sets, it is possible in many cases to optimize, that is, reduce, the number of attributes used in the classification. When an object cannot be completely determined from the rules obtained by the rough set analysis, fuzzy logic is a useful complementary approach. This paper analyzes the conditions under which classification is possible using a combination of rough sets and fuzzy logic. Examples are included for clarification purposes.

Key Words: Fuzzy Logic, Rough Sets, Classification, Interval-Valued Information Systems

1 Introduction

When analyzing an information system or a database, we frequently face problems like attribute redundancy and missing or diffuse values, which are due in general to noise and partially missing data. Rough set theory is a very useful approach for minimizing the number of attributes necessary to represent the desired category structure, by eliminating redundancy. The lack of data, or of complete knowledge of the system, makes developing a model a practically impossible task using conventional means. This lack of data can be attributed to sensor failures, or simply to incomplete system information. Finally, diffuse values can be related to noise or imprecise measurements from sensors.
In many applications the information is collected from different sensors and is corrupted by noise and outliers. The present work analyzes the use of combined rough sets and fuzzy logic in the classification task. Limitations in the classification process due to the imposed data input conditions are presented, and examples are included to demonstrate the use of rough and fuzzy sets in classification applications. Rough set theory, developed by Pawlak [1, 2], can be used as a tool to recover data dependencies and to reduce the number of attributes contained in a given data set using the data alone, without additional information [3, 4]. In many practical cases, for example when information is received from a group of sensors in robotics, image processing, etc., the received values representing the same quantity may vary within some interval due to external causes. One frequently used procedure for dealing with interval-valued information systems is discretization. Yee Leung et al. [5] have presented a very useful rough-set-based method for obtaining rules that use the minimum number of attributes, giving a first approach to the classification of objects. Their method can be summarized in the following steps.

2 Attribute Reduction Using Rough Sets

1. Table preparation: From the original table, generate a new one presenting the minimum and maximum values of each parameter for each object.

2. Define the misclassification rates α_ij^k, where α_ij^k denotes the misclassification error between classes i and j for attribute k, l_i and l_j are the minimum values, and u_i and u_j are the maximum values:
α_ij^k = 0, if [l_i, u_i] ∩ [l_j, u_j] = ∅;
α_ij^k = min{min(u_i − l_j, u_j − l_i)/(u_i − l_i), 1}, if [l_i, u_i] ∩ [l_j, u_j] ≠ ∅.
For clarifying the concept, a numerical example is presented in Fig. 1. The region of intersection of the values of the two objects represents the zone where the classification becomes problematic. For the presented case, the misclassification error between classes i and j for attribute k is α_ij^k = min{min(3.9 − 2.8, 3.6 − 3.2)/(3.9 − 3.2), 1} = 0.57. Note that in general α_ij^k ≠ α_ji^k.

3. Define α_ij as the error of class U_i being misclassified into class U_j in the system: α_ij = min{α_ij^k : 1 ≤ k ≤ m}, where m is the number of attributes.

4. Find the maximum mutual misclassification error between classes for each attribute: β_ij^k = max{α_ij^k, α_ji^k}, so that β_ij^k = β_ji^k.

5. For each pair of classes, find the permissible misclassification rate between U_i and U_j in the system: β_ij = min{β_ij^k : 1 ≤ k ≤ m}. Let us define the permissible misclassification rate between classes as α. If β_ij ≤ α, there must exist an attribute A_k such that, by using A_k, the two classes U_i and U_j can be separated within the permissible misclassification rate α.

6. Prepare a table with the α-tolerance relations, placing 1 where U_i and U_j cannot be separated (β_ij > α) and 0 where U_i and U_j can be separated (β_ij ≤ α). This is the α-tolerance matrix.

7. Find the discernibility matrix: for each pair (U_i, U_j), list the attributes A_k for which β_ij^k ≤ α.

8. Find the minimal implicants: f_i^α = ∧{∨ D_ij^α}.

9. Write the rules: for each class, the minimal implicants are the rule antecedents.
The consequents are given by the classes that present a 1 in the corresponding position of the α-tolerance matrix.

3 Possible Situations

Three different situations can appear in the solution of this task.

Case #1: All the misclassification errors α_ij^k are smaller than the permissible misclassification rate between classes α, or are equal to zero in the limit. In this case each object is classified without any possible error, or with an error bounded by α.

Step 1: One example reflecting this situation is given in Table 1, which presents the minimum and maximum values of five parameters (A_k) for five different objects (U_i).
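The misclassification rate defined in step 2 of Section 2 can be sketched in a few lines of Python (the function name is ours; the interval numbers are those of Fig. 1):

```python
def misclassification_rate(l_i, u_i, l_j, u_j):
    """Error of class i being misclassified into class j on one attribute:
    zero when the intervals do not overlap, otherwise the overlap as seen
    from class i's interval [l_i, u_i], capped at 1."""
    if u_i < l_j or u_j < l_i:              # [l_i, u_i] and [l_j, u_j] disjoint
        return 0.0
    return min(min(u_i - l_j, u_j - l_i) / (u_i - l_i), 1.0)

# Numerical example of Fig. 1: l_i = 3.2, u_i = 3.9, l_j = 2.8, u_j = 3.6
a_ij = misclassification_rate(3.2, 3.9, 2.8, 3.6)   # ≈ 0.57
a_ji = misclassification_rate(2.8, 3.6, 3.2, 3.9)   # ≈ 0.50, so a_ij ≠ a_ji in general
```

The asymmetry arises because the overlap is normalized by each class's own interval width.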

Fig. 1. Region of coincidence of classes i and j for attribute k (l_i = 3.2, u_i = 3.9, l_j = 2.8, u_j = 3.6).

Step 2: The misclassification rates of objects U_1 and U_2 for the parameters A_k, 1 ≤ k ≤ 5, are calculated as follows: α_12^1 = 0.5; α_12^2 = 0.07; α_12^3 = 0.21; α_12^4 = 0.46; α_12^5 = 0. From the obtained results, it is clear that the minimum value is α_12^5 = 0. Following the same procedure with all the objects, it is possible to get every α_ij.

Step 3: Definition of α_ij. These values are given in Table 2.

Step 4: β_ij^k = max{α_ij^k, α_ji^k} = 0.

Step 5: β_ij = min{β_ij^k : 1 ≤ k ≤ 5} = 0.

Step 6: α-tolerance relation matrix: as all the β_ij = 0, it is possible to differentiate each object from all the others, so this matrix has 1 only in the positions where i = j, similarly to Table 2.

Step 7: Discernibility matrix: the parameters for which β_ij^k = 0 are presented in Table 3.

Step 8: Minimal implicants, obtained from each column of Table 3 as f_i^α = ∧{∨ D_ij^α}:
f_1 = (A_1 ∧ A_5) ∨ (A_2 ∧ A_5)
f_2 = (A_1 ∧ A_3 ∧ A_5) ∨ (A_2 ∧ A_3 ∧ A_5)
f_3 = A_1 ∨ (A_2 ∧ A_3) ∨ (A_2 ∧ A_4)
f_4 = (A_1 ∧ A_3 ∧ A_5) ∨ (A_2 ∧ A_3 ∧ A_5)
f_5 = A_1 ∨ (A_2 ∧ A_3) ∨ (A_2 ∧ A_4)

Step 9: The following rules can be selected.
Rule #1: If A_1 ∈ [4.1, 4.9] and A_5 ∈ [4.8, 5.6], then the object can be classified as U_1.
Rule #2: If A_1 ∈ [4.5, 5.2], A_3 ∈ [3.3, 4.8] and A_5 ∈ [3.9, 4.7], then the object can be classified as U_2.
Rule #3: If A_1 ∈ [2.1, 2.8], then the object can be classified as U_3.
Rule #4: If A_1 ∈ [3.1, 5.4], A_3 ∈ [1.9, 2.8] and A_5 ∈ [2.4, 4.0], then the object can be classified as U_4.
Rule #5: If A_1 ∈ [1.5, 2.0], then the object can be classified as U_5.

The objects are uniquely classified using only three parameters: A_1, A_3, and A_5; A_2 and A_4 are not necessary for the classification.

Case #2: Not all the misclassification errors α_ij^k are smaller than the permissible misclassification rate between classes α. One example of application of this situation is the iris classification problem. R.
Fisher [6] presents the classes Setosa, Versicolor, and Virginica, defined by four attributes: SL (sepal length), SW (sepal width), PL (petal length), and PW (petal width). The same classes are used in our example, which was previously presented in [7]. We calculate the mean, as well as the minimum and maximum, of each attribute for each class. The results are presented in Table 4.
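The per-class summaries behind Table 4 (minimum, mean, and maximum of each attribute) can be produced with a few lines; the two samples per class below are illustrative stand-ins, not the full 150-sample Fisher data:

```python
# Illustrative samples per class (not the full Fisher data);
# each tuple is (SL, SW, PL, PW).
samples = {
    'setosa':     [(5.1, 3.5, 1.4, 0.2), (4.9, 3.0, 1.3, 0.2)],
    'versicolor': [(5.6, 3.0, 4.5, 1.5), (6.1, 2.8, 4.7, 1.2)],
}
attributes = ('SL', 'SW', 'PL', 'PW')

def class_summary(rows):
    """Per-attribute (min, mean, max) over one class's samples, as in Table 4."""
    columns = zip(*rows)                  # one sequence of values per attribute
    return {a: (min(c), sum(c) / len(c), max(c))
            for a, c in zip(attributes, columns)}

table4 = {cls: class_summary(rows) for cls, rows in samples.items()}
print(table4['setosa']['PL'])   # (min, mean, max) of setosa petal length
```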

Table 1. Example of an Interval-Valued Information System
      A_1       A_2      A_3      A_4      A_5
U_1   4.1, , , , , 5.6
U_2   4.5, , , , , 4.7
U_3   2.1, , , , , 5.4
U_4   3.1, , , , , 4.0
U_5   1.5, , , , , 4.8

Table 2. Error of Misclassification of Object U_i into U_j
α_ij   U_1   U_2   U_3   U_4   U_5
U_1
U_2
U_3
U_4
U_5

Table 3. Discernibility Matrix
       U_1                   U_2           U_3              U_4           U_5
U_1    1                     {A_5}         {A_1, A_2}       {A_5}         {A_1, A_2, A_4, A_5}
U_2    {A_5}                 1             {A_1, A_2}       {A_3}         {A_1, A_2}
U_3    {A_1, A_2}            {A_1, A_2}    1                {A_1, A_2}    {A_1, A_3, A_4}
U_4    {A_5}                 {A_3}         {A_1, A_2}       1             {A_1, A_2}
U_5    {A_1, A_2, A_4, A_5}  {A_1, A_2}    {A_1, A_3, A_4}  {A_1, A_2}    1

Table 4. Attributes for Each Class
            Setosa               Versicolor           Virginica
            x_av   Min   Max     x_av   Min   Max     x_av   Min   Max
SL (A_1)
SW (A_2)
PL (A_3)
PW (A_4)

Following the steps in a way similar to the previous case, the error of objects in class U_i being misclassified into class U_j in the system, defined as α_ij = min{α_ij^k : 1 ≤ k ≤ m}, is given in Table 5.

Table 5. Error of Misclassification of Object U_i into U_j
α_ij   U_1   U_2   U_3
U_1
U_2
U_3

Selecting α = 0.2, the permissible misclassification rates for the present example are shown in Table 6.

Table 6. Permissible Misclassification Rate between Classes U_i and U_j
β_ij   U_1   U_2   U_3
U_1
U_2
U_3
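Step 8 of Section 2 (the minimal implicants) can be reproduced mechanically from the discernibility sets of Table 3: each object's implicant is a conjunction of per-pair disjunctions, reduced to its minimal conjunctive terms. A sketch for U_1 (the helper name is ours):

```python
from itertools import product

# Discernibility sets of U_1 against the other objects (column U_1 of Table 3)
D_1 = [{'A5'}, {'A1', 'A2'}, {'A5'}, {'A1', 'A2', 'A4', 'A5'}]

def minimal_implicants(clauses):
    """Turn a conjunction of attribute-set disjunctions into its minimal
    conjunctive terms: pick one attribute per set, then discard every term
    that strictly contains another term."""
    terms = {frozenset(choice) for choice in product(*clauses)}
    return {t for t in terms if not any(s < t for s in terms)}

for term in minimal_implicants(D_1):
    print(sorted(term))     # the two implicants A1∧A5 and A2∧A5, in either order
```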

The matrix T_A^0.2 of the α-tolerance relations, in which 1 marks the pairs with all β_ij^k > α, is represented in Table 7.

Table 7. α-Tolerance Relations (T_A^0.2)
       U_1   U_2   U_3
U_1    1     0     0
U_2    0     1     1
U_3    0     1     1

From Table 7 it is clear that object U_1 (Setosa) can be uniquely defined from the given attributes, but objects U_2 (Versicolor) and U_3 (Virginica) may not be separated. This situation can be expressed by

S_A^0.2(U_1) = {U_1}
S_A^0.2(U_2) = S_A^0.2(U_3) = {U_2, U_3}

where S_A^0.2(U_i) denotes the set of objects possibly indiscernible from U_i by A within the misclassification rate α = 0.2. The 0.2-discernibility sets are given in Table 8.

Table 8. Discernibility Set
       U_1          U_2          U_3
U_1    1            {A_3, A_4}   {A_3, A_4}
U_2    {A_3, A_4}   1
U_3    {A_3, A_4}                1

The obtained function is f = A_3 ∧ A_4. Using rough sets, it has been demonstrated that the important attributes for the classification are A_3 (PL, petal length) and A_4 (PW, petal width). From the previous results, the following rules can be extracted:

Rule #1: IF A_3 ∈ [1, 1.9] or A_4 ∈ [0.1, 0.6] THEN it is U_1 (Setosa).
Rule #2: IF A_3 ∈ [3.0, 5.1] or A_4 ∈ [1.0, 1.8] THEN it can be U_2 (Versicolor) or U_3 (Virginica).
Rule #3: IF A_3 ∈ [4.5, 6.9] or A_4 ∈ [1.4, 2.5] THEN it can be U_2 (Versicolor) or U_3 (Virginica).

Rule #1 is clear for Setosa classification. Note that from Rule #2 and Rule #3 it is possible to develop other rules that substitute them:

Rule #4: IF A_3 ∈ [3.0, 4.5] or A_4 ∈ [1.0, 1.4] THEN it is U_2 (Versicolor).
Rule #5: IF A_3 ∈ [5.1, 6.9] or A_4 ∈ [1.8, 2.5] THEN it is U_3 (Virginica).
Rule #6: IF A_3 ∈ [4.5, 5.1] or A_4 ∈ [1.4, 1.8] THEN it can be U_2 (Versicolor) or U_3 (Virginica).

In order to select between U_2 and U_3 within the coincident interval, one possibility is to use fuzzy logic. E. D. Cox has presented a useful method for dealing with this situation using the Compatibility Index (CI) [8].
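The separability pattern of Table 7 can be verified numerically from the PL and PW interval bounds that appear in Rules #1-#3, using the misclassification rate of Section 2 (function and variable names are ours):

```python
def alpha(l_i, u_i, l_j, u_j):
    """Misclassification rate of class i into class j on one attribute."""
    if u_i < l_j or u_j < l_i:
        return 0.0
    return min(min(u_i - l_j, u_j - l_i) / (u_i - l_i), 1.0)

# [min, max] bounds of petal length (A_3) and petal width (A_4), as in Rules #1-#3
intervals = {
    'setosa':     {'PL': (1.0, 1.9), 'PW': (0.1, 0.6)},
    'versicolor': {'PL': (3.0, 5.1), 'PW': (1.0, 1.8)},
    'virginica':  {'PL': (4.5, 6.9), 'PW': (1.4, 2.5)},
}

def beta(ci, cj, attr):
    """Mutual misclassification error between two classes on one attribute."""
    i, j = intervals[ci][attr], intervals[cj][attr]
    return max(alpha(*i, *j), alpha(*j, *i))

# Setosa separates from the other classes with zero error on both attributes,
# while Versicolor vs. Virginica exceeds alpha = 0.2 on both, which is why
# Table 7 marks that pair as not separable.
print(beta('setosa', 'versicolor', 'PL'))       # 0.0
print(beta('versicolor', 'virginica', 'PL'))    # ≈ 0.29
print(beta('versicolor', 'virginica', 'PW'))    # ≈ 0.5
```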
The authors propose the following procedure:
1) For each object U_i in the fired rule, find the degree of membership of the measured value with respect to the imposed conditions, for each participating attribute A_k.
2) Find the compatibility index (CI) for each object.
3) Compare the different compatibility indexes and select the object with the greater one.

For the example, a triangular membership function was proposed for each interval, with its maximum coincident with the mean value of the interval and its domain running from the minimum to the maximum value of the interval. The compatibility indexes are calculated in this case for a single measurement, given by: SL (sepal length) = 5.6, SW (sepal width) = 3.0, PL (petal length) = 4.5, PW (petal width) = 1.5.
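A sketch of the compatibility-index computation for this measurement, restricted to PL and PW as in Rule #6. The triangular memberships follow the text (peak at the class mean, support from minimum to maximum); the interval bounds come from Rules #2 and #3, the class means (about 4.26 and 1.33 for Versicolor, 5.55 and 2.03 for Virginica) are taken from Fisher's data since the Table 4 entries were lost, and averaging the per-attribute memberships is our assumed form of Cox's index; it approximately reproduces the indexes 0.7 and 0.1 reported in the text:

```python
def tri(x, left, peak, right):
    """Triangular membership: 0 at the interval ends, 1 at the mean (peak)."""
    if left < x <= peak:
        return (x - left) / (peak - left)
    if peak < x < right:
        return (right - x) / (right - peak)
    return 0.0

# (min, mean, max) per attribute; the means are approximate Fisher values (assumption)
classes = {
    'versicolor': {'PL': (3.0, 4.26, 5.1), 'PW': (1.0, 1.33, 1.8)},
    'virginica':  {'PL': (4.5, 5.55, 6.9), 'PW': (1.4, 2.03, 2.5)},
}
measurement = {'PL': 4.5, 'PW': 1.5}    # the single measurement given in the text

def compatibility_index(cls):
    # assumed aggregation: average membership over the attributes of Rule #6
    degrees = [tri(measurement[a], lo, peak, hi)
               for a, (lo, peak, hi) in classes[cls].items()]
    return sum(degrees) / len(degrees)

ci_versicolor = compatibility_index('versicolor')   # ≈ 0.68
ci_virginica = compatibility_index('virginica')     # ≈ 0.08
# the greater index selects Versicolor, as the paper reports
```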

Rule #6 has been used, taking into consideration only the petal length (A_3) and the petal width (A_4), as stated by the rule. In this case the obtained compatibility indexes for Versicolor and Virginica are 0.7 and 0.1, respectively. From the two compatibility indexes, it is clear that the selection is Versicolor. A test was made with all the objects in the table used in the example [6]: there are 50 Virginica and 50 Versicolor samples, and the algorithm fails on one Versicolor and one Virginica. This gives an average classification rate of 98% for the analyzed table.

Case #3: All the misclassification errors α_ij^k are bigger than the permissible misclassification rate between classes α. In this case the only possibility is to use fuzzy logic, or any other method that permits comparing the different objects using all the parameters and, based on the comparison, selecting the one that presents the smallest differences with the imposed parameters.

4 Conclusions

Attribute minimization using rough set theory is extremely useful when dealing with large databases. If the number of possible values for the attributes is large, the selection of interval values is mandatory. Fuzzy logic can be used together with rough set theory to obtain a unique response in cases where this is not possible using rough set theory alone. Three possible situations are presented, based on the misclassification error. In the first two, the number of attributes can be minimized, and it is also possible to classify the objects using rough sets and fuzzy logic. In the third case, it is not possible to reduce the number of parameters for some fixed permissible misclassification rate between classes α, and the only possibility is to apply fuzzy logic or any other method looking for similarity between the received information and the different classes.

References
[1] Z. Pawlak, Rough Sets, Int'l J. of Computer & Information Sciences, No. 11, 1982.
[2] Z.
Pawlak, Rough Sets: Theoretical Aspects of Reasoning About Data, Dordrecht: Kluwer.
[3] E. A. Rady et al., A Modified Rough Set Approach to Incomplete Information Systems, Journal of Applied Mathematics and Decision Sciences, Vol. 2007.
[4] F.-H. Wang, On Acquiring Classification Knowledge from Noisy Data Based on Rough Sets, Expert Systems with Applications, Vol. 29, 2005.
[5] Y. Leung et al., A Rough Set Approach for the Discovery of Classification Rules in Interval-Valued Information Systems, Int'l J. of Approximate Reasoning, No. 47, 2007.
[6] R. Fisher, The Use of Multiple Measurements in Taxonomic Problems, Annals of Eugenics, No. 7, 1936.
[7] A. Caballero, K. Yen, Y. Fang, Classification with Diffuse or Incomplete Information, WSEAS Transactions on Systems and Control, Vol. 3, Issue 6, June 2008.
[8] E. D. Cox, Fuzzy Logic for Business and Industry, Rockland: Charles River Media.


More information

LEC 4: Discriminant Analysis for Classification

LEC 4: Discriminant Analysis for Classification LEC 4: Discriminant Analysis for Classification Dr. Guangliang Chen February 25, 2016 Outline Last time: FDA (dimensionality reduction) Today: QDA/LDA (classification) Naive Bayes classifiers Matlab/Python

More information

Artificial Neural Networks Lecture Notes Part 2

Artificial Neural Networks Lecture Notes Part 2 Artificial Neural Networks Lecture Notes Part 2 About this file: If you have trouble reading the contents of this file, or in case of transcription errors, email gi0062@bcmail.brooklyn.cuny.edu Acknowledgments:

More information

Chapter 7, continued: MANOVA

Chapter 7, continued: MANOVA Chapter 7, continued: MANOVA The Multivariate Analysis of Variance (MANOVA) technique extends Hotelling T 2 test that compares two mean vectors to the setting in which there are m 2 groups. We wish to

More information

Naïve Bayes Introduction to Machine Learning. Matt Gormley Lecture 3 September 14, Readings: Mitchell Ch Murphy Ch.

Naïve Bayes Introduction to Machine Learning. Matt Gormley Lecture 3 September 14, Readings: Mitchell Ch Murphy Ch. School of Computer Science 10-701 Introduction to Machine Learning aïve Bayes Readings: Mitchell Ch. 6.1 6.10 Murphy Ch. 3 Matt Gormley Lecture 3 September 14, 2016 1 Homewor 1: due 9/26/16 Project Proposal:

More information

Reasoning with Uncertainty

Reasoning with Uncertainty Reasoning with Uncertainty Representing Uncertainty Manfred Huber 2005 1 Reasoning with Uncertainty The goal of reasoning is usually to: Determine the state of the world Determine what actions to take

More information

Minimum Error Classification Clustering

Minimum Error Classification Clustering pp.221-232 http://dx.doi.org/10.14257/ijseia.2013.7.5.20 Minimum Error Classification Clustering Iwan Tri Riyadi Yanto Department of Mathematics University of Ahmad Dahlan iwan015@gmail.com Abstract Clustering

More information

On (Weighted) k-order Fuzzy Connectives

On (Weighted) k-order Fuzzy Connectives Author manuscript, published in "IEEE Int. Conf. on Fuzzy Systems, Spain 2010" On Weighted -Order Fuzzy Connectives Hoel Le Capitaine and Carl Frélicot Mathematics, Image and Applications MIA Université

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 305 Part VII

More information

Classification: Linear Discriminant Analysis

Classification: Linear Discriminant Analysis Classification: Linear Discriminant Analysis Discriminant analysis uses sample information about individuals that are known to belong to one of several populations for the purposes of classification. Based

More information

BINARY TREE-STRUCTURED PARTITION AND CLASSIFICATION SCHEMES

BINARY TREE-STRUCTURED PARTITION AND CLASSIFICATION SCHEMES BINARY TREE-STRUCTURED PARTITION AND CLASSIFICATION SCHEMES DAVID MCDIARMID Abstract Binary tree-structured partition and classification schemes are a class of nonparametric tree-based approaches to classification

More information

Standard & Conditional Probability

Standard & Conditional Probability Biostatistics 050 Standard & Conditional Probability 1 ORIGIN 0 Probability as a Concept: Standard & Conditional Probability "The probability of an event is the likelihood of that event expressed either

More information

AMONG many alternative means for knowledge representation, Belief Rule-Base Inference Methodology Using the Evidential Reasoning Approach RIMER

AMONG many alternative means for knowledge representation, Belief Rule-Base Inference Methodology Using the Evidential Reasoning Approach RIMER 266 IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS PART A: SYSTEMS AND HUMANS, VOL. 36, NO. 2, MARCH 2006 Belief Rule-Base Inference Methodology Using the Evidential Reasoning Approach RIMER Jian-Bo

More information

Adaptive Mixture Discriminant Analysis for Supervised Learning with Unobserved Classes

Adaptive Mixture Discriminant Analysis for Supervised Learning with Unobserved Classes Adaptive Mixture Discriminant Analysis for Supervised Learning with Unobserved Classes Charles Bouveyron To cite this version: Charles Bouveyron. Adaptive Mixture Discriminant Analysis for Supervised Learning

More information

Rough Set Model Selection for Practical Decision Making

Rough Set Model Selection for Practical Decision Making Rough Set Model Selection for Practical Decision Making Joseph P. Herbert JingTao Yao Department of Computer Science University of Regina Regina, Saskatchewan, Canada, S4S 0A2 {herbertj, jtyao}@cs.uregina.ca

More information

Computational Intelligence, Volume, Number, VAGUENES AND UNCERTAINTY: A ROUGH SET PERSPECTIVE. Zdzislaw Pawlak

Computational Intelligence, Volume, Number, VAGUENES AND UNCERTAINTY: A ROUGH SET PERSPECTIVE. Zdzislaw Pawlak Computational Intelligence, Volume, Number, VAGUENES AND UNCERTAINTY: A ROUGH SET PERSPECTIVE Zdzislaw Pawlak Institute of Computer Science, Warsaw Technical University, ul. Nowowiejska 15/19,00 665 Warsaw,

More information

Fuzzy control systems. Miklós Gerzson

Fuzzy control systems. Miklós Gerzson Fuzzy control systems Miklós Gerzson 2016.11.24. 1 Introduction The notion of fuzziness: type of car the determination is unambiguous speed of car can be measured, but the judgment is not unambiguous:

More information

Article from. Predictive Analytics and Futurism. July 2016 Issue 13

Article from. Predictive Analytics and Futurism. July 2016 Issue 13 Article from Predictive Analytics and Futurism July 2016 Issue 13 Regression and Classification: A Deeper Look By Jeff Heaton Classification and regression are the two most common forms of models fitted

More information

Degenerate Expectation-Maximization Algorithm for Local Dimension Reduction

Degenerate Expectation-Maximization Algorithm for Local Dimension Reduction Degenerate Expectation-Maximization Algorithm for Local Dimension Reduction Xiaodong Lin 1 and Yu Zhu 2 1 Statistical and Applied Mathematical Science Institute, RTP, NC, 27709 USA University of Cincinnati,

More information

Introduction to machine learning and pattern recognition Lecture 2 Coryn Bailer-Jones

Introduction to machine learning and pattern recognition Lecture 2 Coryn Bailer-Jones Introduction to machine learning and pattern recognition Lecture 2 Coryn Bailer-Jones http://www.mpia.de/homes/calj/mlpr_mpia2008.html 1 1 Last week... supervised and unsupervised methods need adaptive

More information

A novel k-nn approach for data with uncertain attribute values

A novel k-nn approach for data with uncertain attribute values A novel -NN approach for data with uncertain attribute values Asma Trabelsi 1,2, Zied Elouedi 1, and Eric Lefevre 2 1 Université de Tunis, Institut Supérieur de Gestion de Tunis, LARODEC, Tunisia trabelsyasma@gmail.com,zied.elouedi@gmx.fr

More information

CRITERIA REDUCTION OF SET-VALUED ORDERED DECISION SYSTEM BASED ON APPROXIMATION QUALITY

CRITERIA REDUCTION OF SET-VALUED ORDERED DECISION SYSTEM BASED ON APPROXIMATION QUALITY International Journal of Innovative omputing, Information and ontrol II International c 2013 ISSN 1349-4198 Volume 9, Number 6, June 2013 pp. 2393 2404 RITERI REDUTION OF SET-VLUED ORDERED DEISION SYSTEM

More information

Improvement of Process Failure Mode and Effects Analysis using Fuzzy Logic

Improvement of Process Failure Mode and Effects Analysis using Fuzzy Logic Applied Mechanics and Materials Online: 2013-08-30 ISSN: 1662-7482, Vol. 371, pp 822-826 doi:10.4028/www.scientific.net/amm.371.822 2013 Trans Tech Publications, Switzerland Improvement of Process Failure

More information

A new Approach to Drawing Conclusions from Data A Rough Set Perspective

A new Approach to Drawing Conclusions from Data A Rough Set Perspective Motto: Let the data speak for themselves R.A. Fisher A new Approach to Drawing Conclusions from Data A Rough et Perspective Zdzisław Pawlak Institute for Theoretical and Applied Informatics Polish Academy

More information

Environment Protection Engineering MATRIX METHOD FOR ESTIMATING THE RISK OF FAILURE IN THE COLLECTIVE WATER SUPPLY SYSTEM USING FUZZY LOGIC

Environment Protection Engineering MATRIX METHOD FOR ESTIMATING THE RISK OF FAILURE IN THE COLLECTIVE WATER SUPPLY SYSTEM USING FUZZY LOGIC Environment Protection Engineering Vol. 37 2011 No. 3 BARBARA TCHÓRZEWSKA-CIEŚLAK* MATRIX METHOD FOR ESTIMATING THE RISK OF FAILURE IN THE COLLECTIVE WATER SUPPLY SYSTEM USING FUZZY LOGIC Collective water

More information

Chapter 2 Rough Set Theory

Chapter 2 Rough Set Theory Chapter 2 Rough Set Theory Abstract This chapter describes the foundations for rough set theory. We outline Pawlak s motivating idea and give a technical exposition. Basics of Pawlak s rough set theory

More information

The Abnormal Electricity Consumption Detection System Based on the Outlier Behavior Pattern Recognition

The Abnormal Electricity Consumption Detection System Based on the Outlier Behavior Pattern Recognition 2017 International Conference on Energy, Power and Environmental Engineering (ICEPEE 2017) ISBN: 978-1-60595-456-1 The Abnormal Electricity Consumption Detection System Based on the Outlier Behavior Pattern

More information

Linear Models. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.

Linear Models. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis. Linear Models DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Linear regression Least-squares estimation

More information

CHAPTER V TYPE 2 FUZZY LOGIC CONTROLLERS

CHAPTER V TYPE 2 FUZZY LOGIC CONTROLLERS CHAPTER V TYPE 2 FUZZY LOGIC CONTROLLERS In the last chapter fuzzy logic controller and ABC based fuzzy controller are implemented for nonlinear model of Inverted Pendulum. Fuzzy logic deals with imprecision,

More information

ROUGH SETS THEORY AND DATA REDUCTION IN INFORMATION SYSTEMS AND DATA MINING

ROUGH SETS THEORY AND DATA REDUCTION IN INFORMATION SYSTEMS AND DATA MINING ROUGH SETS THEORY AND DATA REDUCTION IN INFORMATION SYSTEMS AND DATA MINING Mofreh Hogo, Miroslav Šnorek CTU in Prague, Departement Of Computer Sciences And Engineering Karlovo Náměstí 13, 121 35 Prague

More information

Course Outline MODEL INFORMATION. Bayes Decision Theory. Unsupervised Learning. Supervised Learning. Parametric Approach. Nonparametric Approach

Course Outline MODEL INFORMATION. Bayes Decision Theory. Unsupervised Learning. Supervised Learning. Parametric Approach. Nonparametric Approach Course Outline MODEL INFORMATION COMPLETE INCOMPLETE Bayes Decision Theory Supervised Learning Unsupervised Learning Parametric Approach Nonparametric Approach Parametric Approach Nonparametric Approach

More information

Belief Classification Approach based on Dynamic Core for Web Mining database

Belief Classification Approach based on Dynamic Core for Web Mining database Third international workshop on Rough Set Theory RST 11 Milano, Italy September 14 16, 2010 Belief Classification Approach based on Dynamic Core for Web Mining database Salsabil Trabelsi Zied Elouedi Larodec,

More information

Effect of Rule Weights in Fuzzy Rule-Based Classification Systems

Effect of Rule Weights in Fuzzy Rule-Based Classification Systems 506 IEEE TRANSACTIONS ON FUZZY SYSTEMS, VOL. 9, NO. 4, AUGUST 2001 Effect of Rule Weights in Fuzzy Rule-Based Classification Systems Hisao Ishibuchi, Member, IEEE, and Tomoharu Nakashima, Member, IEEE

More information

Classification Based on Logical Concept Analysis

Classification Based on Logical Concept Analysis Classification Based on Logical Concept Analysis Yan Zhao and Yiyu Yao Department of Computer Science, University of Regina, Regina, Saskatchewan, Canada S4S 0A2 E-mail: {yanzhao, yyao}@cs.uregina.ca Abstract.

More information

Supervised Learning: Linear Methods (1/2) Applied Multivariate Statistics Spring 2012

Supervised Learning: Linear Methods (1/2) Applied Multivariate Statistics Spring 2012 Supervised Learning: Linear Methods (1/2) Applied Multivariate Statistics Spring 2012 Overview Review: Conditional Probability LDA / QDA: Theory Fisher s Discriminant Analysis LDA: Example Quality control:

More information

Quantization of Rough Set Based Attribute Reduction

Quantization of Rough Set Based Attribute Reduction A Journal of Software Engineering and Applications, 0, 5, 7 doi:46/sea05b0 Published Online Decemer 0 (http://wwwscirporg/ournal/sea) Quantization of Rough Set Based Reduction Bing Li *, Peng Tang, Tommy

More information

Application of rough sets in E-commerce consumer behavior prediction

Application of rough sets in E-commerce consumer behavior prediction Vol.53 (IM 214), pp.255-26 http://dx.doi.org/1.14257/astl.214.53.53 Application of rough sets in E-commerce consumer behavior prediction Yanrong Zhang 1, Zhijie Zhao 1, Jing Yu 2, Kun Wang 1 1 ollege of

More information

Lecture 9: Large Margin Classifiers. Linear Support Vector Machines

Lecture 9: Large Margin Classifiers. Linear Support Vector Machines Lecture 9: Large Margin Classifiers. Linear Support Vector Machines Perceptrons Definition Perceptron learning rule Convergence Margin & max margin classifiers (Linear) support vector machines Formulation

More information

L5 Support Vector Classification

L5 Support Vector Classification L5 Support Vector Classification Support Vector Machine Problem definition Geometrical picture Optimization problem Optimization Problem Hard margin Convexity Dual problem Soft margin problem Alexander

More information

X={x ij } φ ik =φ k (x i ) Φ={φ ik } Lecture 11: Information Theoretic Methods. Mutual Information as Information Gain. Feature Transforms

X={x ij } φ ik =φ k (x i ) Φ={φ ik } Lecture 11: Information Theoretic Methods. Mutual Information as Information Gain. Feature Transforms Lecture 11: Information Theoretic Methods Isabelle Guyon guyoni@inf.ethz.ch Mutual Information as Information Gain Book Chapter 6 and http://www.jmlr.org/papers/volume3/torkkola03a/torkkola03a.pdf Feature

More information

Data Mining and Analysis: Fundamental Concepts and Algorithms

Data Mining and Analysis: Fundamental Concepts and Algorithms : Fundamental Concepts and Algorithms dataminingbook.info Mohammed J. Zaki 1 Wagner Meira Jr. 2 1 Department of Computer Science Rensselaer Polytechnic Institute, Troy, NY, USA 2 Department of Computer

More information

MULTIVARIATE HOMEWORK #5

MULTIVARIATE HOMEWORK #5 MULTIVARIATE HOMEWORK #5 Fisher s dataset on differentiating species of Iris based on measurements on four morphological characters (i.e. sepal length, sepal width, petal length, and petal width) was subjected

More information

Parameters to find the cause of Global Terrorism using Rough Set Theory

Parameters to find the cause of Global Terrorism using Rough Set Theory Parameters to find the cause of Global Terrorism using Rough Set Theory Sujogya Mishra Research scholar Utkal University Bhubaneswar-751004, India Shakti Prasad Mohanty Department of Mathematics College

More information

Normalized priority vectors for fuzzy preference relations

Normalized priority vectors for fuzzy preference relations Normalized priority vectors for fuzzy preference relations Michele Fedrizzi, Matteo Brunelli Dipartimento di Informatica e Studi Aziendali Università di Trento, Via Inama 5, TN 38100 Trento, Italy e mail:

More information

Drawing Conclusions from Data The Rough Set Way

Drawing Conclusions from Data The Rough Set Way Drawing Conclusions from Data The Rough et Way Zdzisław Pawlak Institute of Theoretical and Applied Informatics, Polish Academy of ciences, ul Bałtycka 5, 44 000 Gliwice, Poland In the rough set theory

More information

i jand Y U. Let a relation R U U be an

i jand Y U. Let a relation R U U be an Dependency Through xiomatic pproach On Rough Set Theory Nilaratna Kalia Deptt. Of Mathematics and Computer Science Upendra Nath College, Nalagaja PIN: 757073, Mayurbhanj, Orissa India bstract: The idea

More information

Chapter 8 Statistical Quality Control, 7th Edition by Douglas C. Montgomery. Copyright (c) 2013 John Wiley & Sons, Inc.

Chapter 8 Statistical Quality Control, 7th Edition by Douglas C. Montgomery. Copyright (c) 2013 John Wiley & Sons, Inc. 1 Learning Objectives Chapter 8 Statistical Quality Control, 7th Edition by Douglas C. Montgomery. 2 Process Capability Natural tolerance limits are defined as follows: Chapter 8 Statistical Quality Control,

More information