ARPN Journal of Science and Technology. All rights reserved.
Rule Induction Based on Boundary Region Partition Reduction with Standards Comparisons

Du Weifeng, Min Xiao
School of Mathematics, Physics and Information Engineering, Jiaxing University, Jiaxing, China

ABSTRACT

Prof. Ye Dongyi has pointed out in his paper that the reduction approach proposed by Hu Xiaohua et al. leads to wrong results under some circumstances. In this paper we find through analysis that Ye's approach is actually positive region reduction, whereas Hu's approach keeps the partition of the boundary region unchanged. The main difference between the two approaches is that each uses a different standard, so there is no ground for judging one correct and the other wrong. Moreover, we clarify the relationships among various reduction standards for decision tables, and we give the relationship between reduction results when there is a strong-weak correlation between two reduction standards.

Keywords: Decision table; Rule induction; Rough set theory (RST); Reduction standard.

1. INTRODUCTION

Knowledge discovery has always played a central role in artificial intelligence. The method of knowledge discovery we discuss here has the following features: (1) the pattern or knowledge is concealed in data which are abundant, incomplete, noisy and vague; (2) the pattern can be understood by people; (3) the pattern must be useful and novel; (4) the data processing procedure is non-trivial. In handling uncertain problems, fuzzy set theory and rough set theory (RST) both generalize classical set theory, but their viewpoints differ. Fuzzy sets describe approximate knowledge using membership degrees, and mainly handle the fuzzy uncertainty that is inherent in natural languages. Roughness is the result of the granularity of knowledge: if objects having the same description belong to different classifications, rough uncertainty appears. This does not mean that these objects are really identical; it only means that we have a limited understanding of them.
At such a level of cognition we recognize some different objects as identical. Rough uncertainty decreases as the cognition level increases and the granularity of knowledge is refined. RST is a mathematical approach to handling uncertain and incomplete information. It was initially proposed by the Polish mathematician Pawlak [1] in 1982. After nearly thirty years of research and development it has made great progress in both theory and applications, and it attracted broad attention after its successful use in knowledge discovery. It has now been applied to a broad range of domains such as artificial intelligence, knowledge discovery in databases, pattern recognition and failure detection, and it is without doubt one of the important and rapidly growing areas of research and applications in modern computing. Knowledge reduction is one of the main topics of RST. As is well known, the attributes in a knowledge base (information system) are not all equally important, and some attributes are even redundant. Knowledge reduction discards irrelevant or unimportant attributes while keeping the classification capability of the knowledge base [3]. Thus, in an information system, discarding such attributes does not influence its classification capability: we only need to keep the subsets that constitute a reduct, and the new information system will have the same classification capability as the original one. A natural idea is to search all subsets of the attribute set to acquire all the reducts, but unfortunately searching all subsets of a set is an NP problem, so this is infeasible in practice. In 1992, Prof. Skowron of Warsaw University introduced the discernibility matrix and discernibility function [4]. He pointed out that the conjuncts of the minimal disjunctive normal form of the discernibility function are exactly the reducts of the attribute set, though the algorithm transforming the conjunctive normal form of the discernibility function into disjunctive normal form is still of exponential complexity.
Nevertheless, the method is simple, clear and easy to operate, and its computational cost can be reduced greatly using the absorption law of Boolean expressions; it is feasible when the scale of the problem is not too large. To date this is the best approach for obtaining all the correct reducts, and it can serve as a criterion for testing heuristic algorithms; moreover, the idea behind the method is of lasting significance. In applications, the Pawlak model is often represented by an information system or a decision table, in which each row represents an object and each column represents an attribute. In a decision table the attributes are generally classified into conditions and decision (generally only one column represents the decision). If any two objects in the decision table with the same values of the conditions must have the same value of the
decision, then the values of the decision are determined by the values of the conditions, or in other words the decisions are consistent with the conditions. Correspondingly, a decision table is said to be inconsistent if two objects with the same values of the conditions may have different values of the decision. From this it is clear that an inconsistent decision table is more complex and more general than a consistent one. Inconsistent decision tables may be more common in practical, real-life situations, because some data in our databases are polluted by noise and some data are contradictory owing to our limitations. It is therefore important and necessary to investigate the inconsistent case, and several methods have already been applied to the reduction of inconsistent decision tables. These methods differ, and the results they produce also differ, though of course there are relationships among them. Positive region reduction, distribution reduction, maximal distribution reduction, distributive reduction and so on are based on different rules; the strength of reduction differs, and the relationships among them are complex. In paper [5], Prof. Ye pointed out that the reduction approach introduced by Hu et al. [6] gives wrong results in some situations. In this paper we conclude by analysis that Ye's reduction approach is positive region reduction, while Hu's approach is in effect to keep the partition of the boundary region unchanged; they are simply different standards. The remainder of this paper is organized as follows. The next section introduces fundamental concepts such as partition and equivalence relation, rough sets and their approximations, information systems and the indiscernibility relation, inconsistent decision tables, and various reduction standards. Section 3 shows that Ye's approach is positive region reduction. A property of Hu's approach is introduced in Section 4. Section 5 gives the relationships among all the reduction standards.
An example is illustrated in Section 6 to show how to acquire certain rules and uncertain rules. The last section concludes the paper.

2. FUNDAMENTAL CONCEPTS

2.1 Partition and Equivalence Relation

Definition 1 [7]: A partition of a nonempty set $U$ is a collection $\pi = \{A_1, A_2, \ldots, A_n\}$ of subsets of $U$ such that: a. $A_i \neq \emptyset$ for every $i$; b. $A_i \cap A_j = \emptyset$ whenever $i \neq j$; c. $A_1 \cup A_2 \cup \cdots \cup A_n = U$. The subsets in the collection are called the blocks of the partition.

Definition 2: a. A relation $R$ on a set $U$ is reflexive if $xRx$ for all $x \in U$; b. $R$ is symmetric if whenever $aRb$ then $bRa$; c. $R$ is transitive if whenever $aRb$ and $bRc$ then $aRc$.

Definition 3: A relation $R$ on a set is called an equivalence relation if it is reflexive, symmetric and transitive.

Theorem 1: A partition of $U$ generates an equivalence relation on $U$; conversely, an equivalence relation $R$ on $U$ generates a partition of $U$. This partition is denoted by $U/R$, and the blocks of the partition are traditionally called the equivalence classes of $R$.

2.2 Rough Sets and Their Approximations

Rough set theory holds that knowledge is essentially a capability of classification, and the capability of classification embodies the knowledge one owns.

Definition 4 [7]: Let $U$ be a nonempty set of objects and $R$ an equivalence relation on $U$; then $K = (U, R)$ is called an approximation space, and the family of equivalence classes generated by $R$ is denoted $U/R = \{[x]_R : x \in U\}$. In the Pawlak model $K = (U, R)$, the equivalence relation $R$ characterizes a classification of the universe. Once we have such knowledge we can express concepts over the universe. When a concept can be represented exactly by the knowledge in the knowledge base it is called an accurate concept or accurate set; otherwise it is called a rough concept or rough set. A rough set can be approximated by two accurate sets, its lower and upper approximations, defined as follows.

Definition 5 [7]: In an approximation space $(U, R)$, for $X \subseteq U$ denote

$\underline{R}(X) = \{x \in U : [x]_R \subseteq X\}$,  (1)
$\overline{R}(X) = \{x \in U : [x]_R \cap X \neq \emptyset\}$.  (2)

$\underline{R}(X)$ is called the lower approximation of $X$ and $\overline{R}(X)$ the upper approximation of $X$. If $\underline{R}(X) \neq \overline{R}(X)$, $X$ is called a rough set.
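As a concrete illustration of Definition 5, the two approximations can be computed directly from the equivalence classes. The following Python sketch uses an invented universe, equivalence relation and target set (none of them come from the paper's tables):

```python
# Sketch: lower and upper approximations in an approximation space (U, R).
# Universe, relation and target set X are invented for illustration.

def partition(U, label):
    """Group the objects of U into equivalence classes by a labelling function."""
    classes = {}
    for x in U:
        classes.setdefault(label(x), set()).add(x)
    return list(classes.values())

def lower_approx(classes, X):
    # R_(X): union of the equivalence classes entirely contained in X
    return {x for c in classes if c <= X for x in c}

def upper_approx(classes, X):
    # R^(X): union of the equivalence classes that intersect X
    return {x for c in classes if c & X for x in c}

U = {1, 2, 3, 4, 5, 6}
classes = partition(U, lambda x: x % 3)   # blocks {1, 4}, {2, 5}, {3, 6}
X = {1, 3, 4}
print(lower_approx(classes, X))   # {1, 4}
print(upper_approx(classes, X))   # {1, 3, 4, 6}
```

Here the lower and upper approximations differ, so `X` is a rough set in this space.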
2.3 Information System and Indiscernibility Relation

According to RST, the original knowledge can be expressed in an information system (e.g. see Table 1), in which each row represents an object and each column represents an attribute.

Table 1: Information system

Knowledge can therefore be described in an information system as follows.

Definition 6 [7]: An information system can be denoted by a 4-tuple $S = (U, A, V, f)$, where $U$ is a nonempty finite set of objects, called the universe; $A$ is a nonempty finite set of attributes; $V = \bigcup_{a \in A} V_a$, where $V_a$ is the domain of attribute $a$; and $f : U \times A \to V$ is the information function, which gives an information value $f(x, a) \in V_a$ for every attribute of every object. When no confusion arises, the information system is denoted briefly as $S = (U, A)$.

Table 1 illustrates the information of six objects that are characterized by three attributes. It is easy to see that every attribute in a decision table corresponds to an equivalence relation. For $a \in A$ the corresponding equivalence relation is

$(x, y) \in R_a \iff f(x, a) = f(y, a)$.  (3)

Each attribute set $B \subseteq A$ likewise leads to an equivalence relation:

$(x, y) \in R_B \iff f(x, a) = f(y, a)$ for all $a \in B$.  (4)

The starting point of RST is the indiscernibility relation. The indiscernibility relation identifies objects having the same properties: objects having the same properties are indiscernible and consequently are treated as identical. In other words, the indiscernibility relation leads to a clustering of elements into granules of indiscernible objects. In RST these granules, called elementary sets (concepts), are the basic building blocks of knowledge about the universe. Considering specific attributes, objects are indiscernible according to the available information. For example, in Table 1 three of the objects take identical values on one of the attributes; hence these three objects are indiscernible based on that attribute. In other words, objects described by identical data on the considered attributes are indiscernible.
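The clustering induced by equation (4) can be sketched in a few lines of Python. Since the paper's Table 1 is not reproduced in the text, the sample table below is an invented stand-in:

```python
# Sketch: the B-indiscernibility partition U/ind(B) of an information system.
# The sample table (objects x1..x4, attributes a, b, c) is invented.

table = {                      # object -> attribute values
    "x1": {"a": 0, "b": 1, "c": 1},
    "x2": {"a": 0, "b": 1, "c": 0},
    "x3": {"a": 1, "b": 0, "c": 1},
    "x4": {"a": 0, "b": 1, "c": 1},
}

def ind_partition(table, B):
    """Group objects whose values agree on every attribute in B."""
    blocks = {}
    for x, row in table.items():
        key = tuple(row[a] for a in sorted(B))
        blocks.setdefault(key, set()).add(x)
    return list(blocks.values())

print(ind_partition(table, {"a", "b"}))       # x1, x2, x4 form one block
print(ind_partition(table, {"a", "b", "c"}))  # x1, x4 remain indiscernible
```

Each returned block is an elementary set; refining the attribute set can only split blocks, never merge them.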
A set of all mutually indiscernible objects with respect to the considered attributes is called an elementary set. Let $B$ be a nonempty subset of the set of all attributes, i.e. $B \subseteq A$. The $B$-indiscernibility relation, denoted $\mathrm{ind}(B)$, defines $x, y \in U$ to be $B$-indiscernible as follows:

$(x, y) \in \mathrm{ind}(B) \iff f(x, a) = f(y, a)$ for all $a \in B$.  (5)

Obviously $\mathrm{ind}(B)$ is an equivalence relation on $U$: $x$ and $y$ are $B$-indiscernible if they cannot be distinguished considering only the subset $B$ of the attributes. The $B$-indiscernibility relation induces the $B$-elementary sets in $U$, and the family of all equivalence classes defined by $\mathrm{ind}(B)$ is denoted $U/\mathrm{ind}(B)$. A partition of the universe is thus generated by the indiscernibility relation, decomposing the universe into blocks of indiscernible objects, i.e. elementary sets. For example, in Table 1 a single attribute induces two elementary sets, while a two-attribute subset induces three elementary sets, which together constitute $U/\mathrm{ind}(B)$.

In order to discuss reducts we define the following. Let $P$ be a family of equivalence relations (attributes) and $r \in P$. If

$\mathrm{ind}(P) = \mathrm{ind}(P \setminus \{r\})$  (6)

holds, we say $r$ is redundant (dispensable) in $P$; otherwise $r$ is necessary (indispensable). If every $r \in P$ is necessary, $P$ is independent; otherwise $P$ is dependent. Suppose $B \subseteq P$; if $B$ is independent and $\mathrm{ind}(B) = \mathrm{ind}(P)$, then $B$ is a reduct of $P$.
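The reduct definition above can be tested directly, by brute force, exactly as the introduction warns is exponential: enumerate subsets from the smallest upward, keep those with $\mathrm{ind}(B) = \mathrm{ind}(A)$, and discard supersets of subsets already found. The sample table is invented for illustration:

```python
# Sketch: brute-force reduct search by definition (6). Exponential in the
# number of attributes, as noted in the text, so only feasible for small
# attribute sets. Sample data invented.

from itertools import combinations

table = {
    "x1": {"a": 0, "b": 0, "c": 0},
    "x2": {"a": 0, "b": 1, "c": 1},
    "x3": {"a": 1, "b": 0, "c": 1},
    "x4": {"a": 1, "b": 1, "c": 0},
}

def ind(table, B):
    """The partition U/ind(B), hashable so that partitions can be compared."""
    blocks = {}
    for x, row in table.items():
        blocks.setdefault(tuple(row[a] for a in sorted(B)), set()).add(x)
    return frozenset(frozenset(b) for b in blocks.values())

def reducts(table):
    C = sorted(next(iter(table.values())))     # all attributes
    full = ind(table, C)
    found = []
    for k in range(1, len(C) + 1):             # smallest subsets first
        for B in combinations(C, k):
            if any(set(r) <= set(B) for r in found):
                continue                       # not minimal: extends a reduct
            if ind(table, B) == full:
                found.append(B)
    return found

print(reducts(table))   # [('a', 'b'), ('a', 'c'), ('b', 'c')]
```

In this invented table every pair of attributes already discerns all four objects, so there are three reducts and the full set `{a, b, c}` is dependent.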
For the information system of Table 1, the two-attribute subset shown in Table 2 is its only reduct.

Table 2: Information system S reduced by its reduct

For $B \subseteq A$, $\mathrm{ind}(B)$ is likewise an equivalence relation, called the indiscernibility relation on $B$. For two knowledge bases $K_1 = (U, P)$ and $K_2 = (U, Q)$, when $\mathrm{ind}(P) = \mathrm{ind}(Q)$ we say that $K_1$ and $K_2$ are equivalent, denoted $K_1 \cong K_2$. For example, let $K_1$ denote the information system shown in Table 1 and $K_2$ the information system shown in Table 2; then $K_1 \cong K_2$.

2.4 Inconsistent Decision Table

Definition 7 [7]: A decision table is a kind of information system in which the attributes are classified into conditions and decision (e.g. in Table 3 three features define the conditions and one attribute describes the decision). Formally, a decision table is an information system $(U, A)$ with $A = C \cup \{d\}$ and $C \cap \{d\} = \emptyset$, where $C$ and $\{d\}$ are nonempty sets. The elements of $C$ are said to be conditions, $d$ is called the decision, and the elements of $U$ are interpreted as objects. A decision table is said to be consistent if $\mathrm{ind}(C) \subseteq \mathrm{ind}(\{d\})$ holds; in other words, any two objects with the same values of the conditions must have the same value of the decision, so that the decision values are determined by the condition values. For example, Table 3 is a consistent decision table.

Table 3: Consistent decision table

Correspondingly, a decision table is said to be inconsistent if $\mathrm{ind}(C) \subseteq \mathrm{ind}(\{d\})$ does not hold; then two objects with the same condition values may have different decision values. From this it is clear that an inconsistent decision table is more complex and more general than a consistent one; a consistent decision table is a special kind of inconsistent decision table. For example, Table 4 is an inconsistent decision table, because two of its objects have the same values of the conditions but different values of the decision.

Table 4: Inconsistent decision table
2.5 Various Reduction Standards

Let $(L, \leq)$ be a poset. A function $D(\cdot / \cdot)$ on $L \times L$ is called an including degree on $L$ if it satisfies:

a. $0 \leq D(Y/X) \leq 1$ for all $X, Y \in L$;
b. $D(Y/X) = 1$ whenever $X \leq Y$;
c. $D(X/Z) \leq D(X/Y)$ whenever $X \leq Y \leq Z$.

Let $S = (U, C \cup \{d\})$ be a decision table, and let the condition attribute set $C$ and the decision attribute $d$ induce equivalence relations on $U$ respectively. Denote

$U/\mathrm{ind}(C) = \{X_1, X_2, \ldots, X_n\}$,  (7)
$U/\mathrm{ind}(\{d\}) = \{Y_1, Y_2, \ldots, Y_m\}$,  (8)
$D(Y_j / X_i) = \dfrac{|Y_j \cap X_i|}{|X_i|}$.  (9)

Then $D$ is an including degree on the subsets of $U$ ordered by inclusion.

Definition 8: If a set satisfies some property and no proper subset of it satisfies that property, it is called a minimal set satisfying the property.

Denote:
$\mathrm{pos}_B(d) = \bigcup_{j=1}^{m} \underline{B}(Y_j)$,  (10)
$\mu_B(x) = \big(D(Y_1/[x]_B), D(Y_2/[x]_B), \ldots, D(Y_m/[x]_B)\big)$,  (11)
$\gamma_B(x) = \max\{D(Y_j/[x]_B) : 1 \leq j \leq m\}$,  (12)
$\delta_B(x) = \{Y_j : Y_j \cap [x]_B \neq \emptyset\}$,  (13)
$\eta_B = \dfrac{|\mathrm{pos}_B(d)|}{|U|}$,  (14)
$H(d \mid B) = -\sum_i P(X_i) \sum_j P(Y_j \mid X_i) \log P(Y_j \mid X_i)$.  (15)

Here $\mu_B$ is the distribution function, $\gamma_B$ the maximum distribution function, and $\delta_B$ the generalized decision (distributive) function of the attribute set $B$ in the decision table.

Definition 9 [7]: Let $S = (U, C \cup \{d\})$ be a decision table and $B \subseteq C$.

a. If $\mathrm{pos}_B(d) = \mathrm{pos}_C(d)$, $B$ is called a positive region consistent set; a minimal positive region consistent set is called a positive region reduct.
b. If $\mu_B(x) = \mu_C(x)$ for all $x \in U$, $B$ is called a distribution consistent set; a minimal distribution consistent set is called a distribution reduct.
c. If $\gamma_B(x) = \gamma_C(x)$ for all $x \in U$, $B$ is called a maximum distribution consistent set; a minimal maximum distribution consistent set is called a maximum distribution reduct.
d. If $\delta_B(x) = \delta_C(x)$ for all $x \in U$, $B$ is called a distributive consistent set; a minimal distributive consistent set is called a distributive reduct.
e. If $\eta_B = \eta_C$, $B$ is called an approximate consistent set; a minimal approximate consistent set is called an approximate reduct.
f. If $H(d \mid B) = H(d \mid C)$, $B$ is called an entropy consistent set; a minimal entropy consistent set is called an entropy reduct.

2.6 Discernibility Matrix and Discernibility Function

Definition 10: Let $S = (U, C \cup \{d\})$ be the original decision table. $B \subseteq C$ is called a consistent set of $S$ under a given standard if $B$ keeps the corresponding property of the decision table [7]; the reduced decision table is then denoted $S' = (U, B \cup \{d\})$.

Definition 11: The discernibility matrix of a decision table is an $n \times n$ matrix $M = (c_{ij})_{n \times n}$ whose element $c_{ij} \subseteq C$ is the set of attributes required to discern the objects $x_i$ and $x_j$.  (16)

The discernibility function of the decision table is defined as

$F = \bigwedge_{c_{ij} \neq \emptyset} \Big( \bigvee_{a \in c_{ij}} a \Big)$.  (17)

If we regard each attribute as a Boolean variable, the discernibility function is a Boolean formula, and all the conjuncts of the minimal disjunctive normal form of the discernibility function are exactly the reducts of $S$. The element $c_{ij}$ differs under each reduction standard. Hu et al. [6] put forward

$c_{ij} = \{a \in C : f(x_i, a) \neq f(x_j, a),\ d(x_i) \neq d(x_j)\}$.  (18)

Prof. Ye Dongyi pointed out in his paper that this reduction approach is wrong in some situations. In his paper, Prof. Wang Guoyin [8] discussed Hu's approach and Ye's approach, the algebra view and the information view of rough set theory, and their relationships. He pointed out that all the reduction standards are identical for consistent decision tables, but for inconsistent decision tables the various reduction standards commonly give different results.
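The construction of Definition 11 with Hu's element (18) can be sketched concretely. The decision table below is invented; the sketch also applies the absorption law ($a \wedge (a \vee b) = a$) mentioned in the introduction, by discarding every entry that strictly contains a smaller one:

```python
# Sketch: decision-relative discernibility entries in the spirit of Hu's
# element (18): a pair of objects with different decisions is discerned by
# the condition attributes on which the objects differ. Sample data invented.

from itertools import combinations

cond = {"x1": {"a": 0, "b": 0}, "x2": {"a": 0, "b": 1},
        "x3": {"a": 1, "b": 0}, "x4": {"a": 1, "b": 1}}
dec = {"x1": 0, "x2": 1, "x3": 1, "x4": 0}

def discernibility_entries(cond, dec):
    entries = set()
    for xi, xj in combinations(sorted(cond), 2):
        if dec[xi] != dec[xj]:                 # only pairs with different decisions
            diff = frozenset(a for a in cond[xi] if cond[xi][a] != cond[xj][a])
            if diff:
                entries.add(diff)
    # absorption law a ∧ (a ∨ b) = a: keep only the minimal entries
    return {e for e in entries if not any(f < e for f in entries)}

print(discernibility_entries(cond, dec))
```

For this table the surviving entries are the singletons `{a}` and `{b}`, so the discernibility function is $a \wedge b$ and the unique reduct is $\{a, b\}$.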
3. YE'S APPROACH IS POSITIVE REGION REDUCTION

In papers [4, 7] the discernibility condition for positive region reduction is given; the element of the discernibility matrix is

$c_{ij} = \{a \in C : f(x_i, a) \neq f(x_j, a),\ w(x_i, x_j)\}$,  (19)

where $w(x_i, x_j)$ holds if and only if either both $x_i, x_j \in \mathrm{pos}_C(d)$ and $d(x_i) \neq d(x_j)$, or exactly one of $x_i, x_j$ belongs to $\mathrm{pos}_C(d)$.  (20)

In papers [9, 10] it has been proved that, without changing the resulting positive region reduction, $w(x_i, x_j)$ can be transformed into $w'(x_i, x_j)$, where $w'(x_i, x_j)$ holds if and only if

$\big(x_i \in \mathrm{pos}_C(d) \vee x_j \in \mathrm{pos}_C(d)\big) \wedge d(x_i) \neq d(x_j)$.  (21)

This conclusion is completely equivalent to the condition introduced by Ye in his paper [5], and their forms are closely consistent. The condition introduced by Ye, rewritten in the notation of this paper, is

$d(x_i) \neq d(x_j) \wedge \min\{m(x_i), m(x_j)\} = 1$,  (22)

where $m(x) = |\delta_C(x)|$ is the number of decision classes that meet $[x]_C$.
Now we just need to prove:

Lemma 1: The condition $x_i \in \mathrm{pos}_C(d) \vee x_j \in \mathrm{pos}_C(d)$ is equivalent to $\min\{m(x_i), m(x_j)\} = 1$, where $m(x)$ is the number of decision classes meeting $[x]_C$.

Proof: $x \in \mathrm{pos}_C(d)$ if and only if $[x]_C$ meets exactly one decision class, i.e. $m(x) = 1$. If $x_i \in \mathrm{pos}_C(d)$ we have $m(x_i) = 1$; similarly, if $x_j \in \mathrm{pos}_C(d)$ we have $m(x_j) = 1$; in either case $\min\{m(x_i), m(x_j)\} = 1$. Conversely, if $\min\{m(x_i), m(x_j)\} = 1$ then $m(x_i) = 1$ or $m(x_j) = 1$, i.e. $x_i \in \mathrm{pos}_C(d)$ or $x_j \in \mathrm{pos}_C(d)$.

In fact there are further reduction standards. The so-called algebra view in Wang's paper is the traditional positive region reduction standard. Such a standard guarantees that the certain rules before and after reduction are equivalent, but generally speaking the uncertain rules are not the same. The algebra view of rough sets also admits other reduction standards, such as distribution reduction, approximate reduction, distributive reduction and maximum distribution reduction. In papers [11~13] we discussed their relationships and their logic characteristics. Hu's approach can likewise be regarded as a particular reduction standard, and the different reduction results are due simply to the different standards. To clarify its meaning, we now analyze the property and logic characteristic of Hu's approach.

4. PROPERTY OF HU'S APPROACH

Lemma 2: Let $B$ be a Hu consistent set of the decision table $S = (U, C \cup \{d\})$. If $(x, y) \in \mathrm{ind}(B)$ and $d(x) \neq d(y)$, then $[x]_C = [y]_C$.

Proof: If $[x]_C \neq [y]_C$, then $f(x, a) \neq f(y, a)$ for some $a \in C$, and since $d(x) \neq d(y)$, Hu's discernibility element for the pair $(x, y)$ is nonempty; by the definition of a Hu consistent set, some attribute of $B$ then discerns $x$ and $y$, which contradicts $(x, y) \in \mathrm{ind}(B)$.

Theorem 2: Let $B$ be a Hu reduct of the decision table $S = (U, C \cup \{d\})$. If $[x]_C \subseteq \mathrm{pos}_C(d)$, then $[x]_B \subseteq \mathrm{pos}_B(d)$; consequently $\mathrm{pos}_B(d) = \mathrm{pos}_C(d)$.

Proof: Let $y \in [x]_B$. If $d(y) \neq d(x)$, then by Lemma 2 $[y]_C = [x]_C$; but $[x]_C$ lies in the positive region, so all its objects have the same decision value, which is contradictory. Hence $[x]_B$ carries a single decision value, i.e. $[x]_B \subseteq \mathrm{pos}_B(d)$.

Theorem 3: Let $B$ be a Hu reduct of the decision table $S = (U, C \cup \{d\})$. If $x \in U \setminus \mathrm{pos}_C(d)$, then $[x]_B = [x]_C$.

Proof: $[x]_C \subseteq [x]_B$ is obvious. For the converse we argue by contradiction: suppose $y \in [x]_B$ but $[y]_C \neq [x]_C$. Since $[x]_C$ lies in the boundary region it contains at least two decision values, so some $z \in [x]_C$ has $d(z) \neq d(y)$; as $(y, z) \in \mathrm{ind}(B)$, Lemma 2 gives $[y]_C = [z]_C = [x]_C$, contradictory with the supposition. So the proposition is proved and $[x]_B = [x]_C$.

From Theorems 2 and 3, Hu reduction merges some equivalence classes in the positive region, but the partition of the boundary region is kept unchanged; in this sense Hu reduction can be called boundary region partition reduction. A sketch of boundary region partition reduction is shown in Fig. 1. Fig. 1(a) gives the situation before reduction for a 4-class decision table. Theoretically, in the extreme situation, after boundary region partition reduction the lower approximations of the decision classes may be merged into as few as 4 classes (Fig. 1(c)); for an $n$-class decision table they may be merged into as few as $n$ classes. Of course, given the practical situation of the attributes in a concrete decision table, such an extreme reduct is unlikely to occur; in general the reduct looks as in Fig. 1(b).
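The distinction established above can be made concrete: on the invented inconsistent table below, dropping one attribute preserves the positive region (so it is positive region consistent) but merges the two boundary classes (so it is not boundary region partition consistent):

```python
# Sketch: positive region vs boundary region partition on an invented
# inconsistent decision table. {a} keeps pos unchanged but merges the
# two C-classes of the boundary region.

def blocks(cond, B):
    out = {}
    for x, row in cond.items():
        out.setdefault(tuple(row[a] for a in sorted(B)), set()).add(x)
    return list(out.values())

def positive_region(cond, dec, B):
    # union of the B-classes on which the decision is constant
    return {x for blk in blocks(cond, B)
            if len({dec[y] for y in blk}) == 1 for x in blk}

def boundary_partition(cond, dec, B):
    # the B-classes lying outside the positive region
    pos = positive_region(cond, dec, B)
    return {frozenset(blk) for blk in blocks(cond, B) if not blk & pos}

cond = {"x1": {"a": 0, "b": 0},
        "x2": {"a": 1, "b": 0}, "x3": {"a": 1, "b": 0},
        "x4": {"a": 1, "b": 1}, "x5": {"a": 1, "b": 1}}
dec = {"x1": 0, "x2": 0, "x3": 1, "x4": 0, "x5": 1}

print(positive_region(cond, dec, {"a", "b"}))   # {'x1'}
print(positive_region(cond, dec, {"a"}))        # {'x1'}  -> pos preserved
print(boundary_partition(cond, dec, {"a", "b"}))  # two boundary classes
print(boundary_partition(cond, dec, {"a"}))       # merged into one class
```

So `{a}` would be accepted by the positive region standard but rejected by Hu's boundary region partition standard, which is exactly the gap between the two standards.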
Fig 1: Boundary region partition reduction ((a) initial situation; (b) reduct under the general situation; (c) a possible reduct under the extreme situation)

Fig 2: Relationship between boundary region partition and positive region of an inconsistent decision table

5. RELATIONSHIPS AMONG SEVERAL REDUCTION STANDARDS OF INCONSISTENT DECISION TABLES

5.1 Relationship between boundary region partition and positive region

The expression of the discernibility element differs under different reduction standards. The element of the discernibility matrix of the boundary region partition reduction proposed by Hu et al. [6] is

$c_{ij} = \{a \in C : f(x_i, a) \neq f(x_j, a),\ d(x_i) \neq d(x_j)\}$.  (23)

In paper [4] the discernibility condition of positive region reduction about object pairs was given; the element of its discernibility matrix is

$c_{ij} = \{a \in C : f(x_i, a) \neq f(x_j, a),\ w(x_i, x_j)\}$,  (24)

where $w(x_i, x_j)$ requires that $d(x_i) \neq d(x_j)$ and at least one of $x_i, x_j$ belongs to $\mathrm{pos}_C(d)$.  (25)

It is apparent that any pair of objects discerned under positive region reduction is inevitably discerned under boundary region partition reduction. So a boundary region partition consistent set must be a positive region consistent set. The relationship is given in Fig. 2.

5.2 Relationship between boundary region partition and distributive consistency

Definition 12: Let $S = (U, C \cup \{d\})$ be a decision table. If $\delta_B(x) = \delta_C(x)$ for all $x \in U$, then $B$ is called a distributive consistent set. If $B$ is distributive consistent and no proper subset of $B$ is distributive consistent, $B$ is called a distributive reduct.

Theorem 4: Let $S = (U, C \cup \{d\})$ be a decision table; then a boundary region partition consistent set must be a distributive consistent set.

Proof: It can be proved separately under the following two conditions. a. If an equivalence class after reduction lies in the positive region, i.e. $[x]_B \subseteq \mathrm{pos}_C(d)$, then the decision on $[x]_B$ is constant and $\delta_B(x) = \delta_C(x)$. b. If it lies in the boundary region, then according to the definition of boundary region partition reduction the boundary partition is kept unchanged, i.e. $[x]_B = [x]_C$, so again $\delta_B(x) = \delta_C(x)$. Hence a boundary region partition consistent set must even be a distributive consistent set.

In papers [11, 14] we obtained the relationships among several other reduction standards, including the relationship between the boundary region partition standard and the other standards discussed in this paper; the relationships among all the discussed consistent sets are summarized in Fig.
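The two standards compared in this section can also be checked mechanically. The following sketch, on an invented inconsistent table, tests positive region consistency and distributive consistency (Definition 12) for each single-attribute subset:

```python
# Sketch: checking positive region consistency (pos_B = pos_C) and
# distributive consistency (delta_B = delta_C, Definition 12) on an
# invented inconsistent decision table.

cond = {"x1": {"a": 0, "b": 0}, "x2": {"a": 0, "b": 1},
        "x3": {"a": 1, "b": 0}, "x4": {"a": 1, "b": 0}}
dec = {"x1": 0, "x2": 0, "x3": 0, "x4": 1}

def block_of(cond, B, x):
    """The B-indiscernibility class [x]_B."""
    return {y for y, row in cond.items()
            if all(row[a] == cond[x][a] for a in B)}

def delta(cond, dec, B, x):
    # generalized decision: the decision values met in [x]_B
    return frozenset(dec[y] for y in block_of(cond, B, x))

def pos(cond, dec, B):
    return {x for x in cond if len(delta(cond, dec, B, x)) == 1}

C = {"a", "b"}
for B in ({"a"}, {"b"}):
    p_ok = pos(cond, dec, B) == pos(cond, dec, C)
    d_ok = all(delta(cond, dec, B, x) == delta(cond, dec, C, x) for x in cond)
    print(sorted(B), "positive region consistent:", p_ok,
          "| distributive consistent:", d_ok)
```

Here `{a}` passes both checks while `{b}` fails both; tables where a subset passes the weaker standard but fails the stronger one witness the strict inclusions of Fig. 3.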
Fig 3: Relationship among all the discussed consistent sets of an inconsistent decision table (distribution (entropy); maximal distribution; distributive (approximate); positive region; boundary region partition)

6. LOGIC CHARACTERISTIC OF BOUNDARY REGION PARTITION REDUCTION

6.1 Rule Acquisition from a Decision Table

Let $S = (U, C \cup \{d\})$ be a decision table. From $S$ we can get decision rules of the form

Rule($x$): $(x, C) \rightarrow (x, d)$,

where $(x, C)$, the condition values of $x$, is the premise of Rule($x$) and $(x, d)$ is its conclusion. If $[x]_C = [y]_C$ then Rule($x$) and Rule($y$) have the same premise, so we may define decision rules on the equivalence classes of objects. Let $U/\mathrm{ind}(C) = \{X_1, \ldots, X_n\}$ be the partition of $U$ with respect to the condition attribute set $C$. For $X_i \in U/\mathrm{ind}(C)$, denote by $d(X_i) = \{f(x, d) : x \in X_i\}$ the set of decision attribute values of the elements of $X_i$. Then for $X_i$ we can get $|d(X_i)|$ decision rules:

Rule($X_i, v$): $(X_i, C) \rightarrow (d = v)$, one for each $v \in d(X_i)$.

The rule precision (rule confidence) is defined as

$P(\mathrm{Rule}(X_i, v)) = \dfrac{|X_i^v|}{|X_i|}$, where $X_i^v = \{x \in X_i : f(x, d) = v\}$.  (26)

So the rule precision of Rule($X_i, v$) is the proportion of objects with decision attribute value $v$ in $X_i$. If $P(\mathrm{Rule}(X_i, v)) = 1$, the rule is called a certain rule; if $P(\mathrm{Rule}(X_i, v)) < 1$, it is called an uncertain rule.

6.2 Rule Acquisition Based on Boundary Region Partition Reduction

Theorem 5: Let $B$ be a boundary region partition reduct of $S = (U, C \cup \{d\})$. Then each certain rule induced from the reduced decision table $S' = (U, B \cup \{d\})$ corresponds to several certain rules of the original decision table $S$.

Proof: Let $[x]_B \subseteq \mathrm{pos}_B(d)$ with decision value $v$, and let $\{[x_1]_C, \ldots, [x_k]_C\}$ be the partition of $[x]_B$ by $C$. The reduced decision table induces one certain decision rule Rule($[x]_B$): $([x]_B, B) \rightarrow (d = v)$. $S$ produces the corresponding certain decision rules Rule($[x_t]_C$): $([x_t]_C, C) \rightarrow (d = v)$ for $t = 1, \ldots, k$. Because $[x_t]_C \subseteq [x]_B$, these rules have the same conclusion as Rule($[x]_B$), while each premise $([x_t]_C, C)$ includes the premise $([x]_B, B)$ together with the values of the attributes of $C \setminus B$. So the one certain rule of the reduced table just corresponds to the $k$ certain rules of the original decision table $S$.

Theorem 6: Let $B$ be a boundary region partition reduct of $S = (U, C \cup \{d\})$. Then the uncertain rules induced from the reduced decision table
$S' = (U, B \cup \{d\})$ correspond one-to-one to the uncertain rules of the original decision table $S$.

Proof: From Theorem 3, if $x \in U \setminus \mathrm{pos}_C(d)$ then $[x]_B = [x]_C$. Suppose the decision values appearing in $[x]_B$ are $v_1, \ldots, v_m$. From the reduced decision table, $[x]_B$ produces $m$ uncertain rules

Rule$_j$: $([x]_B, B) \rightarrow (d = v_j)$, with precision $P(\mathrm{Rule}_j) = \dfrac{|\{y \in [x]_B : f(y, d) = v_j\}|}{|[x]_B|}$.

From the original decision table $S$, $[x]_C$ produces $m$ uncertain rules with the same conclusions and, since $[x]_B = [x]_C$, the same precisions. Hence the uncertain rules correspond one-to-one.

6.3 Illustrative Example

Example 1: Consider the decision table $S$ shown in Table 5, where $U = \{x_1, \ldots, x_6\}$ is the object set, $C$ is the condition attribute set and $d$ is the decision attribute.

Table 5: Decision table S

From $S$ we can get four uncertain rules, Rule($[x_1]$) through Rule($[x_4]$), and two certain rules, Rule($x_5$) and Rule($x_6$). By calculating the discernibility matrix and the discernibility function we obtain a boundary region partition reduct $B$ of $S$; from the reduced decision table $S' = (U, B \cup \{d\})$ we get the same four uncertain rules and a single certain rule. It is obvious that the reduced decision table produces the same uncertain rules as the original decision table, while its certain rule is merged from $\{\mathrm{Rule}(x_i) : i = 5, 6\}$.

7. CONCLUSIONS

In this paper we found by analysis that the reduction approach introduced by Ye Dongyi is in effect positive region reduction, while the reduction approach introduced by Hu Xiaohua et al. keeps the partition of the boundary region unchanged. Thus the two reduction approaches simply differ in their reduction standard. We then analyzed the relationships among several reduction standards and obtained two results: (1) a boundary region partition consistent set must be a positive region consistent set; (2) a boundary region partition consistent set must even be a distributive consistent set. By analyzing the logic characteristic of the boundary region partition reduction standard, we concluded that each certain rule produced from the reduced decision table corresponds to several certain rules of the original decision table, and each uncertain rule produced from the reduced decision table corresponds to exactly one uncertain rule of the original decision table.
ACKNOWLEDGEMENTS

This work was partially supported by the Zhejiang province key project (priority subjects) key industrial project (Grant No: 8C), the National Natural Science Foundation of China (Grant No: ), the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 6637), and the Provincial Natural Science Foundation of Zhejiang (Grant Nos: LYA9, LYF9).

REFERENCES

[1] Z. Pawlak (1982). Rough sets. International Journal of Computer and Information Sciences, 11, 341~356.
[2] Z. Pawlak (1991). Rough Sets: Theoretical Aspects of Reasoning about Data. Boston: Kluwer Academic Publishers.
[3] Zhang Wenxiu, Wu Weizhi, Liang Jiye, Li Deyu. Rough Set Theory and Approach. Beijing: Science Press, 58~86.
[4] A. Skowron, C. Rauszer (1992). The discernibility matrices and functions in information systems. In: R. Slowinski (Ed.), Intelligent Decision Support: Handbook of Applications and Advances of the Rough Sets Theory. Dordrecht: Kluwer Academic Publishers, 331~362.
[5] Ye Dongyi, Chen Zhaojiong (2002). A new discernibility matrix and the computation of a core. Acta Electronica Sinica, 30, 1086~1088.
[6] Hu Xiaohua, Cercone N. (1995). Learning in relational databases: a rough set approach. Computational Intelligence, 11, 323~337.
[7] Zhang Wenxiu, Liang Yi, Wu Weizhi (2003). Information System and Knowledge Discovery. Beijing: Science Press, 48.
[8] Wang Guoyin (2003). Calculation Methods for Core Attributes of Decision Table. Chinese Journal of Computers, 26, 611~615.
[9] Du Weifeng, Qin Keyun (2006). The Improvement to Condition in Discernibility Function of Positive Reduct of Decision Table. Computer Engineering and Applications, 42, 6~8.
[10] Du Weifeng (2006). Application of Rough Set Theory in Chinese Text Categorization. Doctoral dissertation, Southwest Jiaotong University.
[11] Du Weifeng, Qin Keyun (2005). The relationship of positive domain reduction to other reductions of inconsistent decision tables. Journal of Hainan Normal College, 18.
[12] Du Weifeng, Yang Li (2008). A Brief Analysis about the Basic Reduction Standards of Decision Table. In: Computational Intelligence in Decision and Control (World Scientific Proceedings Series on Computer Engineering and Information Science), Proceedings of the 8th International FLINS Conference on Computational Intelligence in Decision and Control, 575~580.
[13] Qin Keyun, Du Weifeng (2006). The Logic Characteristic of Knowledge Reduction. Computer Engineering and Applications, 42.
[14] Du Weifeng, Qin Keyun (2008). A Brief Analysis about Boundary Region Partition Reduction Standard. Computer Engineering and Applications, 44.
[15] Qin Keyun, Pei Zheng, Du Weifeng (2005). The relationship among several knowledge reduction approaches. In: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science), vol. 3613, Part I, Fuzzy Systems and Knowledge Discovery: Second International Conference, FSKD 2005, Proceedings.
[16] Bernard Kolman, Robert C. Busby, Sharon Cutler Ross (2005). Discrete Mathematical Structures (Fifth Edition). Higher Education Press.
More informationHome Page. Title Page. Page 1 of 35. Go Back. Full Screen. Close. Quit
JJ II J I Page 1 of 35 General Attribute Reduction of Formal Contexts Tong-Jun Li Zhejiang Ocean University, China litj@zjou.edu.cn September, 2011,University of Milano-Bicocca Page 2 of 35 Objective of
More informationDrawing Conclusions from Data The Rough Set Way
Drawing Conclusions from Data The Rough et Way Zdzisław Pawlak Institute of Theoretical and Applied Informatics, Polish Academy of ciences, ul Bałtycka 5, 44 000 Gliwice, Poland In the rough set theory
More informationMathematical Approach to Vagueness
International Mathematical Forum, 2, 2007, no. 33, 1617-1623 Mathematical Approach to Vagueness Angel Garrido Departamento de Matematicas Fundamentales Facultad de Ciencias de la UNED Senda del Rey, 9,
More informationIndex. C, system, 8 Cech distance, 549
Index PF(A), 391 α-lower approximation, 340 α-lower bound, 339 α-reduct, 109 α-upper approximation, 340 α-upper bound, 339 δ-neighborhood consistent, 291 ε-approach nearness, 558 C, 443-2 system, 8 Cech
More informationFeature Selection with Fuzzy Decision Reducts
Feature Selection with Fuzzy Decision Reducts Chris Cornelis 1, Germán Hurtado Martín 1,2, Richard Jensen 3, and Dominik Ślȩzak4 1 Dept. of Mathematics and Computer Science, Ghent University, Gent, Belgium
More informationBanacha Warszawa Poland s:
Chapter 12 Rough Sets and Rough Logic: A KDD Perspective Zdzis law Pawlak 1, Lech Polkowski 2, and Andrzej Skowron 3 1 Institute of Theoretical and Applied Informatics Polish Academy of Sciences Ba ltycka
More informationParameters to find the cause of Global Terrorism using Rough Set Theory
Parameters to find the cause of Global Terrorism using Rough Set Theory Sujogya Mishra Research scholar Utkal University Bhubaneswar-751004, India Shakti Prasad Mohanty Department of Mathematics College
More informationHigh Frequency Rough Set Model based on Database Systems
High Frequency Rough Set Model based on Database Systems Kartik Vaithyanathan kvaithya@gmail.com T.Y.Lin Department of Computer Science San Jose State University San Jose, CA 94403, USA tylin@cs.sjsu.edu
More informationResearch Article The Uncertainty Measure of Hierarchical Quotient Space Structure
Mathematical Problems in Engineering Volume 2011, Article ID 513195, 16 pages doi:10.1155/2011/513195 Research Article The Uncertainty Measure of Hierarchical Quotient Space Structure Qinghua Zhang 1,
More informationResearch Article Special Approach to Near Set Theory
Mathematical Problems in Engineering Volume 2011, Article ID 168501, 10 pages doi:10.1155/2011/168501 Research Article Special Approach to Near Set Theory M. E. Abd El-Monsef, 1 H. M. Abu-Donia, 2 and
More informationOn Rough Set Modelling for Data Mining
On Rough Set Modelling for Data Mining V S Jeyalakshmi, Centre for Information Technology and Engineering, M. S. University, Abhisekapatti. Email: vsjeyalakshmi@yahoo.com G Ariprasad 2, Fatima Michael
More informationMinimal Attribute Space Bias for Attribute Reduction
Minimal Attribute Space Bias for Attribute Reduction Fan Min, Xianghui Du, Hang Qiu, and Qihe Liu School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu
More informationOn Improving the k-means Algorithm to Classify Unclassified Patterns
On Improving the k-means Algorithm to Classify Unclassified Patterns Mohamed M. Rizk 1, Safar Mohamed Safar Alghamdi 2 1 Mathematics & Statistics Department, Faculty of Science, Taif University, Taif,
More informationRough Sets, Rough Relations and Rough Functions. Zdzislaw Pawlak. Warsaw University of Technology. ul. Nowowiejska 15/19, Warsaw, Poland.
Rough Sets, Rough Relations and Rough Functions Zdzislaw Pawlak Institute of Computer Science Warsaw University of Technology ul. Nowowiejska 15/19, 00 665 Warsaw, Poland and Institute of Theoretical and
More informationROUGH set methodology has been witnessed great success
IEEE TRANSACTIONS ON FUZZY SYSTEMS, VOL. 14, NO. 2, APRIL 2006 191 Fuzzy Probabilistic Approximation Spaces and Their Information Measures Qinghua Hu, Daren Yu, Zongxia Xie, and Jinfu Liu Abstract Rough
More informationFoundations of Classification
Foundations of Classification J. T. Yao Y. Y. Yao and Y. Zhao Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 {jtyao, yyao, yanzhao}@cs.uregina.ca Summary. Classification
More informationENTROPIES OF FUZZY INDISCERNIBILITY RELATION AND ITS OPERATIONS
International Journal of Uncertainty Fuzziness and Knowledge-Based Systems World Scientific ublishing Company ENTOIES OF FUZZY INDISCENIBILITY ELATION AND ITS OEATIONS QINGUA U and DAEN YU arbin Institute
More informationNested Epistemic Logic Programs
Nested Epistemic Logic Programs Kewen Wang 1 and Yan Zhang 2 1 Griffith University, Australia k.wang@griffith.edu.au 2 University of Western Sydney yan@cit.uws.edu.au Abstract. Nested logic programs and
More informationON SOME PROPERTIES OF ROUGH APPROXIMATIONS OF SUBRINGS VIA COSETS
ITALIAN JOURNAL OF PURE AND APPLIED MATHEMATICS N. 39 2018 (120 127) 120 ON SOME PROPERTIES OF ROUGH APPROXIMATIONS OF SUBRINGS VIA COSETS Madhavi Reddy Research Scholar, JNIAS Budhabhavan, Hyderabad-500085
More informationA Boolean Lattice Based Fuzzy Description Logic in Web Computing
A Boolean Lattice Based Fuzzy Description Logic in Web omputing hangli Zhang, Jian Wu, Zhengguo Hu Department of omputer Science and Engineering, Northwestern Polytechnical University, Xi an 7007, hina
More informationAndrzej Skowron, Zbigniew Suraj (Eds.) To the Memory of Professor Zdzisław Pawlak
Andrzej Skowron, Zbigniew Suraj (Eds.) ROUGH SETS AND INTELLIGENT SYSTEMS To the Memory of Professor Zdzisław Pawlak Vol. 1 SPIN Springer s internal project number, if known Springer Berlin Heidelberg
More informationA Logical Formulation of the Granular Data Model
2008 IEEE International Conference on Data Mining Workshops A Logical Formulation of the Granular Data Model Tuan-Fang Fan Department of Computer Science and Information Engineering National Penghu University
More informationInterpreting Low and High Order Rules: A Granular Computing Approach
Interpreting Low and High Order Rules: A Granular Computing Approach Yiyu Yao, Bing Zhou and Yaohua Chen Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 E-mail:
More informationOn rule acquisition in incomplete multi-scale decision tables
*Manuscript (including abstract) Click here to view linked References On rule acquisition in incomplete multi-scale decision tables Wei-Zhi Wu a,b,, Yuhua Qian c, Tong-Jun Li a,b, Shen-Ming Gu a,b a School
More informationRough Set Model Selection for Practical Decision Making
Rough Set Model Selection for Practical Decision Making Joseph P. Herbert JingTao Yao Department of Computer Science University of Regina Regina, Saskatchewan, Canada, S4S 0A2 {herbertj, jtyao}@cs.uregina.ca
More informationFuzzy Propositional Logic for the Knowledge Representation
Fuzzy Propositional Logic for the Knowledge Representation Alexander Savinov Institute of Mathematics Academy of Sciences Academiei 5 277028 Kishinev Moldova (CIS) Phone: (373+2) 73-81-30 EMAIL: 23LSII@MATH.MOLDOVA.SU
More informationarxiv: v1 [math.lo] 20 Oct 2007
ULTRA LI -IDEALS IN LATTICE IMPLICATION ALGEBRAS AND MTL-ALGEBRAS arxiv:0710.3887v1 [math.lo] 20 Oct 2007 Xiaohong Zhang, Ningbo, Keyun Qin, Chengdu, and Wieslaw A. Dudek, Wroclaw Abstract. A mistake concerning
More information1.4 Equivalence Relations and Partitions
24 CHAPTER 1. REVIEW 1.4 Equivalence Relations and Partitions 1.4.1 Equivalence Relations Definition 1.4.1 (Relation) A binary relation or a relation on a set S is a set R of ordered pairs. This is a very
More informationA Rough Set Interpretation of User s Web Behavior: A Comparison with Information Theoretic Measure
A Rough et Interpretation of User s Web Behavior: A Comparison with Information Theoretic easure George V. eghabghab Roane tate Dept of Computer cience Technology Oak Ridge, TN, 37830 gmeghab@hotmail.com
More informationAPPLICATION FOR LOGICAL EXPRESSION PROCESSING
APPLICATION FOR LOGICAL EXPRESSION PROCESSING Marcin Michalak, Michał Dubiel, Jolanta Urbanek Institute of Informatics, Silesian University of Technology, Gliwice, Poland Marcin.Michalak@polsl.pl ABSTRACT
More informationLecture 4: Proposition, Connectives and Truth Tables
Discrete Mathematics (II) Spring 2017 Lecture 4: Proposition, Connectives and Truth Tables Lecturer: Yi Li 1 Overview In last lecture, we give a brief introduction to mathematical logic and then redefine
More informationA Scientometrics Study of Rough Sets in Three Decades
A Scientometrics Study of Rough Sets in Three Decades JingTao Yao and Yan Zhang Department of Computer Science University of Regina [jtyao, zhang83y]@cs.uregina.ca Oct. 8, 2013 J. T. Yao & Y. Zhang A Scientometrics
More informationAn algorithm for induction of decision rules consistent with the dominance principle
An algorithm for induction of decision rules consistent with the dominance principle Salvatore Greco 1, Benedetto Matarazzo 1, Roman Slowinski 2, Jerzy Stefanowski 2 1 Faculty of Economics, University
More informationUncertain Satisfiability and Uncertain Entailment
Uncertain Satisfiability and Uncertain Entailment Zhuo Wang, Xiang Li Department of Mathematical Sciences, Tsinghua University, Beijing, 100084, China zwang0518@sohu.com, xiang-li04@mail.tsinghua.edu.cn
More informationApplications of Some Topological Near Open Sets to Knowledge Discovery
IJACS International Journal of Advanced Computer Science Applications Vol 7 No 1 216 Applications of Some Topological Near Open Sets to Knowledge Discovery A S Salama Tanta University; Shaqra University
More informationQuantization of Rough Set Based Attribute Reduction
A Journal of Software Engineering and Applications, 0, 5, 7 doi:46/sea05b0 Published Online Decemer 0 (http://wwwscirporg/ournal/sea) Quantization of Rough Set Based Reduction Bing Li *, Peng Tang, Tommy
More informationRough Sets and Conflict Analysis
Rough Sets and Conflict Analysis Zdzis law Pawlak and Andrzej Skowron 1 Institute of Mathematics, Warsaw University Banacha 2, 02-097 Warsaw, Poland skowron@mimuw.edu.pl Commemorating the life and work
More informationRough Set Approach for Generation of Classification Rules for Jaundice
Rough Set Approach for Generation of Classification Rules for Jaundice Sujogya Mishra 1, Shakti Prasad Mohanty 2, Sateesh Kumar Pradhan 3 1 Research scholar, Utkal University Bhubaneswar-751004, India
More informationMatching Index of Uncertain Graph: Concept and Algorithm
Matching Index of Uncertain Graph: Concept and Algorithm Bo Zhang, Jin Peng 2, School of Mathematics and Statistics, Huazhong Normal University Hubei 430079, China 2 Institute of Uncertain Systems, Huanggang
More informationOn the Structure of Rough Approximations
On the Structure of Rough Approximations (Extended Abstract) Jouni Järvinen Turku Centre for Computer Science (TUCS) Lemminkäisenkatu 14 A, FIN-20520 Turku, Finland jjarvine@cs.utu.fi Abstract. We study
More informationClassification of Voice Signals through Mining Unique Episodes in Temporal Information Systems: A Rough Set Approach
Classification of Voice Signals through Mining Unique Episodes in Temporal Information Systems: A Rough Set Approach Krzysztof Pancerz, Wies law Paja, Mariusz Wrzesień, and Jan Warcho l 1 University of
More informationResearch Article A Variable Precision Attribute Reduction Approach in Multilabel Decision Tables
e Scientific World Journal, Article ID 359626, 7 pages http://dx.doi.org/10.1155/2014/359626 Research Article A Variable Precision Attribute Reduction Approach in Multilabel Decision Tables Hua Li, 1,2
More informationSIMPLIFIED MARGINAL LINEARIZATION METHOD IN AUTONOMOUS LIENARD SYSTEMS
italian journal of pure and applied mathematics n. 30 03 (67 78) 67 SIMPLIFIED MARGINAL LINEARIZATION METHOD IN AUTONOMOUS LIENARD SYSTEMS Weijing Zhao Faculty of Electronic Information and Electrical
More informationData Analysis - the Rough Sets Perspective
Data Analysis - the Rough ets Perspective Zdzisław Pawlak Institute of Computer cience Warsaw University of Technology 00-665 Warsaw, Nowowiejska 15/19 Abstract: Rough set theory is a new mathematical
More informationLogic, Sets, and Proofs
Logic, Sets, and Proofs David A. Cox and Catherine C. McGeoch Amherst College 1 Logic Logical Operators. A logical statement is a mathematical statement that can be assigned a value either true or false.
More informationLecture 1: Lattice(I)
Discrete Mathematics (II) Spring 207 Lecture : Lattice(I) Lecturer: Yi Li Lattice is a special algebra structure. It is also a part of theoretic foundation of model theory, which formalizes the semantics
More informationComparison of Rough-set and Interval-set Models for Uncertain Reasoning
Yao, Y.Y. and Li, X. Comparison of rough-set and interval-set models for uncertain reasoning Fundamenta Informaticae, Vol. 27, No. 2-3, pp. 289-298, 1996. Comparison of Rough-set and Interval-set Models
More informationcse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska
cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska LECTURE 1 Course Web Page www3.cs.stonybrook.edu/ cse303 The webpage contains: lectures notes slides; very detailed solutions to
More informationPAC Learning. prof. dr Arno Siebes. Algorithmic Data Analysis Group Department of Information and Computing Sciences Universiteit Utrecht
PAC Learning prof. dr Arno Siebes Algorithmic Data Analysis Group Department of Information and Computing Sciences Universiteit Utrecht Recall: PAC Learning (Version 1) A hypothesis class H is PAC learnable
More informationComputational Intelligence, Volume, Number, VAGUENES AND UNCERTAINTY: A ROUGH SET PERSPECTIVE. Zdzislaw Pawlak
Computational Intelligence, Volume, Number, VAGUENES AND UNCERTAINTY: A ROUGH SET PERSPECTIVE Zdzislaw Pawlak Institute of Computer Science, Warsaw Technical University, ul. Nowowiejska 15/19,00 665 Warsaw,
More informationIndex. Cambridge University Press Relational Knowledge Discovery M E Müller. Index. More information
s/r. See quotient, 93 R, 122 [x] R. See class, equivalence [[P Q]]s, 142 =, 173, 164 A α, 162, 178, 179 =, 163, 193 σ RES, 166, 22, 174 Ɣ, 178, 179, 175, 176, 179 i, 191, 172, 21, 26, 29 χ R. See rough
More informationRough Sets for Uncertainty Reasoning
Rough Sets for Uncertainty Reasoning S.K.M. Wong 1 and C.J. Butz 2 1 Department of Computer Science, University of Regina, Regina, Canada, S4S 0A2, wong@cs.uregina.ca 2 School of Information Technology
More informationROUGH SET THEORY FOR INTELLIGENT INDUSTRIAL APPLICATIONS
ROUGH SET THEORY FOR INTELLIGENT INDUSTRIAL APPLICATIONS Zdzisław Pawlak Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, Poland, e-mail: zpw@ii.pw.edu.pl ABSTRACT Application
More informationNotes on Rough Set Approximations and Associated Measures
Notes on Rough Set Approximations and Associated Measures Yiyu Yao Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 E-mail: yyao@cs.uregina.ca URL: http://www.cs.uregina.ca/
More informationFoundations of Mathematics MATH 220 FALL 2017 Lecture Notes
Foundations of Mathematics MATH 220 FALL 2017 Lecture Notes These notes form a brief summary of what has been covered during the lectures. All the definitions must be memorized and understood. Statements
More informationHarvard CS 121 and CSCI E-121 Lecture 22: The P vs. NP Question and NP-completeness
Harvard CS 121 and CSCI E-121 Lecture 22: The P vs. NP Question and NP-completeness Harry Lewis November 19, 2013 Reading: Sipser 7.4, 7.5. For culture : Computers and Intractability: A Guide to the Theory
More informationREDUCTS AND ROUGH SET ANALYSIS
REDUCTS AND ROUGH SET ANALYSIS A THESIS SUBMITTED TO THE FACULTY OF GRADUATE STUDIES AND RESEARCH IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE IN COMPUTER SCIENCE UNIVERSITY
More informationModel Complexity of Pseudo-independent Models
Model Complexity of Pseudo-independent Models Jae-Hyuck Lee and Yang Xiang Department of Computing and Information Science University of Guelph, Guelph, Canada {jaehyuck, yxiang}@cis.uoguelph,ca Abstract
More informationTransversal and Function Matroidal Structures of Covering-Based Rough Sets
Transversal and Function Matroidal Structures of Covering-Based Rough Sets Shiping Wang 1, William Zhu 2,,andFanMin 2 1 School of Mathematical Sciences, University of Electronic Science and Technology
More informationRough Set Approaches for Discovery of Rules and Attribute Dependencies
Rough Set Approaches for Discovery of Rules and Attribute Dependencies Wojciech Ziarko Department of Computer Science University of Regina Regina, SK, S4S 0A2 Canada Abstract The article presents an elementary
More informationUncertain Fuzzy Rough Sets. LI Yong-jin 1 2
Uncertain Fuzzy Rough Sets LI Yong-jin 1 2 (1. The Institute of Logic and Cognition, Zhongshan University, Guangzhou 510275, China; 2. Department of Mathematics, Zhongshan University, Guangzhou 510275,
More informationUNIVERSITY OF PUNE, PUNE BOARD OF STUDIES IN MATHEMATICS SYLLABUS. F.Y.BSc (Computer Science) Paper-I Discrete Mathematics First Term
UNIVERSITY OF PUNE, PUNE 411007. BOARD OF STUDIES IN MATHEMATICS SYLLABUS F.Y.BSc (Computer Science) Paper-I Discrete Mathematics First Term 1) Finite Induction (4 lectures) 1.1) First principle of induction.
More informationRough operations on Boolean algebras
Rough operations on Boolean algebras Guilin Qi and Weiru Liu School of Computer Science, Queen s University Belfast Belfast, BT7 1NN, UK Abstract In this paper, we introduce two pairs of rough operations
More informationNear approximations via general ordered topological spaces M.Abo-Elhamayel Mathematics Department, Faculty of Science Mansoura University
Near approximations via general ordered topological spaces MAbo-Elhamayel Mathematics Department, Faculty of Science Mansoura University Abstract ough set theory is a new mathematical approach to imperfect
More informationPUBLICATIONS OF CECYLIA RAUSZER
PUBLICATIONS OF CECYLIA RAUSZER [CR1] Representation theorem for semi-boolean algebras I, Bull. Acad. Polon. Sci., Sér. Sci. Math. Astronom. Phys. 19(1971), 881 887. [CR2] Representation theorem for semi-boolean
More informationApproximate Boolean Reasoning: Foundations and Applications in Data Mining
Approximate Boolean Reasoning: Foundations and Applications in Data Mining Hung Son Nguyen Institute of Mathematics, Warsaw University Banacha 2, 02-097 Warsaw, Poland son@mimuw.edu.pl Table of Contents
More informationROUGH SETS THEORY AND DATA REDUCTION IN INFORMATION SYSTEMS AND DATA MINING
ROUGH SETS THEORY AND DATA REDUCTION IN INFORMATION SYSTEMS AND DATA MINING Mofreh Hogo, Miroslav Šnorek CTU in Prague, Departement Of Computer Sciences And Engineering Karlovo Náměstí 13, 121 35 Prague
More informationConcept Lattices in Rough Set Theory
Concept Lattices in Rough Set Theory Y.Y. Yao Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 E-mail: yyao@cs.uregina.ca URL: http://www.cs.uregina/ yyao Abstract
More informationOptimization Models for Detection of Patterns in Data
Optimization Models for Detection of Patterns in Data Igor Masich, Lev Kazakovtsev, and Alena Stupina Siberian State University of Science and Technology, Krasnoyarsk, Russia is.masich@gmail.com Abstract.
More informationEnsembles of classifiers based on approximate reducts
Fundamenta Informaticae 34 (2014) 1 10 1 IOS Press Ensembles of classifiers based on approximate reducts Jakub Wróblewski Polish-Japanese Institute of Information Technology and Institute of Mathematics,
More informationBhubaneswar , India 2 Department of Mathematics, College of Engineering and
www.ijcsi.org 136 ROUGH SET APPROACH TO GENERATE CLASSIFICATION RULES FOR DIABETES Sujogya Mishra 1, Shakti Prasad Mohanty 2, Sateesh Kumar Pradhan 3 1 Research scholar, Utkal University Bhubaneswar-751004,
More informationWarm-Up Problem. Write a Resolution Proof for. Res 1/32
Warm-Up Problem Write a Resolution Proof for Res 1/32 A second Rule Sometimes throughout we need to also make simplifications: You can do this in line without explicitly mentioning it (just pretend you
More informationMachine Learning 2010
Machine Learning 2010 Decision Trees Email: mrichter@ucalgary.ca -- 1 - Part 1 General -- 2 - Representation with Decision Trees (1) Examples are attribute-value vectors Representation of concepts by labeled
More informationCS 486: Applied Logic Lecture 7, February 11, Compactness. 7.1 Compactness why?
CS 486: Applied Logic Lecture 7, February 11, 2003 7 Compactness 7.1 Compactness why? So far, we have applied the tableau method to propositional formulas and proved that this method is sufficient and
More informationLecture 13: Soundness, Completeness and Compactness
Discrete Mathematics (II) Spring 2017 Lecture 13: Soundness, Completeness and Compactness Lecturer: Yi Li 1 Overview In this lecture, we will prvoe the soundness and completeness of tableau proof system,
More information8. Reductio ad absurdum
8. Reductio ad absurdum 8.1 A historical example In his book, The Two New Sciences, Galileo Galilea (1564-1642) gives several arguments meant to demonstrate that there can be no such thing as actual infinities
More informationDiscovery of Concurrent Data Models from Experimental Tables: A Rough Set Approach
From: KDD-95 Proceedings. Copyright 1995, AAAI (www.aaai.org). All rights reserved. Discovery of Concurrent Data Models from Experimental Tables: A Rough Set Approach Andrzej Skowronl* and Zbigniew Suraj2*
More informationSemantic Rendering of Data Tables: Multivalued Information Systems Revisited
Semantic Rendering of Data Tables: Multivalued Information Systems Revisited Marcin Wolski 1 and Anna Gomolińska 2 1 Maria Curie-Skłodowska University, Department of Logic and Cognitive Science, Pl. Marii
More informationDecision tables and decision spaces
Abstract Decision tables and decision spaces Z. Pawlak 1 Abstract. In this paper an Euclidean space, called a decision space is associated with ever decision table. This can be viewed as a generalization
More informationAn Absorbing Markov Chain Model for Problem-Solving
American Journal of Applied Mathematics and Statistics, 2016, Vol. 4, No. 6, 173-177 Available online at http://pubs.sciepub.com/ajams/4/6/2 Science and Education Publishing DOI:10.12691/ajams-4-6-2 An
More informationSome Properties of a Set-valued Homomorphism on Modules
2012, TextRoad Publication ISSN 2090-4304 Journal of Basic and Applied Scientific Research www.textroad.com Some Properties of a Set-valued Homomorphism on Modules S.B. Hosseini 1, M. Saberifar 2 1 Department
More informationSimilarity-based Classification with Dominance-based Decision Rules
Similarity-based Classification with Dominance-based Decision Rules Marcin Szeląg, Salvatore Greco 2,3, Roman Słowiński,4 Institute of Computing Science, Poznań University of Technology, 60-965 Poznań,
More informationQualifying Exam in Machine Learning
Qualifying Exam in Machine Learning October 20, 2009 Instructions: Answer two out of the three questions in Part 1. In addition, answer two out of three questions in two additional parts (choose two parts
More information