International Journal of Approximate Reasoning


International Journal of Approximate Reasoning 53 (2012)

Rough sets based matrix approaches with dynamic attribute variation in set-valued information systems

Junbo Zhang a, Tianrui Li a,*, Da Ruan b,c, Dun Liu d

a School of Information Science and Technology, Southwest Jiaotong University, Chengdu 610031, China
b Belgian Nuclear Research Centre (SCK·CEN), Boeretang 200, 2400 Mol, Belgium
c Department of Applied Mathematics & Computer Science, Ghent University, 9000 Gent, Belgium
d School of Economics and Management, Southwest Jiaotong University, Chengdu 610031, China

ARTICLE INFO

Article history: Received 22 June 2011; Received in revised form 9 October 2011; Accepted 3 January 2012; Available online 9 January 2012

Keywords: Rough sets; Knowledge discovery; Matrix; Set-valued information systems

ABSTRACT

Set-valued information systems are generalized models of single-valued information systems. The attribute set in a set-valued information system may evolve over time when new information arrives. Approximations of a concept by rough set theory then need updating for knowledge discovery or other related tasks. Based on a matrix representation of rough set approximations, a basic vector H(X) is induced from the relation matrix. Four cut matrices of H(X), denoted by H_[μ,ν](X), H_(μ,ν](X), H_[μ,ν)(X) and H_(μ,ν)(X), are derived for computing the approximations and the positive, boundary and negative regions intuitively. The variation of the relation matrix is discussed while the system varies over time. Incremental approaches for updating the relation matrix are proposed to update rough set approximations. The algorithms corresponding to the incremental approaches are presented. Extensive experiments on different data sets from UCI and user-defined data sets show that the proposed incremental approaches effectively reduce the computational time in comparison with the non-incremental approach.
© 2012 Elsevier Inc. All rights reserved.

1. Introduction

Rough set theory (RST) was proposed by Pawlak [1-4] in the early 1980s. It is a powerful mathematical tool to describe the dependencies among attributes, evaluate the significance of attributes, and derive decision rules. Nowadays, many rough-set-based approaches have been successfully applied in machine learning and data mining. They have been found to be particularly useful for rule induction [5-7] and feature subset selection [8-12].

An information system is a quadruple IS = (U, A, V, f), where U is a non-empty finite set of objects; A is a non-empty finite set of attributes; V = ∪_{a∈A} V_a, where V_a is the domain of attribute a; and f: U × A → V is an information function such that f(x, a) ∈ V_a for every x ∈ U, a ∈ A. In many practical settings, some of the attribute values for an object are set-valued; such values are often used to characterize uncertain or missing information in the information system [13]. For an information system IS = (U, A, V, f), if each attribute has a unique attribute value, then IS is called a single-valued information system; otherwise it is called a set-valued (multi-valued) information system [14,15]. Due to the uncertainty and incompleteness of information, an information system may contain a great deal of missing data, in which case it is called an incomplete information system [16,17]. Such information systems can be regarded as a special case of set-valued information systems, in which every missing value is represented by the set of all possible values of the corresponding attribute [15].

* Corresponding author.
E-mail addresses: JunboZhang86@163.com, jbzhang86@gmail.com (J. Zhang), trli@swjtu.edu.cn (T. Li), druan@sckcen.be, da.ruan@ugent.be (D. Ruan), newton83@163.com (D. Liu).

0888-613X/$ - see front matter © 2012 Elsevier Inc. All rights reserved.

Due to the dynamic characteristics of data collection, an information system varies with time. Retraining the system from scratch whenever the information system varies is known as the non-incremental approach [18]. However, the non-incremental approach is often very costly or even intractable. Alternatively, one can apply an incremental updating scheme, an effective method for maintaining knowledge dynamically, to avoid unnecessary computations by utilizing previous data structures or results. Incremental updating has been applied successfully to data analysis in real-time applications and in applications with limited memory or computing ability. Since an information system consists of the attributes (features), the objects (instances), and the domains of attribute values, incremental updating approaches for rough sets have been proposed along these three lines.

Incremental updating approaches under the variation of the attribute set. Chan proposed an incremental method for updating the approximations of a concept based on the lower and upper boundary sets [19]. Li et al. presented an incremental method for updating approximations of a concept in an incomplete information system through the characteristic relation [20]. Qian et al. introduced a theoretic framework based on RST, called positive approximation, which can be used to accelerate a heuristic process of attribute reduction [9]. Cheng presented two incremental methods for fast computation of rough fuzzy approximations, based on boundary sets and cut sets, respectively [21].

Incremental updating approaches under the variation of the object set. Shan and Ziarko presented a discernibility-matrix based incremental methodology to find all maximally generalized rules [5]. Liu et al. proposed an incremental model and approach, as well as its algorithm, for inducing interesting knowledge when the object set varies over time [6].
Furthermore, in business intelligent information systems, Liu et al. presented an optimization-based incremental approach, as well as its algorithm, for inducing interesting knowledge [7].

Incremental updating approaches under the variation of the attribute values. Chen et al. proposed an incremental algorithm for updating the approximations of a concept under coarsening or refining of attribute values [22].

Generally, the computation of approximations is a necessary step in rough-set-based knowledge representation and reduction, and the approximations may further be applied to data mining and machine learning tasks. To the best of our knowledge, most incremental algorithms focus on single-valued information systems and do not consider the situation of set-valued information systems. To compute and update approximations effectively in set-valued information systems, it is therefore interesting and desirable to investigate incremental approaches for this kind of information system. To incrementally update the approximations of a set, the boundary set of a set was taken as a springboard in [19,20]; in addition, Cheng took the boundary set and cut sets as springboards [21]. By using the inner product method and the matrix method, Liu proposed a unified axiomatic system to characterize the upper approximation operations [23]. In this paper, we define a basic vector generated by the relation matrix to derive the lower and upper approximations directly in the set-valued information system. We then present approaches for updating the lower and upper approximations incrementally by means of the variation of the relation matrix.

The remainder of this paper is organized as follows. Section 2 introduces the basic concepts of RST in the set-valued information system. Section 3 proposes the matrix characterizations of the lower and upper approximations. Section 4 presents two approaches for updating the relation matrix and gives some illustrative examples.
Section 5 proposes the non-incremental and incremental algorithms. In Section 6, the performances of the non-incremental and incremental methods are evaluated on UCI and user-defined data. The paper ends with conclusions in Section 7.

2. Preliminaries

In this section, we outline basic concepts, notations and results for set-valued information systems [1,14,15,24].

Let IS = (U, A, V, f) be a set-valued information system, where U is a non-empty finite set of objects; A is a non-empty finite set of attributes; V is the set of attribute values; and f is a mapping from U × A to V such that f: U × A → 2^V is a set-valued mapping. Table 1 illustrates a set-valued information system, where U = {x1, x2, x3, x4, x5, x6}, A = {a1, a2, a3, a4} and V = {0, 1, 2}.

A set-valued decision information system (U, C ∪ {d}, V, f) is a special case of a set-valued information system, where U is a non-empty finite set of objects; C is a non-empty finite set of condition attributes and d is a decision attribute with C ∩ {d} = Ø; V = V_C ∪ V_d, where V_C is the set of condition attribute values and V_d is the set of decision attribute values;

Table 1. A set-valued information system.

U    a1         a2         a3        a4
x1   {1}        {1}        {1, 2}    {1, 2}
x2   {0, 1, 2}  {0, 1, 2}  {1, 2}    {0, 1, 2}
x3   {0, 2}     {0}        {0, 2}    {1, 2}
x4   {0, 1}     {1, 2}     {1, 2}    {1}
x5   {0, 2}     {0, 2}     {0, 2}    {1}
x6   {0}        {0}        {0, 1}    {0, 1}

f is a mapping from U × (C ∪ {d}) to V such that f: U × C → 2^{V_C} is a set-valued mapping and f: U × {d} → V_d is a single-valued mapping.

The following are two common ways to give a semantic interpretation of set-valued information systems [14,15,25-28].

Type A: For x ∈ U, b ∈ A, f(x, b) is interpreted conjunctively. For instance, if b is the attribute "speaking a language", then f(x, b) = {Chinese, Korean, Japanese} can be interpreted as: x speaks Chinese, Korean, and Japanese. Furthermore, when considering the attribute "feeding habits" of animals, if we denote the attribute value of herbivore by "0" and of carnivore by "1", then animals possessing the attribute value {0, 1} are considered to be of both herbivorous and carnivorous nature.

Type B: For x ∈ U, b ∈ A, f(x, b) is interpreted disjunctively. For instance, if b is the attribute "speaking a language", then f(x, b) = {Chinese, Korean, Japanese} can be interpreted as: x speaks Chinese, Korean or Japanese, and x can speak only one of them. Incomplete information systems with some unknown or partially known attribute values [16,17] are set-valued information systems of this type.

Definition 1 [14,29]. In the set-valued information system (U, A, V, f), for b ∈ A, the tolerance relation T_b is defined as

T_b = {(x, y) | f(x, b) ∩ f(y, b) ≠ Ø}, (1)

and for B ⊆ A, the tolerance relation T_B is defined as

T_B = {(x, y) | ∀ b ∈ B, f(x, b) ∩ f(y, b) ≠ Ø} = ∩_{b∈B} T_b. (2)

When (x, y) ∈ T_B, we say that x and y are indiscernible, or that x is tolerant with y, w.r.t. B. Let T_B(x) = {y | y ∈ U, (y, x) ∈ T_B}; we call T_B(x) the tolerance class of x w.r.t. T_B.

Example 1. A set-valued information system is presented in Table 1. Let B = A. The tolerance classes of the objects in U can be computed by Definition 1: T_B(x1) = {x1, x2, x4}, T_B(x2) = {x1, x2, x3, x4, x5, x6}, T_B(x3) = T_B(x6) = {x2, x3, x5, x6}, T_B(x4) = {x1, x2, x4, x5}, T_B(x5) = {x2, x3, x4, x5, x6}.

Definition 2 [29].
Given a set-valued information system IS = (U, A, V, f), X ⊆ U, B ⊆ A, the lower and upper approximations of X in terms of the tolerance relation T_B are defined as

\underline{T}_B(X) = {x ∈ U | T_B(x) ⊆ X}, (3)
\overline{T}_B(X) = {x ∈ U | T_B(x) ∩ X ≠ Ø}. (4)

Intuitively, these two approximations divide the universe U into three disjoint regions: the positive region POS_{T_B}(X), the boundary region BND_{T_B}(X) and the negative region NEG_{T_B}(X), respectively:

POS_{T_B}(X) = \underline{T}_B(X),
BND_{T_B}(X) = \overline{T}_B(X) − \underline{T}_B(X), (5)
NEG_{T_B}(X) = U − \overline{T}_B(X).

The positive region is defined by the lower approximation, the boundary region by the difference between the upper and lower approximations, and the negative region by the complement of the upper approximation.

Example 2 (Continuation of Example 1). Let X = {x1, x2, x3, x4, x5}. Since T_B(x1) ⊆ X, T_B(x2) ⊄ X, T_B(x3) ⊄ X, T_B(x4) ⊆ X, T_B(x5) ⊄ X and T_B(x6) ⊄ X, we have \underline{T}_B(X) = {x1, x4}; since T_B(x_i) ∩ X ≠ Ø for every i ∈ {1, 2, 3, 4, 5, 6}, we have \overline{T}_B(X) = {x1, x2, x3, x4, x5, x6}. Hence POS_{T_B}(X) = \underline{T}_B(X) = {x1, x4}, BND_{T_B}(X) = \overline{T}_B(X) − \underline{T}_B(X) = {x2, x3, x5, x6} and NEG_{T_B}(X) = U − \overline{T}_B(X) = Ø.

However, the above definitions are not robust enough to tolerate noisy samples in real applications. Following the idea of probabilistic rough sets [30], rough sets can also be generalized by introducing a measure of inclusion degree. Given two sets X and Y in the universe U, the inclusion degree of X in Y is defined as I(X, Y) = |X ∩ Y| / |X|, where |·| stands for the cardinality of a set and X ≠ Ø.
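Definitions 1 and 2 can be sketched directly in Python. The concrete set values below are a reconstruction consistent with Examples 1 and 2 (the entries of Table 1 are only determined up to a relabeling of values per attribute), so treat them as illustrative rather than authoritative.

```python
# Sketch of the tolerance relation T_B (Definition 1) and the lower/upper
# approximations (Definition 2) on a reconstruction of Table 1.
U = ["x1", "x2", "x3", "x4", "x5", "x6"]
table = {  # f(x, a) for a1..a4, with V = {0, 1, 2}; entries are illustrative
    "x1": [{1}, {1}, {1, 2}, {1, 2}],
    "x2": [{0, 1, 2}, {0, 1, 2}, {1, 2}, {0, 1, 2}],
    "x3": [{0, 2}, {0}, {0, 2}, {1, 2}],
    "x4": [{0, 1}, {1, 2}, {1, 2}, {1}],
    "x5": [{0, 2}, {0, 2}, {0, 2}, {1}],
    "x6": [{0}, {0}, {0, 1}, {0, 1}],
}
B = [0, 1, 2, 3]  # indices of the attributes a1..a4

def tolerant(x, y):
    """(x, y) in T_B iff f(x, b) and f(y, b) overlap for every b in B."""
    return all(table[x][b] & table[y][b] for b in B)

def tolerance_class(x):
    return {y for y in U if tolerant(x, y)}

def lower(X):   # T_B(x) contained in X
    return {x for x in U if tolerance_class(x) <= X}

def upper(X):   # T_B(x) meets X
    return {x for x in U if tolerance_class(x) & X}

X = {"x1", "x2", "x3", "x4", "x5"}
print(sorted(tolerance_class("x1")))       # Example 1: ['x1', 'x2', 'x4']
print(sorted(lower(X)), sorted(upper(X)))  # Example 2: {x1, x4}; all of U
```

Running the sketch reproduces the tolerance classes of Example 1 and the approximations of Example 2.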

Definition 3. Given a set-valued information system IS = (U, A, V, f), X ⊆ U, B ⊆ A, the (α, β)-probabilistic lower and upper approximations of X in terms of the tolerance relation T_B are defined as

\underline{T}_B^{(α,β)}(X) = {x ∈ U | I(T_B(x), X) ≥ α}, (6)
\overline{T}_B^{(α,β)}(X) = {x ∈ U | I(T_B(x), X) > β}. (7)

Similarly, the (α, β)-probabilistic positive, boundary and negative regions can be defined by the (α, β)-probabilistic lower and upper approximations, respectively:

POS^{(α,β)}(X) = {x ∈ U | I(T_B(x), X) ≥ α},
BND^{(α,β)}(X) = {x ∈ U | β < I(T_B(x), X) < α}, (8)
NEG^{(α,β)}(X) = {x ∈ U | I(T_B(x), X) ≤ β}.

Clearly, the parameters α and β allow a certain acceptable level of error, and the probabilistic rough set model makes the process of decision making more reasonable. In particular, it degrades to the classical rough set model if α = 1 and β = 0. In addition, the values of α and β can be computed automatically by using the Bayesian minimum conditional risk criterion in the decision-theoretic rough set model [31].

3. Matrix representation of the approximations

In this section, we give the matrix representation of the lower and upper approximations in the set-valued information system, and propose a matrix approach for computing the approximations.

Definition 4. Given two μ × ν matrices Y = (y_ij)_{μ×ν} and Z = (z_ij)_{μ×ν}, the preference relations "≽" and "≼" and the minimum operation are defined as follows.
(1) Preference relation "≽": Y ≽ Z if and only if y_ij ≥ z_ij, i = 1, 2, ..., μ, j = 1, 2, ..., ν;
(2) Preference relation "≼": Y ≼ Z if and only if y_ij ≤ z_ij, i = 1, 2, ..., μ, j = 1, 2, ..., ν;
(3) Minimum min: min(Y, Z) = (min(y_ij, z_ij))_{μ×ν}.

Definition 5 [23]. Let U = {x1, x2, ..., xn}, and let X be a subset of U. The characteristic function G(X) = (g1, g2, ..., gn)^T (T denotes the transpose operation) is defined as

g_i = 1 if x_i ∈ X, and g_i = 0 if x_i ∉ X, (9)

i.e., G(X) assigns 1 to an element that belongs to X and 0 to an element that does not belong to X.

Example 3.
Let U = {x1, x2, x3, x4, x5, x6} and X = {x1, x2, x3, x4, x5}. Then G(X) = (1, 1, 1, 1, 1, 0)^T.

Definition 6 [32]. Given a set-valued information system IS = (U, A, V, f), let b ∈ A, let T_b be a tolerance relation on U, and let M^{T_b}_{n×n} = (ϕ_ij)_{n×n} be the n × n matrix representing T_b, called the relation matrix w.r.t. b. Then

ϕ_ij = 1 if (x_i, x_j) ∈ T_b, and ϕ_ij = 0 if (x_i, x_j) ∉ T_b. (10)

Definition 7 [32]. Let B ⊆ A, let T_B be a tolerance relation on U, and let M^{T_B}_{n×n} = (m_ij)_{n×n} be the n × n matrix representing T_B, called the relation matrix w.r.t. B. Then

m_ij = 1 if (x_i, x_j) ∈ T_B, and m_ij = 0 if (x_i, x_j) ∉ T_B. (11)

Corollary 1 [33]. Let M^{T_B}_{n×n} = (m_ij)_{n×n}, where T_B is a tolerance relation on U. Then m_ii = 1 and m_ij = m_ji, 1 ≤ i, j ≤ n.
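Definitions 5–7 and Corollary 1 amount to encoding X as a 0/1 vector and T_B as a reflexive, symmetric 0/1 matrix. A minimal sketch, taking the tolerance classes of Example 1 as given:

```python
# Characteristic vector G(X) (Definition 5) and relation matrix M = (m_ij)
# (Definition 7), built from the tolerance classes of Example 1.
U = ["x1", "x2", "x3", "x4", "x5", "x6"]
T = {  # tolerance classes T_B(x) from Example 1
    "x1": {"x1", "x2", "x4"},
    "x2": set(U),
    "x3": {"x2", "x3", "x5", "x6"},
    "x4": {"x1", "x2", "x4", "x5"},
    "x5": {"x2", "x3", "x4", "x5", "x6"},
    "x6": {"x2", "x3", "x5", "x6"},
}

def G(X):
    """n-column characteristic vector of X within U."""
    return [1 if x in X else 0 for x in U]

# m_ij = 1 iff (x_i, x_j) is in T_B, i.e. x_j lies in the class of x_i
M = [[1 if y in T[x] else 0 for y in U] for x in U]

# Corollary 1: the matrix is reflexive and symmetric
assert all(M[i][i] == 1 for i in range(6))
assert all(M[i][j] == M[j][i] for i in range(6) for j in range(6))
print(G({"x1", "x2", "x3", "x4", "x5"}))  # Example 3: [1, 1, 1, 1, 1, 0]
```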

Lemma 1. Let b1, b2 ∈ A and let T_{{b1,b2}}, T_{b1}, T_{b2} be the corresponding tolerance relations on U. Then M^{T_{{b1,b2}}}_{n×n} = min{M^{T_{b1}}_{n×n}, M^{T_{b2}}_{n×n}}.

Corollary 2. Let B ⊆ A. Then M^{T_B}_{n×n} = min_{b∈B} M^{T_b}_{n×n}.

Example 4. We continue Example 1. Let B = A = {a1, a2, a3, a4}. Then

M^{T_{a1}}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[0 1 1 1 1 1]

M^{T_{a2}}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 0 1 1]
[1 1 0 1 1 0]
[0 1 1 1 1 1]
[0 1 1 0 1 1]

and M^{T_{a3}}_{6×6} = M^{T_{a4}}_{6×6} are the 6 × 6 all-ones matrices. Here it is easy to verify that M^{T_B}_{6×6} = min{M^{T_{a1}}_{6×6}, M^{T_{a2}}_{6×6}, M^{T_{a3}}_{6×6}, M^{T_{a4}}_{6×6}}, and

M^{T_B}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 0 1 1]
[1 1 0 1 1 0]
[0 1 1 1 1 1]
[0 1 1 0 1 1]

Definition 8. Let B ⊆ A, let T_B be a tolerance relation on U, and let Λ^{T_B}_{n×n} be the induced diagonal matrix of M^{T_B}_{n×n} = (m_ij)_{n×n}:

Λ^{T_B}_{n×n} = diag(1/λ1, 1/λ2, ..., 1/λn), (12)

where λ_i = Σ_{j=1}^n m_ij, 1 ≤ i ≤ n.

Corollary 3. λ_i = |T_B(x_i)|, so that Λ^{T_B}_{n×n} = diag(1/|T_B(x1)|, 1/|T_B(x2)|, ..., 1/|T_B(xn)|), and 1 ≤ λ_i ≤ n, 1 ≤ i ≤ n.

Example 5 (Continuation of Example 4). Here we compute the induced diagonal matrix of M^{T_B}_{6×6} by Definition 8: Λ^{T_B}_{6×6} = diag(1/3, 1/6, 1/4, 1/4, 1/5, 1/4).

Definition 9. The n-column vector called the basic vector, denoted by H(X), is defined as

H(X) = Λ^{T_B}_{n×n} • (M^{T_B}_{n×n} • G(X)) = Λ^{T_B}_{n×n} • Ω^{T_B}_n, (13)

where • is the dot product of matrices and Ω^{T_B}_n = M^{T_B}_{n×n} • G(X).

Corollary 4. Suppose H(X) = (h1, h2, ..., hn)^T. Let Λ^{T_B}_{n×n} = diag(1/λ1, 1/λ2, ..., 1/λn), M^{T_B}_{n×n} = (m_ij)_{n×n}, G(X) = (g1, g2, ..., gn)^T and Ω^{T_B}_n = (ω1, ω2, ..., ωn)^T, where λ_i = Σ_{j=1}^n m_ij, 1 ≤ i ≤ n. Then, for i ∈ {1, 2, ..., n}, ω_i = Σ_{j=1}^n m_ij g_j, h_i = ω_i / λ_i = Σ_{j=1}^n m_ij g_j / Σ_{j=1}^n m_ij, and 0 ≤ h_i ≤ 1.
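Corollary 4 reduces the basic vector to row-wise ratios: h_i is the fraction of x_i's tolerance class lying in X. A minimal sketch of Definitions 8–9, using the relation matrix M^{T_B} of Example 4 (exact fractions are kept with the standard library's Fraction type):

```python
from fractions import Fraction

# Basic vector H(X) = Λ · (M · G(X)) (Definition 9, Corollary 4).
# M is M^{T_B} from Example 4; its row sums give the diagonal of Example 5.
M = [
    [1, 1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 1, 1, 0, 1, 1],
    [1, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 1, 1],
    [0, 1, 1, 0, 1, 1],
]
GX = [1, 1, 1, 1, 1, 0]                  # G(X) for X = {x1, ..., x5}

lam = [sum(row) for row in M]            # λ_i = Σ_j m_ij (Definition 8)
omega = [sum(m * g for m, g in zip(row, GX)) for row in M]  # Ω = M · G(X)
H = [Fraction(w, l) for w, l in zip(omega, lam)]            # h_i = ω_i / λ_i

print(omega)  # (3, 5, 3, 4, 4, 3)^T, as in Example 6
print(H)      # (1, 5/6, 3/4, 1, 4/5, 3/4)^T
```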

Definition 10. Let 0 ≤ μ ≤ ν ≤ 1. Four cut matrices of H(X), denoted by H_[μ,ν](X), H_(μ,ν](X), H_[μ,ν)(X) and H_(μ,ν)(X), are defined as follows.

(1) H_[μ,ν](X) = (h'_i)_{n×1}, where h'_i = 1 if μ ≤ h_i ≤ ν, and h'_i = 0 otherwise; (14)
(2) H_(μ,ν](X) = (h'_i)_{n×1}, where h'_i = 1 if μ < h_i ≤ ν, and h'_i = 0 otherwise; (15)
(3) H_[μ,ν)(X) = (h'_i)_{n×1}, where h'_i = 1 if μ ≤ h_i < ν, and h'_i = 0 otherwise; (16)
(4) H_(μ,ν)(X) = (h'_i)_{n×1}, where h'_i = 1 if μ < h_i < ν, and h'_i = 0 otherwise. (17)

Remark 1. These four cut matrices are Boolean matrices.

Lemma 2. Given any subset X ⊆ U in a set-valued information system IS = (U, A, V, f), where U = {x1, x2, ..., xn}, let B ⊆ A, let T_B be a tolerance relation on U, and let H(X) = (h1, h2, ..., hn)^T be the basic vector. Then the lower and upper approximations of X in the set-valued information system can be computed from the cut matrices of H(X) as follows.

(1) The n-column Boolean vector G(\underline{T}_B(X)) of the lower approximation: G(\underline{T}_B(X)) = H_[1,1](X). (18)
(2) The n-column Boolean vector G(\overline{T}_B(X)) of the upper approximation: G(\overline{T}_B(X)) = H_(0,1](X). (19)

Proof. Suppose G(X) = (g1, g2, ..., gn)^T, G(\underline{T}_B(X)) = (g'_1, g'_2, ..., g'_n)^T and H_[1,1](X) = (h'_1, h'_2, ..., h'_n)^T.
(1) "≼": For i ∈ {1, 2, ..., n}, if g'_i = 1, namely x_i ∈ \underline{T}_B(X), then T_B(x_i) ⊆ X. Therefore, for every x_j ∈ T_B(x_i) we have (x_j, x_i) ∈ T_B and x_j ∈ X, which means that m_ij = 1 implies g_j = 1, that is, m_ij = m_ij g_j. According to Corollary 4 and Definition 10, h_i = Σ_{j=1}^n m_ij g_j / Σ_{j=1}^n m_ij = 1 and h'_i = 1. Hence, for all i ∈ {1, 2, ..., n}, g'_i ≤ h'_i, that is, G(\underline{T}_B(X)) ≼ H_[1,1](X).
"≽": For i ∈ {1, 2, ..., n}, if h'_i = 1, then h_i = Σ_{j=1}^n m_ij g_j / Σ_{j=1}^n m_ij = 1. Obviously, for j ∈ {1, 2, ..., n}, if m_ij = 1 then g_j = 1; in other words, for every x_j ∈ T_B(x_i) we have x_j ∈ X. Then T_B(x_i) ⊆ X, namely x_i ∈ \underline{T}_B(X) and g'_i = 1. Thus, for all i ∈ {1, 2, ..., n}, g'_i ≥ h'_i, and therefore G(\underline{T}_B(X)) ≽ H_[1,1](X).
Thus G(\underline{T}_B(X)) = H_[1,1](X).
(2) The proof is similar to that of (1).
Corollary 5. The positive region POS_{T_B}(X), the boundary region BND_{T_B}(X) and the negative region NEG_{T_B}(X) can also be generated from the cut matrices of H(X), respectively, as follows.

(1) The n-column Boolean vector G(POS_{T_B}(X)) of the positive region: G(POS_{T_B}(X)) = H_[1,1](X). (20)
(2) The n-column Boolean vector G(BND_{T_B}(X)) of the boundary region: G(BND_{T_B}(X)) = H_(0,1)(X). (21)

(3) The n-column Boolean vector G(NEG_{T_B}(X)) of the negative region: G(NEG_{T_B}(X)) = H_[0,0](X). (22)

Proof. The proof is similar to that of Lemma 2.

Example 6 (Continuation of Examples 4 and 5). X and M^{T_B}_{6×6} have been constructed in Examples 3 and 4, respectively. Then H(X) = Λ^{T_B}_{6×6} • (M^{T_B}_{6×6} • G(X)) = diag(1/3, 1/6, 1/4, 1/4, 1/5, 1/4) • (3, 5, 3, 4, 4, 3)^T = (1, 5/6, 3/4, 1, 4/5, 3/4)^T. We obtain the n-column Boolean vectors of the approximations and regions from H(X) as follows: G(\underline{T}_B(X)) = H_[1,1](X) = (1, 0, 0, 1, 0, 0)^T; G(\overline{T}_B(X)) = H_(0,1](X) = (1, 1, 1, 1, 1, 1)^T; G(POS_{T_B}(X)) = H_[1,1](X) = (1, 0, 0, 1, 0, 0)^T; G(BND_{T_B}(X)) = H_(0,1)(X) = (0, 1, 1, 0, 1, 1)^T; G(NEG_{T_B}(X)) = H_[0,0](X) = (0, 0, 0, 0, 0, 0)^T. In other words, \underline{T}_B(X) = {x1, x4}, \overline{T}_B(X) = {x1, x2, x3, x4, x5, x6}, POS_{T_B}(X) = {x1, x4}, BND_{T_B}(X) = {x2, x3, x5, x6} and NEG_{T_B}(X) = Ø, respectively.

Corollary 6. The (α, β)-probabilistic lower and upper approximations and the (α, β)-probabilistic positive, boundary and negative regions are generated from the cut matrices of H(X) as follows.

(1) The n-column Boolean vector of the lower approximation: G(\underline{T}_B^{(α,β)}(X)) = H_[α,1](X); (23)
(2) The n-column Boolean vector of the upper approximation: G(\overline{T}_B^{(α,β)}(X)) = H_(β,1](X); (24)
(3) The n-column Boolean vector of the positive region: G(POS^{(α,β)}(X)) = H_[α,1](X); (25)
(4) The n-column Boolean vector of the boundary region: G(BND^{(α,β)}(X)) = H_(β,α)(X); (26)
(5) The n-column Boolean vector of the negative region: G(NEG^{(α,β)}(X)) = H_[0,β](X). (27)

Proof. The proof is similar to that of Lemma 2.

4. Updating approximations in set-valued information systems incrementally based on the matrix

In this section, we consider the problem of updating the approximations of a subset X of U under variation of the attribute set in set-valued information systems.
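Lemma 2 and Corollaries 5–6 say that every approximation and region is just an interval cut of H(X). The sketch below makes the cut operation a single helper with open/closed bounds; the α, β values in the last line are illustrative, not from the paper.

```python
from fractions import Fraction

# Cut vectors of the basic vector H(X) (Definition 10, Lemma 2, Corollaries
# 5-6). H is taken from Example 6.
H = [Fraction(1), Fraction(5, 6), Fraction(3, 4), Fraction(1),
     Fraction(4, 5), Fraction(3, 4)]

def cut(H, lo, hi, lo_open=False, hi_open=False):
    """Boolean cut vector H_[lo,hi](X); open flags give H_(lo,hi] etc."""
    def inside(h):
        left = h > lo if lo_open else h >= lo
        right = h < hi if hi_open else h <= hi
        return left and right
    return [1 if inside(h) else 0 for h in H]

lower = cut(H, 1, 1)                               # H_[1,1](X): lower / POS
upper = cut(H, 0, 1, lo_open=True)                 # H_(0,1](X): upper
bnd = cut(H, 0, 1, lo_open=True, hi_open=True)     # H_(0,1)(X): boundary
neg = cut(H, 0, 0)                                 # H_[0,0](X): negative
print(lower, upper, bnd, neg)
# [1,0,0,1,0,0] [1,1,1,1,1,1] [0,1,1,0,1,1] [0,0,0,0,0,0], as in Example 6

# (α, β)-probabilistic lower approximation (Corollary 6), e.g. α = 9/10:
print(cut(H, Fraction(9, 10), 1))                  # H_[α,1](X)
```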
The variation of the attribute set, also called attribute generalization [19], covers two situations: adding attributes and removing attributes. The lower approximation, the upper approximation and the positive, boundary and negative regions can all be induced directly from the basic vector H(X), and the key step in computing H(X) is constructing the relation matrix. Thus, the running time is reduced if we can renew the relation

matrix with an incremental updating strategy rather than reconstructing it from scratch. In the following, we discuss how to update the relation matrix incrementally while the attribute set varies.

Lemma 3. Let P and Q be two attribute sets. Then M^{T_{P∪Q}}_{n×n} ≼ M^{T_P}_{n×n} and M^{T_{P∪Q}}_{n×n} ≼ M^{T_Q}_{n×n}.

4.1. Updating approximations incrementally when adding an attribute set

Corollary 7. Let P, Q ⊆ A and Q ∩ P = Ø. Suppose M^{T_{P∪Q}}_{n×n} = (m'_ij)_{n×n} is the relation matrix representing T_{P∪Q}, and M^{T_P}_{n×n} = (m_ij)_{n×n} is the relation matrix representing T_P. The relation matrix after adding Q to P can be updated as:

m'_ij = 0, if m_ij = 0;
m'_ij = 1, if m_ij = 1 and (x_i, x_j) ∈ T_Q; (28)
m'_ij = 0, if m_ij = 1 and (x_i, x_j) ∉ T_Q.

Remark 2. When adding Q to P, m'_ij = m_ij whenever m_ij = 0; only the entries with m_ij = 1 need to be checked.

Example 7 (Continuation of Example 4). Let P = {a1, a4}, Q = {a2, a3}. Here we update the matrix after adding Q to P. According to Definition 7, we have

M^{T_P}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[0 1 1 1 1 1]

According to Corollary 7, if m_ij = 0 then m'_ij = 0. Moreover, m'_ii = 1 and m'_ij = m'_ji by Corollary 1. Hence we only update the entries m_ij equal to 1 in M^{T_P}_{6×6}, for 1 ≤ i ≤ n, i < j ≤ n. We obtain

M^{T_{P∪Q}}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 0 1 1]
[1 1 0 1 1 0]
[0 1 1 1 1 1]
[0 1 1 0 1 1]

where the four 1s (at positions (3,4), (4,3), (4,6) and (6,4)) have changed to 0.

Corollary 8. Let P, Q ⊆ A, Q ∩ P = Ø and Λ^{T_P}_{n×n} = diag(1/λ1, 1/λ2, ..., 1/λn), where λ_i = Σ_{j=1}^n m_ij. Suppose Λ^{T_{P∪Q}}_{n×n} = diag(1/λ'_1, 1/λ'_2, ..., 1/λ'_n). Then λ'_i ≤ λ_i, 1 ≤ i ≤ n. Moreover, λ'_i = λ_i − Σ_{j=1}^n (m_ij ⊕ m'_ij), where ⊕ is the logical operation exclusive disjunction (exclusive or).

Proof. According to Remark 2, when adding Q to P, m'_ij = m_ij whenever m_ij = 0. Hence m'_ij either remains constant or changes from 1 to 0. (a) When m'_ij is constant, λ_i is unchanged; (b) when m'_ij changes from 1 to 0, λ_i decreases, because λ'_i = Σ_{j=1}^n m'_ij. Thus λ'_i = λ_i − Σ_{j=1}^n (m_ij ⊕ m'_ij).

Example 8 (Continuation of Example 7). Λ^{T_P}_{6×6} = diag(1/λ1, 1/λ2, ..., 1/λ6) = diag(1/3, 1/6, 1/5, 1/6, 1/5, 1/5). After adding Q to P, by Corollary 8 we have λ'_1 = λ1 = 3, λ'_2 = λ2 = 6, λ'_3 = λ3 − 1 = 4, λ'_4 = λ4 − 2 = 4, λ'_5 = λ5 = 5 and λ'_6 = λ6 − 1 = 4. Hence Λ^{T_{P∪Q}}_{6×6} = diag(1/3, 1/6, 1/4, 1/4, 1/5, 1/4).

Corollary 9.
Let P, Q ⊆ A, Q ∩ P = Ø and Ω^{T_P}_n = M^{T_P}_{n×n} • G(X) = (ω1, ω2, ..., ωn)^T, where ω_i = Σ_{j=1}^n m_ij g_j and G(X) = (g1, g2, ..., gn)^T. Suppose Ω^{T_{P∪Q}}_n = (ω'_1, ω'_2, ..., ω'_n)^T. Then ω'_i ≤ ω_i, 1 ≤ i ≤ n, and ω'_i = ω_i − Σ_{j=1}^n ((m_ij ⊕ m'_ij) g_j).
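The adding-attribute update of Corollaries 7–9 can be sketched as a single pass over the upper triangle of the relation matrix, decrementing λ and ω in place instead of recomputing them. Here the test (x_i, x_j) ∈ T_Q is stubbed with the matrix M_Q for Q = {a2, a3}; both matrices are the ones of Examples 7–9, so this is an illustrative replay of those examples, not the authors' implementation.

```python
# Incremental update when adding Q to P (Corollaries 7-9). M_P is the
# relation matrix for P = {a1, a4}; M_Q encodes membership in T_Q.
M_P = [[1, 1, 0, 1, 0, 0],
       [1, 1, 1, 1, 1, 1],
       [0, 1, 1, 1, 1, 1],
       [1, 1, 1, 1, 1, 1],
       [0, 1, 1, 1, 1, 1],
       [0, 1, 1, 1, 1, 1]]
M_Q = [[1, 1, 0, 1, 0, 0],
       [1, 1, 1, 1, 1, 1],
       [0, 1, 1, 0, 1, 1],
       [1, 1, 0, 1, 1, 0],
       [0, 1, 1, 1, 1, 1],
       [0, 1, 1, 0, 1, 1]]
GX = [1, 1, 1, 1, 1, 0]
n = 6
lam = [sum(row) for row in M_P]                                # (3,6,5,6,5,5)
omega = [sum(m * g for m, g in zip(row, GX)) for row in M_P]   # (3,5,4,5,4,4)

M = [row[:] for row in M_P]
for i in range(n):
    for j in range(i + 1, n):
        # Remark 2: entries equal to 0 stay 0; a 1 flips to 0 iff
        # (x_i, x_j) is not in T_Q
        if M[i][j] == 1 and M_Q[i][j] == 0:
            M[i][j] = M[j][i] = 0
            lam[i] -= 1; lam[j] -= 1                 # Corollary 8
            omega[i] -= GX[j]; omega[j] -= GX[i]     # Corollary 9
print(lam)    # (3, 6, 4, 4, 5, 4): the diagonal of Example 8
print(omega)  # (3, 5, 3, 4, 4, 3): as in Example 9
```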

Proof. The proof is similar to that of Corollary 8.

Example 9 (Continuation of Example 7). Since G(X) = (1, 1, 1, 1, 1, 0)^T, we have Ω^{T_P}_6 = M^{T_P}_{6×6} • G(X) = (ω1, ω2, ..., ω6)^T = (3, 5, 4, 5, 4, 4)^T. When adding Q to P, according to Corollary 9, ω'_1 = ω1 = 3, ω'_2 = ω2 = 5, ω'_3 = ω3 − 1 = 3, ω'_4 = ω4 − 1 = 4, ω'_5 = ω5 = 4 and ω'_6 = ω6 − 1 = 3. Thus Ω^{T_{P∪Q}}_6 = (3, 5, 3, 4, 4, 3)^T.

In this subsection, we discussed how to update M^{T_P}_{n×n}, Λ^{T_P}_{n×n} and Ω^{T_P}_n when adding an attribute set Q to P. Then H(X) can be obtained by Corollary 4, and the approximations can be computed from the cut matrices of H(X).

4.2. Updating approximations incrementally when deleting an attribute set

Corollary 10. Let Q ⊂ P ⊆ A. Suppose M^{T_{P−Q}}_{n×n} = (m'_ij)_{n×n} is the relation matrix representing T_{P−Q}, and M^{T_P}_{n×n} = (m_ij)_{n×n} is the relation matrix representing T_P. The relation matrix after deleting Q from P can be updated as:

m'_ij = 1, if m_ij = 1;
m'_ij = 1, if m_ij = 0 and (x_i, x_j) ∈ T_{P−Q}; (29)
m'_ij = 0, if m_ij = 0 and (x_i, x_j) ∉ T_{P−Q}.

Remark 3. When deleting Q from P, m'_ij = m_ij whenever m_ij = 1; only the entries with m_ij = 0 need to be checked.

Example 10. We continue Example 4. Let P = {a1, a2, a3, a4}, Q = {a2, a3}. Here we update the matrix after deleting Q from P. By Example 4,

M^{T_P}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 0 1 1]
[1 1 0 1 1 0]
[0 1 1 1 1 1]
[0 1 1 0 1 1]

Then m'_ii = 1 and m'_ij = m'_ji by Corollary 1. Furthermore, according to Corollary 10, m'_ij = 1 whenever m_ij = 1. Therefore, we only update the entries m_ij equal to 0 in M^{T_P}_{6×6}, for 1 ≤ i ≤ n, i < j ≤ n. We obtain

M^{T_{P−Q}}_{6×6} =
[1 1 0 1 0 0]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[1 1 1 1 1 1]
[0 1 1 1 1 1]
[0 1 1 1 1 1]

where the four 0s (at positions (3,4), (4,3), (4,6) and (6,4)) have changed to 1.

Corollary 11. Let Q ⊂ P ⊆ A and Λ^{T_P}_{n×n} = diag(1/λ1, 1/λ2, ..., 1/λn), where λ_i = Σ_{j=1}^n m_ij. Suppose Λ^{T_{P−Q}}_{n×n} = diag(1/λ'_1, 1/λ'_2, ..., 1/λ'_n). Then λ'_i ≥ λ_i, 1 ≤ i ≤ n. Moreover, λ'_i = λ_i + Σ_{j=1}^n (m_ij ⊕ m'_ij).

Proof. The proof is similar to that of Corollary 8.

Example 11 (Continuation of Example 10). Λ^{T_P}_{6×6} = diag(1/λ1, 1/λ2, ..., 1/λ6) = diag(1/3, 1/6, 1/4, 1/4, 1/5, 1/4). After deleting Q from P, by Corollary 11 we have λ'_1 = λ1 = 3, λ'_2 = λ2 = 6, λ'_3 = λ3 + 1 = 5, λ'_4 = λ4 + 2 = 6, λ'_5 = λ5 = 5 and λ'_6 = λ6 + 1 = 5. Hence Λ^{T_{P−Q}}_{6×6} = diag(1/3, 1/6, 1/5, 1/6, 1/5, 1/5).
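The deletion update of Corollaries 10–11 (together with the analogous adjustment of Ω) is the mirror image of the adding case: only entries equal to 0 are re-checked, and λ and ω are incremented. As before, the membership test (x_i, x_j) ∈ T_{P−Q} is stubbed with the matrix M_PQ for P − Q = {a1, a4}; this is an illustrative replay of Examples 10–12, not the authors' implementation.

```python
# Incremental update when deleting Q from P (Corollaries 10-11 plus the
# matching ω adjustment). M_P is the relation matrix for P = {a1,a2,a3,a4};
# M_PQ encodes membership in T_{P-Q}.
M_P = [[1, 1, 0, 1, 0, 0],
       [1, 1, 1, 1, 1, 1],
       [0, 1, 1, 0, 1, 1],
       [1, 1, 0, 1, 1, 0],
       [0, 1, 1, 1, 1, 1],
       [0, 1, 1, 0, 1, 1]]
M_PQ = [[1, 1, 0, 1, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 1, 1, 1, 1, 1],
        [1, 1, 1, 1, 1, 1],
        [0, 1, 1, 1, 1, 1],
        [0, 1, 1, 1, 1, 1]]
GX = [1, 1, 1, 1, 1, 0]
n = 6
lam = [sum(row) for row in M_P]                                # (3,6,4,4,5,4)
omega = [sum(m * g for m, g in zip(row, GX)) for row in M_P]   # (3,5,3,4,4,3)

M = [row[:] for row in M_P]
for i in range(n):
    for j in range(i + 1, n):
        # Remark 3: entries equal to 1 stay 1; a 0 flips to 1 iff
        # (x_i, x_j) is in T_{P-Q}
        if M[i][j] == 0 and M_PQ[i][j] == 1:
            M[i][j] = M[j][i] = 1
            lam[i] += 1; lam[j] += 1                 # Corollary 11
            omega[i] += GX[j]; omega[j] += GX[i]     # ω adjustment
print(lam)    # (3, 6, 5, 6, 5, 5): the diagonal of Example 11
print(omega)  # (3, 5, 4, 5, 4, 4): as in Example 12
```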

Corollary 12. Let Q ⊂ P ⊆ A and Ω^{T_P}_n = M^{T_P}_{n×n} • G(X) = (ω1, ω2, ..., ωn)^T, where ω_i = Σ_{j=1}^n m_ij g_j and G(X) = (g1, g2, ..., gn)^T. Suppose Ω^{T_{P−Q}}_n = (ω'_1, ω'_2, ..., ω'_n)^T. Then ω'_i ≥ ω_i, 1 ≤ i ≤ n, and ω'_i = ω_i + Σ_{j=1}^n ((m_ij ⊕ m'_ij) g_j).

Proof. The proof is similar to that of Corollary 8.

Example 12 (Continuation of Example 10). Since G(X) = (1, 1, 1, 1, 1, 0)^T, we have Ω^{T_P}_6 = M^{T_P}_{6×6} • G(X) = (ω1, ω2, ..., ω6)^T = (3, 5, 3, 4, 4, 3)^T. When deleting Q from P, according to Corollary 12, ω'_1 = ω1 = 3, ω'_2 = ω2 = 5, ω'_3 = ω3 + 1 = 4, ω'_4 = ω4 + 1 = 5, ω'_5 = ω5 = 4 and ω'_6 = ω6 + 1 = 4. Thus Ω^{T_{P−Q}}_6 = (3, 5, 4, 5, 4, 4)^T.

In this subsection, we proposed methods for updating M^{T_P}_{n×n}, Λ^{T_P}_{n×n} and Ω^{T_P}_n when deleting an attribute set Q from P. Then H(X) can be obtained by Corollary 4, and the approximations can be computed from the cut matrices of H(X).

5. Development of static and dynamic algorithms for computing approximations based on the matrix

In this section, we design static and dynamic algorithms for computing approximations based on the matrix in the set-valued decision information system, corresponding to Sections 3 and 4, respectively.

5.1. Algorithm 1 (The static algorithm for computing approximations based on the matrix)

Algorithm 1: The static algorithm for computing approximations of decision classes in the set-valued decision information system based on the matrix

Input: (1) A set-valued decision information system with a condition attribute set P and a decision attribute d; (2) the thresholds α and β.
Output: The lower and upper approximations of the decision classes in the set-valued decision information system.
1  begin
2    for i = 1 to n do   // construct the relation matrix M^{T_P}_{n×n} = (m_ij)_{n×n}
3      m_ii = 1;   // according to Corollary 1, m_ii = 1
4      for j = i + 1 to n do
5        if (x_i, x_j) ∈ T_P then m_ij = m_ji = 1;   // according to Definitions 1 and 7
6        else m_ij = m_ji = 0;
7      end
8    end
9    for i = 1 to n do   // compute the induced diagonal matrix of M^{T_P}_{n×n}: Λ = diag(1/λ1, 1/λ2, ..., 1/λn)
10     λ_i = Σ_{j=1}^n m_ij;
11   end
12   From the set-valued decision information system,
13   for each class X_k in the decision attribute do   // compute the approximations of the decision classes
14     construct the n-column Boolean vector of X_k: G(X_k) = (g_k1, g_k2, ..., g_kn)^T;
15     for i = 1 to n do
16       ω_ki = Σ_{j=1}^n m_ij g_kj;   // compute the intermediate values Ω = (ω_ki)
17     end
18     let H(X_k) = (h_k1, h_k2, ..., h_kn)^T;
19     for i = 1 to n do   // compute H(X_k)
20       h_ki = ω_ki / λ_i;
21     end
22     compute the n-column Boolean vector of the lower approximation: G(\underline{T}_P^{(α,β)}(X_k)) = H_[α,1](X_k);
23     compute the n-column Boolean vector of the upper approximation: G(\overline{T}_P^{(α,β)}(X_k)) = H_(β,1](X_k);
24     output the approximations \underline{T}_P^{(α,β)}(X_k) and \overline{T}_P^{(α,β)}(X_k);
25   end
26 end

Algorithm 1 is a static (non-incremental) algorithm for computing the approximations of the decision classes based on the matrix while the set-valued decision information system is constant. Steps 2-8 construct the relation matrix M^{T_P}_{n×n} by Definition 7 and Corollary 1. According to Definition 8, steps 9-11 compute the induced diagonal matrix. Step 14 constructs the n-column Boolean vector of a concept X_k. Steps 15-17 compute the intermediate

values. Steps 18-21 compute the basic vector H(X_k) by Definition 9. Steps 22-24 compute the approximations by Corollary 6.

5.2. Algorithm DAUAM-A (The dynamic algorithm for updating approximations based on the matrix when adding an attribute set)

Algorithm 2: The dynamic algorithm for updating approximations based on the matrix when adding an attribute set (DAUAM-A)

Input: (1) A set-valued decision information system with a condition attribute set P and a decision attribute d; (2) the thresholds α and β; (3) the intermediate results M = (m_ij)_{n×n} (relation matrix), Λ = diag(1/λ_i) (induced matrix), Ω = (ω_ki) and X_k = (g_ki)^T (n-column Boolean vector of X_k), which have been computed by Algorithm 1; (4) an attribute set Q to be added, with Q ∩ P = Ø.
Output: The updated lower and upper approximations of the decision classes in the set-valued decision information system.

1  begin
2    P ← P ∪ Q;   // the attribute set Q is added to the attribute set P
3    for i = 1 to n do
4      m_ii = 1;   // according to Corollary 1, m_ii = 1 is constant
5      for j = i + 1 to n do
6        if m_ij == 0 then
7          m_ij = m_ji = 0;   // according to Corollary 7, when adding Q to P, m_ij is constant while m_ij = 0
8        else
9          if (x_i, x_j) ∉ T_Q then
10           m_ij = m_ji = 0;   // update the relation matrix M = (m_ij)_{n×n}
11           λ_i = λ_i − 1 and λ_j = λ_j − 1;   // update the induced matrix Λ = diag(1/λ_i)
12           for each k do   // update the intermediate results Ω = (ω_ki)
13             if g_kj == 1 then ω_ki = ω_ki − 1;
14             if g_ki == 1 then ω_kj = ω_kj − 1;
15           end
16         else
17           m_ij = m_ji = 1;   // according to Corollary 7
18         end
19       end
20     end
21   end
22   From the set-valued decision information system,
23   for each class X_k in the decision attribute do   // compute the approximations of the decision classes
24     let H(X_k) = (h_k1, h_k2, ..., h_kn)^T;
25     for i = 1 to n do   // compute H(X_k)
26       h_ki = ω_ki / λ_i;
27     end
28     compute the n-column Boolean vector of the lower approximation: G(\underline{T}_P^{(α,β)}(X_k)) = H_[α,1](X_k);
29     compute the n-column Boolean vector of the upper
approximation: G(\overline{T}_P^{(α,β)}(X_k)) = H_(β,1](X_k);
30   output the approximations \underline{T}_P^{(α,β)}(X_k) and \overline{T}_P^{(α,β)}(X_k);
31   end
32 end

Algorithm DAUAM-A is a dynamic (incremental) algorithm for updating the approximations when an attribute set is added. Step 2 updates the condition attribute set P. Steps 3-21 update the intermediate results M = (m_ij)_{n×n} (relation matrix), Λ = diag(1/λ_i) (induced matrix) and Ω = (ω_ki) when attributes are added; these are the key steps of DAUAM-A. Specifically, step 10 updates the relation matrix, step 11 updates the induced matrix and steps 12-15 update the intermediate results, according to Corollaries 7, 8 and 9, respectively. Steps 22-31 compute the basic vectors of the decision classes and the approximations by Corollary 6.

5.3. Algorithm DAUAM-D (The dynamic algorithm for updating approximations based on the matrix when deleting an attribute set)

Algorithm DAUAM-D is a dynamic (incremental) algorithm for updating the approximations when an attribute set is deleted. Step 2 updates the condition attribute set P. Steps 3-21 update the intermediate results M = (m_ij)_{n×n} (relation matrix), Λ = diag(1/λ_i) (induced matrix) and Ω = (ω_ki) when attributes are deleted; these are the key steps of DAUAM-D. Specifically, step 10 updates the relation matrix, step 11 updates the induced matrix and

Algorithm 3: The dynamic algorithm for updating approximations based on the matrix when deleting an attribute set (DAUAM-D)

Input: (1) A set-valued decision information system with a condition attribute set P and a decision attribute d; (2) the thresholds α and β; (3) intermediate results computed by Algorithm 1: the relation matrix M = (m_ij)_{n×n}, the induced diagonal matrix diag(1/λ_i), the intermediate matrix (ω_ki), and the n-column Boolean vector (g_ki)^T of each decision class X_k; (4) an attribute set Q to be deleted, with Q ⊂ P.
Output: Updated lower and upper approximations of the decision classes in the set-valued decision information system.

 1  begin
 2    P ← P − Q;  // the attribute set Q is deleted from the attribute set P
 3    for i = 1 to n do
 4      m_ii = 1;  // according to Corollary 1, m_ii = 1 is constant
 5      for j = i + 1 to n do
 6        if m_ij == 1 then
 7          m_ij = m_ji = 1;  // according to Corollary 10, when deleting Q, m_ij is constant while m_ij = 1
 8        else
 9          if (x_i, x_j) ∈ T_{P−Q} then
10            m_ij = m_ji = 1;  // update the relation matrix M = (m_ij)_{n×n}
11            λ_i = λ_i + 1 and λ_j = λ_j + 1;  // update the induced matrix diag(1/λ_i)
12            for each k do
13              if g_kj == 1 then ω_ki = ω_ki + 1;  // update the intermediate matrix (ω_ki)
14              if g_ki == 1 then ω_kj = ω_kj + 1;
15            end
16          else
17            m_ij = m_ji = 0;  // according to Corollary 10
18          end
19        end
20      end
21    end
22    from the set-valued decision information system,
23    for each class X_k in the decision attribute do  // compute the approximations of the decision classes
24      let H(X_k) = (h_k1, h_k2, ..., h_kn)^T;
25      for i = 1 to n do  // compute H(X_k)
26        h_ki = ω_ki / λ_i;
27      end
28      compute the n-column Boolean vector of the lower approximation: G(T_P^(α,β)(X_k)) = H_[α,1](X_k);
29      compute the n-column Boolean vector of the upper approximation: G(T̄_P^(α,β)(X_k)) = H_(β,1](X_k);
30      output the approximations T_P^(α,β)(X_k) and T̄_P^(α,β)(X_k) by G(T_P^(α,β)(X_k)) and G(T̄_P^(α,β)(X_k)), respectively;
31    end
32  end

Steps 12-15 update the intermediate matrix (ω_ki); Steps 10, 11 and 12-15 follow Corollaries 10, 11 and 12, respectively. Steps 22-31 compute the basic vectors of the decision classes and the approximations by the corresponding corollary.

6. Experimental evaluations

To test the performance of Algorithms 1, DAUAM-A and DAUAM-D, we downloaded five data sets from the machine learning repository of the University of California at Irvine [34], all of which are symbolic data with missing values. Each missing value is represented by the set of all possible values of the corresponding attribute, so the incomplete information system is regarded as a special case of the set-valued information system [15]. Besides, a set-valued data generator was developed and four different set-valued data sets were generated with it. All these data sets are outlined in Table 2.

In what follows, we compare the computational times of the non-incremental and incremental algorithms. To do so, we divide each of the nine data sets into five parts of equal size. The first part is regarded as the 1st data set, the combination of the 1st data set and the second part is viewed as the 2nd data set, the combination of the 2nd data set and the third part is regarded as the 3rd data set, ..., and the combination of all five parts is viewed as the 5th data set. These data sets are used to measure the computational times of the non-incremental and incremental algorithms against the size of the universe. In addition, since Algorithms 1, DAUAM-A and DAUAM-D compute the (α, β)-probabilistic approximations directly from H(X) with the two thresholds α and β, different thresholds cost the same operations. Thus, we test the performance of these three algorithms with the constant thresholds α = 1 and β = 0. The computations are conducted on a PC with Windows 7, an Intel(R) Core(TM)2 CPU P8400 and 2 GB memory. The algorithms are coded in C++ in Microsoft Visual Studio 2008.
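The update loop shared by DAUAM-A and DAUAM-D (with the signs of the increments flipped) can be sketched in Python as follows. This is an illustrative sketch, not the authors' C++ implementation: the data layout and the tolerance test `related_under(i, j)` (true iff x_i and x_j are tolerant with respect to the added attribute set Q alone) are assumptions introduced here.

```python
# Incremental update of the relation matrix M, the row sums lambda_,
# and the per-class counts omega when an attribute set Q is ADDED
# (a DAUAM-A-style sketch; deleting attributes is symmetric with += 1).
# Assumed layout (illustrative):
#   M[i][j] in {0, 1}: tolerance relation matrix, M[i][i] == 1;
#   lambda_[i]: number of 1s in row i of M (so the induced matrix is diag(1/lambda_[i]));
#   classes[k][i] in {0, 1}: Boolean vector g_ki of decision class X_k;
#   omega[k][i]: number of objects tolerant with x_i that belong to X_k.

def update_on_add(M, lambda_, omega, classes, related_under):
    n = len(M)
    for i in range(n):
        for j in range(i + 1, n):
            # m_ij == 0 stays 0 when attributes are added; only 1-entries
            # need to be re-tested against the new attributes Q.
            if M[i][j] == 1 and not related_under(i, j):
                M[i][j] = M[j][i] = 0                 # update relation matrix
                lambda_[i] -= 1                       # update induced matrix
                lambda_[j] -= 1
                for k in range(len(classes)):         # update class counts
                    if classes[k][j] == 1:
                        omega[k][i] -= 1
                    if classes[k][i] == 1:
                        omega[k][j] -= 1

def approximations(lambda_, omega, alpha, beta):
    # h_ki = omega_ki / lambda_i; lower cut H_[alpha,1], upper cut H_(beta,1].
    n = len(lambda_)
    lower, upper = [], []
    for k in range(len(omega)):
        h = [omega[k][i] / lambda_[i] for i in range(n)]
        lower.append([1 if alpha <= v <= 1 else 0 for v in h])
        upper.append([1 if beta < v <= 1 else 0 for v in h])
    return lower, upper
```

Because only pairs with m_ij = 1 are re-tested and the counts λ_i and ω_ki are adjusted in place, the approximations can be re-derived from H(X_k) without rebuilding the relation matrix from scratch, which is where the incremental algorithms save time over the non-incremental one.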

Table 2
A description of data sets.

   #  Data sets                     Abbreviation  Samples  Attributes  Classes  Source
   1  Congressional voting records  CVR           435      16          2        UCI
   2  Mushroom                      Mushroom      8124     22          2        UCI
   3  Soybean-large                 Soybean       307      35          19       UCI
   4  Audiology (Standardized)      Audiology     226      69          24       UCI
   5  Dermatology                   Dermatology   366      34          6        UCI
   6  Set-valued data one           SVD1                   5                    Data generator
   7  Set-valued data two           SVD2                                        Data generator
   8  Set-valued data three         SVD3                                        Data generator
   9  Set-valued data four          SVD4                   8                    Data generator

Table 3
A description of the added attribute sets.

   #  Data sets    Attributes  Original attribute set                  Added attribute set
   1  CVR          16          {a1, a2, ..., a20, a31, ...}
   1  CVR          16          {a7, a8, ..., a16}                      {a1, a2, ..., a6}
   2  Mushroom     22          {a1, a2, ..., a15}                      {a16, a17, ..., a22}
   3  Soybean      35          {a1, a2, ..., a30}                      {a31, a32, ..., a35}
   4  Audiology    69          {a1, a2, ..., a20, a31, a32, ..., a69}  {a21, a22, ..., a30}
   5  Dermatology  34          {a1, a2, ..., a27}                      {a28, a29, ..., a34}
   6  SVD1         5           {a1, a2, a3}                            {a4, a5}
   7  SVD2                     {a1, a2, ..., a10}                      {a11, a12, ...}
   8  SVD3                     {a1, a2, a3, a7, ...}                   {a4, a5, a6}
   9  SVD4         8           {a1, a3, a4, a6, a7}                    {a2, a5, a8}

[Fig. 1. A comparison of Algorithm 1 and DAUAM-A versus the size of data when adding an attribute set. Sub-figures (a)-(i): CVR, Mushroom, Soybean, Audiology, Dermatology, SVD1, SVD2, SVD3, SVD4.]

Table 4
A description of the deleted attribute sets.

   #  Data sets    Attributes  Original attribute set  Deleted attribute set
   1  CVR          16          {a1, a2, ..., a16}      {a1, a2, ..., a6}
   2  Mushroom     22          {a1, a2, ..., a22}      {a16, a17, ..., a22}
   3  Soybean      35          {a1, a2, ..., a35}      {a31, a32, ..., a35}
   4  Audiology    69          {a1, a2, ..., a69}      {a21, a22, ..., a30}
   5  Dermatology  34          {a1, a2, ..., a34}      {a28, a29, ..., a34}
   6  SVD1         5           {a1, a2, a3, a4, a5}    {a4, a5}
   7  SVD2                     {a1, a2, ...}           {a11, a12, ...}
   8  SVD3                     {a1, a2, ...}           {a4, a5, a6}
   9  SVD4         8           {a1, a2, ..., a8}       {a2, a5, a8}

[Fig. 2. A comparison of Algorithm 1 and DAUAM-D versus the size of data when deleting an attribute set. Sub-figures (a)-(i): CVR, Mushroom, Soybean, Audiology, Dermatology, SVD1, SVD2, SVD3, SVD4.]

6.1. Algorithm 1 and DAUAM-A

We compare Algorithm 1 with DAUAM-A on the nine data sets shown in Table 2 when an attribute set is added to the original attribute set. The original and added attribute sets, together with the experimental results for these nine data sets, are shown in Table 3 and Fig. 1, respectively. Fig. 1 displays the detailed trend line of each of the two algorithms as the size of the data set increases. In each of the sub-figures (a)-(i), the x-coordinate is the size of the data set (the five data sets starting from the smallest one), while the y-coordinate is the computational time. As Fig. 1 shows, the computational time of both algorithms increases with the size of the data, and Algorithm DAUAM-A is consistently faster than Algorithm 1 at updating approximations

when the same attribute set is added to the original attribute set in the set-valued decision information system. Moreover, the differences become larger and larger as the size of the data set increases.

6.2. Algorithm 1 and DAUAM-D

We compare Algorithm 1 with DAUAM-D on the nine data sets shown in Table 2 when deleting an attribute set. The original and deleted attribute sets, together with the experimental results for these nine data sets, are shown in Table 4 and Fig. 2, respectively. From Fig. 2, the computational time of both algorithms increases with the size of the data. In each sub-figure of Fig. 2, Algorithm DAUAM-D is faster than Algorithm 1 at updating approximations when the same attribute set is deleted from the original attribute set in the set-valued decision information system. Furthermore, the differences become larger and larger as the size of the data set increases.

7. Conclusions

In this paper, we defined a basic vector H(X) and four cut matrices of H(X), which were used to derive the approximations and the positive, boundary and negative regions intuitively. They can also be used to derive the (α, β)-probabilistic approximations and the corresponding positive, boundary and negative regions directly. The key to generating H(X) is the construction of the relation matrix, and we presented a basic method for constructing it. Since the system varies over time, the attribute set may vary simultaneously. We therefore developed an incremental approach for updating the relation matrix by investigating how it varies when the attribute set varies, and we proposed the corresponding algorithms for updating approximations. Experimental studies on different UCI and user-defined data sets showed that the proposed algorithms effectively reduce the computational time of updating approximations.
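The four cut vectors of H(X) can be illustrated with a small sketch. The interval conventions are read off the notation H_[μ,ν], H_(μ,ν], H_[μ,ν) and H_(μ,ν); mapping the boundary region to the cut H_(β,α) and the negative region to H_[0,β] follows the usual decision-theoretic rough set reading and is an assumption here, not a quotation from the paper.

```python
def cut(h, lo, hi, left_closed, right_closed):
    # Boolean cut vector of the basic vector h = (h_1, ..., h_n):
    # entry i is 1 iff h_i lies in the chosen interval between lo and hi.
    def inside(v):
        left = v >= lo if left_closed else v > lo
        right = v <= hi if right_closed else v < hi
        return left and right
    return [1 if inside(v) else 0 for v in h]

h = [1.0, 0.7, 0.3, 0.0]          # an example basic vector H(X)
alpha, beta = 0.7, 0.3
lower = cut(h, alpha, 1.0, True, True)        # H_[alpha,1]: lower approximation / positive region
upper = cut(h, beta, 1.0, False, True)        # H_(beta,1]: upper approximation
boundary = cut(h, beta, alpha, False, False)  # H_(beta,alpha): boundary region (assumed mapping)
negative = cut(h, 0.0, beta, True, True)      # H_[0,beta]: negative region (assumed mapping)
```

With α = 1 and β = 0, as used in the experiments, the cuts reduce to the classical lower approximation (h_i = 1) and upper approximation (h_i > 0).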
Our future work will focus on the time complexity of the proposed matrix-based approach, and on extending the method to incomplete information systems.

Acknowledgments

This work is supported by the National Science Foundation of China (Nos. , and 67), the Youth Social Science Foundation of the Chinese Education Commission (No. YJC6327), the Fundamental Research Funds for the Central Universities (No. SWJTUZT8), the Doctoral Innovation Foundation of Southwest Jiaotong University (No. 22ZJB) and the Young Software Innovation Foundation of Sichuan Province (No. 2-7), China. The authors also thank Xiaoguang Gu for preparing the manuscript.

References

[1] Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers, Dordrecht, 1991.
[2] Z. Pawlak, A. Skowron, Rudiments of rough sets, Information Sciences 177 (1) (2007).
[3] Z. Pawlak, A. Skowron, Rough sets and Boolean reasoning, Information Sciences 177 (1) (2007).
[4] Z. Pawlak, A. Skowron, Rough sets: some extensions, Information Sciences 177 (1) (2007).
[5] N. Shan, W. Ziarko, Data-based acquisition and incremental modification of classification rules, Computational Intelligence 11 (2) (1995).
[6] D. Liu, T. Li, D. Ruan, W. Zou, An incremental approach for inducing knowledge from dynamic information systems, Fundamenta Informaticae 94 (2) (2009).
[7] D. Liu, T. Li, D. Ruan, J. Zhang, Incremental learning optimization on knowledge discovery in dynamic business intelligent systems, Journal of Global Optimization 51 (2011).
[8] Q. Hu, D. Yu, J. Liu, C. Wu, Neighborhood rough set based heterogeneous feature subset selection, Information Sciences 178 (18) (2008).
[9] Y. Qian, J. Liang, W. Pedrycz, C. Dang, Positive approximation: an accelerator for attribute reduction in rough set theory, Artificial Intelligence 174 (9-10) (2010).
[10] T. Yang, Q. Li, Reduction about approximation spaces of covering generalized rough sets, International Journal of Approximate Reasoning 51 (3) (2010).
[11] Y. Shi, L. Yao, J. Xu, A probability maximization model based on rough approximation and its application to the inventory problem, International Journal of Approximate Reasoning 52 (2) (2011).
[12] J. Qian, D. Miao, Z. Zhang, W. Li, Hybrid approaches to attribute reduction based on indiscernibility and discernibility relation, International Journal of Approximate Reasoning 52 (2) (2011).
[13] W. Zhang, J. Ma, S. Fan, Variable threshold concept lattices, Information Sciences 177 (22) (2007).
[14] Y. Guan, H. Wang, Set-valued information systems, Information Sciences 176 (17) (2006).
[15] Y. Qian, C. Dang, J. Liang, D. Tang, Set-valued ordered information systems, Information Sciences 179 (16) (2009).
[16] M. Kryszkiewicz, Rough set approach to incomplete information systems, Information Sciences 112 (1-4) (1998).
[17] M. Kryszkiewicz, Rules in incomplete information systems, Information Sciences 113 (3-4) (1999).
[18] R. Michalski, Knowledge repair mechanisms: evolution vs. revolution, in: Proceedings of ICML'85, 1985.
[19] C. Chan, A rough set approach to attribute generalization in data mining, Information Sciences 107 (1-4) (1998).
[20] T. Li, D. Ruan, W. Geert, J. Song, Y. Xu, A rough sets based characteristic relation approach for dynamic attribute generalization in data mining, Knowledge-Based Systems 20 (5) (2007).
[21] Y. Cheng, The incremental method for fast computing the rough fuzzy approximations, Data & Knowledge Engineering 70 (1) (2011).
[22] H. Chen, T. Li, S. Qiao, D. Ruan, A rough set based dynamic maintenance approach for approximations in coarsening and refining attribute values, International Journal of Intelligent Systems 25 (2010).
[23] G. Liu, The axiomatization of the rough set upper approximation operations, Fundamenta Informaticae 69 (3) (2006).

[24] Y. Leung, W. Wu, W. Zhang, Knowledge acquisition in incomplete information systems: a rough set approach, European Journal of Operational Research 168 (1) (2006).
[25] E. Orlowska, Z. Pawlak, Representation of nondeterministic information, Theoretical Computer Science 29 (1-2) (1984).
[26] I. Duntsch, G. Gediga, E. Orlowska, Relational attribute systems, International Journal of Human-Computer Studies 55 (3) (2001).
[27] W. Lipski, On databases with incomplete information, Journal of the ACM 26 (1981) 41-70.
[28] E. Orlowska, Semantic analysis of inductive reasoning, Theoretical Computer Science 43 (1986).
[29] W. Zhang, J. Mi, Incomplete information system and its optimal selections, Computers & Mathematics with Applications 48 (5-6) (2004).
[30] Y.Y. Yao, Probabilistic rough set approximations, International Journal of Approximate Reasoning 49 (2) (2008).
[31] Y.Y. Yao, S.K.M. Wong, A decision theoretic framework for approximating concepts, International Journal of Man-Machine Studies 37 (6) (1992).
[32] G. Liu, Axiomatic systems for rough sets and fuzzy rough sets, International Journal of Approximate Reasoning 48 (3) (2008).
[33] B. Kolman, R.C. Busby, S.C. Ross, Discrete Mathematical Structures, fifth ed., Prentice-Hall, Upper Saddle River, NJ, USA, 2003.
[34] D. Newman, S. Hettich, C. Blake, C. Merz, UCI Repository of Machine Learning Databases, University of California, Department of Information and Computer Science, Irvine, CA. <http://www.ics.uci.edu/~mlearn/MLRepository.html>.


International Journal of Approximate Reasoning

International Journal of Approximate Reasoning International Journal of Approximate Reasoning 52 (2011) 231 239 Contents lists available at ScienceDirect International Journal of Approximate Reasoning journal homepage: www.elsevier.com/locate/ijar

More information

Ensembles of classifiers based on approximate reducts

Ensembles of classifiers based on approximate reducts Fundamenta Informaticae 34 (2014) 1 10 1 IOS Press Ensembles of classifiers based on approximate reducts Jakub Wróblewski Polish-Japanese Institute of Information Technology and Institute of Mathematics,

More information

The matrix approach for abstract argumentation frameworks

The matrix approach for abstract argumentation frameworks The matrix approach for abstract argumentation frameworks Claudette CAYROL, Yuming XU IRIT Report RR- -2015-01- -FR February 2015 Abstract The matrices and the operation of dual interchange are introduced

More information

Characterizing Pawlak s Approximation Operators

Characterizing Pawlak s Approximation Operators Characterizing Pawlak s Approximation Operators Victor W. Marek Department of Computer Science University of Kentucky Lexington, KY 40506-0046, USA To the memory of Zdzisław Pawlak, in recognition of his

More information

On Multi-Class Cost-Sensitive Learning

On Multi-Class Cost-Sensitive Learning On Multi-Class Cost-Sensitive Learning Zhi-Hua Zhou and Xu-Ying Liu National Laboratory for Novel Software Technology Nanjing University, Nanjing 210093, China {zhouzh, liuxy}@lamda.nju.edu.cn Abstract

More information

Foundations of Classification

Foundations of Classification Foundations of Classification J. T. Yao Y. Y. Yao and Y. Zhao Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 {jtyao, yyao, yanzhao}@cs.uregina.ca Summary. Classification

More information

FUZZY ASSOCIATION RULES: A TWO-SIDED APPROACH

FUZZY ASSOCIATION RULES: A TWO-SIDED APPROACH FUZZY ASSOCIATION RULES: A TWO-SIDED APPROACH M. De Cock C. Cornelis E. E. Kerre Dept. of Applied Mathematics and Computer Science Ghent University, Krijgslaan 281 (S9), B-9000 Gent, Belgium phone: +32

More information

Applied Mathematics Letters

Applied Mathematics Letters Applied Mathematics Letters 24 (2011) 797 802 Contents lists available at ScienceDirect Applied Mathematics Letters journal homepage: wwwelseviercom/locate/aml Model order determination using the Hankel

More information

An artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem.

An artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem. An artificial chemical reaction optimization algorithm for multiple-choice knapsack problem Tung Khac Truong 1,2, Kenli Li 1, Yuming Xu 1, Aijia Ouyang 1, and Xiaoyong Tang 1 1 College of Information Science

More information

SOFTWARE ARCHITECTURE DESIGN OF GIS WEB SERVICE AGGREGATION BASED ON SERVICE GROUP

SOFTWARE ARCHITECTURE DESIGN OF GIS WEB SERVICE AGGREGATION BASED ON SERVICE GROUP SOFTWARE ARCHITECTURE DESIGN OF GIS WEB SERVICE AGGREGATION BASED ON SERVICE GROUP LIU Jian-chuan*, YANG Jun, TAN Ming-jian, GAN Quan Sichuan Geomatics Center, Chengdu 610041, China Keywords: GIS; Web;

More information

Discussion About Nonlinear Time Series Prediction Using Least Squares Support Vector Machine

Discussion About Nonlinear Time Series Prediction Using Least Squares Support Vector Machine Commun. Theor. Phys. (Beijing, China) 43 (2005) pp. 1056 1060 c International Academic Publishers Vol. 43, No. 6, June 15, 2005 Discussion About Nonlinear Time Series Prediction Using Least Squares Support

More information

An Approach to Classification Based on Fuzzy Association Rules

An Approach to Classification Based on Fuzzy Association Rules An Approach to Classification Based on Fuzzy Association Rules Zuoliang Chen, Guoqing Chen School of Economics and Management, Tsinghua University, Beijing 100084, P. R. China Abstract Classification based

More information

Two Semantic Issues in a Probabilistic Rough Set Model

Two Semantic Issues in a Probabilistic Rough Set Model Fundamenta Informaticae 108 (2011) 249 265 249 IOS Press Two Semantic Issues in a Probabilistic Rough Set Model Yiyu Yao Department of Computer Science University of Regina Regina, Canada yyao@cs.uregina.ca

More information

ROUGHNESS IN MODULES BY USING THE NOTION OF REFERENCE POINTS

ROUGHNESS IN MODULES BY USING THE NOTION OF REFERENCE POINTS Iranian Journal of Fuzzy Systems Vol. 10, No. 6, (2013) pp. 109-124 109 ROUGHNESS IN MODULES BY USING THE NOTION OF REFERENCE POINTS B. DAVVAZ AND A. MALEKZADEH Abstract. A module over a ring is a general

More information

Soft set theoretical approach to residuated lattices. 1. Introduction. Young Bae Jun and Xiaohong Zhang

Soft set theoretical approach to residuated lattices. 1. Introduction. Young Bae Jun and Xiaohong Zhang Quasigroups and Related Systems 24 2016, 231 246 Soft set theoretical approach to residuated lattices Young Bae Jun and Xiaohong Zhang Abstract. Molodtsov's soft set theory is applied to residuated lattices.

More information

Department of Computer Science, Guiyang University, Guiyang , GuiZhou, China

Department of Computer Science, Guiyang University, Guiyang , GuiZhou, China doi:10.21311/002.31.12.01 A Hybrid Recommendation Algorithm with LDA and SVD++ Considering the News Timeliness Junsong Luo 1*, Can Jiang 2, Peng Tian 2 and Wei Huang 2, 3 1 College of Information Science

More information

VPRSM BASED DECISION TREE CLASSIFIER

VPRSM BASED DECISION TREE CLASSIFIER Computing and Informatics, Vol. 26, 2007, 663 677 VPRSM BASED DECISION TREE CLASSIFIER Jin-Mao Wei, Ming-Yang Wang, Jun-Ping You Institute of Computational Intelligence Key Laboratory for Applied Statistics

More information

Parameters to find the cause of Global Terrorism using Rough Set Theory

Parameters to find the cause of Global Terrorism using Rough Set Theory Parameters to find the cause of Global Terrorism using Rough Set Theory Sujogya Mishra Research scholar Utkal University Bhubaneswar-751004, India Shakti Prasad Mohanty Department of Mathematics College

More information

A Geometric Theory of Feature Selection and Distance-Based Measures

A Geometric Theory of Feature Selection and Distance-Based Measures Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 25) A Geometric Theory of Feature Selection and Distance-Based Measures Kilho Shin and Adrian Pino Angulo

More information

High Frequency Rough Set Model based on Database Systems

High Frequency Rough Set Model based on Database Systems High Frequency Rough Set Model based on Database Systems Kartik Vaithyanathan kvaithya@gmail.com T.Y.Lin Department of Computer Science San Jose State University San Jose, CA 94403, USA tylin@cs.sjsu.edu

More information

doi: info:doi/ /idt

doi: info:doi/ /idt doi: info:doi/10.3233/idt-140226 Granules for Association Rules and Decision Support in the getrnia System Hiroshi Sakai, Mao Wu, Naoto Yamaguchi, and Michinori Nakata Graduate School of Engineering, Kyushu

More information

Model Complexity of Pseudo-independent Models

Model Complexity of Pseudo-independent Models Model Complexity of Pseudo-independent Models Jae-Hyuck Lee and Yang Xiang Department of Computing and Information Science University of Guelph, Guelph, Canada {jaehyuck, yxiang}@cis.uoguelph,ca Abstract

More information

Time-delay feedback control in a delayed dynamical chaos system and its applications

Time-delay feedback control in a delayed dynamical chaos system and its applications Time-delay feedback control in a delayed dynamical chaos system and its applications Ye Zhi-Yong( ), Yang Guang( ), and Deng Cun-Bing( ) School of Mathematics and Physics, Chongqing University of Technology,

More information

A Three-way Decision Making Approach to Malware Analysis

A Three-way Decision Making Approach to Malware Analysis A Three-way Decision Making Approach to Malware Analysis Mohammad Nauman a, Nouman Azam a and JingTao Yao b a National University of Computer and Emerging Sciences, Peshawar, Pakistan b Department of Computer

More information

Mining Positive and Negative Fuzzy Association Rules

Mining Positive and Negative Fuzzy Association Rules Mining Positive and Negative Fuzzy Association Rules Peng Yan 1, Guoqing Chen 1, Chris Cornelis 2, Martine De Cock 2, and Etienne Kerre 2 1 School of Economics and Management, Tsinghua University, Beijing

More information

Application of Rough Set Theory in Performance Analysis

Application of Rough Set Theory in Performance Analysis Australian Journal of Basic and Applied Sciences, 6(): 158-16, 1 SSN 1991-818 Application of Rough Set Theory in erformance Analysis 1 Mahnaz Mirbolouki, Mohammad Hassan Behzadi, 1 Leila Karamali 1 Department

More information

Stability and hybrid synchronization of a time-delay financial hyperchaotic system

Stability and hybrid synchronization of a time-delay financial hyperchaotic system ISSN 76-7659 England UK Journal of Information and Computing Science Vol. No. 5 pp. 89-98 Stability and hybrid synchronization of a time-delay financial hyperchaotic system Lingling Zhang Guoliang Cai

More information

Andrzej Skowron, Zbigniew Suraj (Eds.) To the Memory of Professor Zdzisław Pawlak

Andrzej Skowron, Zbigniew Suraj (Eds.) To the Memory of Professor Zdzisław Pawlak Andrzej Skowron, Zbigniew Suraj (Eds.) ROUGH SETS AND INTELLIGENT SYSTEMS To the Memory of Professor Zdzisław Pawlak Vol. 1 SPIN Springer s internal project number, if known Springer Berlin Heidelberg

More information

Rough Sets for Uncertainty Reasoning

Rough Sets for Uncertainty Reasoning Rough Sets for Uncertainty Reasoning S.K.M. Wong 1 and C.J. Butz 2 1 Department of Computer Science, University of Regina, Regina, Canada, S4S 0A2, wong@cs.uregina.ca 2 School of Information Technology

More information

Partial job order for solving the two-machine flow-shop minimum-length problem with uncertain processing times

Partial job order for solving the two-machine flow-shop minimum-length problem with uncertain processing times Preprints of the 13th IFAC Symposium on Information Control Problems in Manufacturing, Moscow, Russia, June 3-5, 2009 Fr-A2.3 Partial job order for solving the two-machine flow-shop minimum-length problem

More information

Group Decision-Making with Incomplete Fuzzy Linguistic Preference Relations

Group Decision-Making with Incomplete Fuzzy Linguistic Preference Relations Group Decision-Making with Incomplete Fuzzy Linguistic Preference Relations S. Alonso Department of Software Engineering University of Granada, 18071, Granada, Spain; salonso@decsai.ugr.es, F.J. Cabrerizo

More information

Some remarks on conflict analysis

Some remarks on conflict analysis European Journal of Operational Research 166 (2005) 649 654 www.elsevier.com/locate/dsw Some remarks on conflict analysis Zdzisław Pawlak Warsaw School of Information Technology, ul. Newelska 6, 01 447

More information

Another algorithm for nonnegative matrices

Another algorithm for nonnegative matrices Linear Algebra and its Applications 365 (2003) 3 12 www.elsevier.com/locate/laa Another algorithm for nonnegative matrices Manfred J. Bauch University of Bayreuth, Institute of Mathematics, D-95440 Bayreuth,

More information

Hamilton Index and Its Algorithm of Uncertain Graph

Hamilton Index and Its Algorithm of Uncertain Graph Hamilton Index and Its Algorithm of Uncertain Graph Bo Zhang 1 Jin Peng 1 School of Mathematics and Statistics Huazhong Normal University Hubei 430079 China Institute of Uncertain Systems Huanggang Normal

More information

Alternative Approach to Mining Association Rules

Alternative Approach to Mining Association Rules Alternative Approach to Mining Association Rules Jan Rauch 1, Milan Šimůnek 1 2 1 Faculty of Informatics and Statistics, University of Economics Prague, Czech Republic 2 Institute of Computer Sciences,

More information

A PRIMER ON ROUGH SETS:

A PRIMER ON ROUGH SETS: A PRIMER ON ROUGH SETS: A NEW APPROACH TO DRAWING CONCLUSIONS FROM DATA Zdzisław Pawlak ABSTRACT Rough set theory is a new mathematical approach to vague and uncertain data analysis. This Article explains

More information

Probabilistic Bisimilarity as Testing Equivalence

Probabilistic Bisimilarity as Testing Equivalence Probabilistic Bisimilarity as Testing Equivalence Yuxin Deng a,, Yuan Feng b a Shanghai Key Laboratory of Trustworthy Computing, MOE International Joint Lab of Trustworthy Software, and International Research

More information

An Uncertain Control Model with Application to. Production-Inventory System

An Uncertain Control Model with Application to. Production-Inventory System An Uncertain Control Model with Application to Production-Inventory System Kai Yao 1, Zhongfeng Qin 2 1 Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China 2 School of Economics

More information

Decision Making with Uncertainty Information Based on Lattice-Valued Fuzzy Concept Lattice

Decision Making with Uncertainty Information Based on Lattice-Valued Fuzzy Concept Lattice Journal of Universal Computer Science vol. 16 no. 1 (010 159-177 submitted: 1//09 accepted: 15/10/09 appeared: 1/1/10 J.UCS Decision aking with Uncertainty nformation Based on Lattice-Valued Fuzzy Concept

More information

Soft Matrices. Sanjib Mondal, Madhumangal Pal

Soft Matrices. Sanjib Mondal, Madhumangal Pal Journal of Uncertain Systems Vol7, No4, pp254-264, 2013 Online at: wwwjusorguk Soft Matrices Sanjib Mondal, Madhumangal Pal Department of Applied Mathematics with Oceanology and Computer Programming Vidyasagar

More information

Unifying Version Space Representations: Part II

Unifying Version Space Representations: Part II Unifying Version Space Representations: Part II E.N. Smirnov, I.G. Sprinkhuizen-Kuyper, and H.J. van den Herik IKAT, Department of Computer Science, Maastricht University, P.O.Box 616, 6200 MD Maastricht,

More information

arxiv: v1 [cs.ai] 28 Oct 2013

arxiv: v1 [cs.ai] 28 Oct 2013 Ranking basic belief assignments in decision making under uncertain environment arxiv:30.7442v [cs.ai] 28 Oct 203 Yuxian Du a, Shiyu Chen a, Yong Hu b, Felix T.S. Chan c, Sankaran Mahadevan d, Yong Deng

More information

Decision Tree Learning

Decision Tree Learning Decision Tree Learning Berlin Chen Department of Computer Science & Information Engineering National Taiwan Normal University References: 1. Machine Learning, Chapter 3 2. Data Mining: Concepts, Models,

More information

IN THIS PAPER, we consider a class of continuous-time recurrent

IN THIS PAPER, we consider a class of continuous-time recurrent IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II: EXPRESS BRIEFS, VOL. 51, NO. 4, APRIL 2004 161 Global Output Convergence of a Class of Continuous-Time Recurrent Neural Networks With Time-Varying Thresholds

More information