Constructing Rough Set Based Unbalanced Binary Tree for Feature Selection

Chinese Journal of Electronics, Vol.23, No.3, July 2014

LU Zhengcai(1), QIN Zheng(1,2), JIN Qiao(1) and LI Shengnan(1)
(1. Department of Computer Science and Technology, Tsinghua University, Beijing, China)
(2. School of Software, Tsinghua University, Beijing, China)

Abstract — Feature selection is one of the challenging problems facing data analysis in areas such as pattern recognition, data mining, and decision support. Many rough set algorithms for feature selection have been developed, most of which depend essentially on the definite information contained within the lower approximation. This paper proposes a novel approach, called Unbalanced binary tree based feature selection (UBT-FS), which utilizes the indefinite information contained within the rough set boundary region for reduction. UBT-FS designs the underlying mechanism for obtaining the boundary region from the unbalanced binary tree and adopts the boundary region based significance for determining the optimal search path as well as the boundary region based evaluation criterion for identifying feature subsets. These allow UBT-FS to find an optimal or suboptimal reduct while achieving markedly better computational efficiency than other available algorithms, which is also supported by the experimental results.

Key words — Feature selection, Rough set theory, Boundary region, Unbalanced binary tree.

I. Introduction

Feature selection, also called attribute reduction, is a common problem in the fields of pattern recognition, data mining, and decision support [1,2]. Its task is to eliminate reducible or dispensable attributes so that the reduced set retains the same discernible ability as the unreduced set, considerably decreasing the time complexity of analysis algorithms and remarkably increasing the generalization capability of induced patterns.

Rough set theory (RST) [3] is a powerful tool for dealing with imprecision, uncertainty, and vagueness. Much use has been made of RST to perform feature selection, and many techniques have been developed. Generally, they can be divided into two main categories: discernibility matrix based solutions and search based solutions. The former obtains feature subsets by constructing a discernibility matrix from the considered dataset and simplifying the corresponding discernibility function [4,5]. It is intuitive, concise and sufficient to find the minimal reduct, but suffers from poor computational performance. The latter discovers a reduct by the use of search strategies, such as exhaustive search, incomplete search [6], random search [7], and heuristic search [8-10]. Since the heuristic search strategy can obtain optimal or suboptimal results with acceptable computational complexity, it plays an important role in the feature selection community.

Computational efficiency prompts research into heuristic search algorithms [11-17]. Chouchoulas and Shen [11] proposed the Quick reduct algorithm, which employs the positive region as the heuristic for reduction. Meng and Shi [12] put forth a faster version by using decomposition and sorting techniques to calculate the positive region. Taking the boundary region into account as auxiliary information, Parthaláin et al. [13] presented the distance metric assisted feature selection algorithm, which combines the lower approximation and the boundary region to create a new evaluation function. Qian et al. [14] took hybrid attribute measures for feature selection, which reflect the significance of an attribute in positive regions and boundary regions.

Different from the algorithms mentioned above, Qian and Liang [15] introduced the concept of combination entropy for describing the uncertainty of information systems and used its condition entropy to select a feature subset. Sun et al. [16] utilized rough entropy based uncertainty measures to evaluate the roughness and accuracy of knowledge, and then constructed a heuristic search algorithm with low computational complexity for feature selection from incomplete decision systems.

One can observe that most of the existing heuristic feature selection algorithms rely essentially upon the definite information contained within the lower approximation, and very little work has hitherto considered the indefinite information contained within the boundary region as the leading role for minimization. In this paper, we propose a novel solution, called Unbalanced binary tree based feature selection (UBT-FS), which captures the indefinite information contained within the boundary region to uncover the potential feature subsets. UBT-FS designs an efficient mechanism to construct a rough set based unbalanced binary tree, of which each inconsistent node corresponds to a certain boundary region. This mechanism, working in conjunction with the boundary region based significance and the boundary region based evaluation criterion, makes UBT-FS capable of identifying the significant features efficiently.

The remainder of this paper is organized as follows. In Section II, we review the theoretical background of RST. Section III explores the rough set based unbalanced binary tree. Based on it, UBT-FS is designed in Section IV. In Section V, experiments are carried out to validate the effectiveness of UBT-FS. Finally, we give a concise conclusion in Section VI.

(Manuscript Received Apr. 2013; Accepted June. This work is supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China.)

II. Background

Let $S = (U, C \cup D)$ be an incomplete decision system, where U is a non-empty finite object set (the universe), C is a condition attribute set, and D is a decision attribute set. For each $a \in C \cup D$, there is a mapping $f : U \to V_a$, where $V_a$ is the value domain of a. $V_C = \bigcup\{V_a \mid a \in C\} \cup \{*\}$ (where * stands for missing values) and $V_D = \bigcup\{V_d \mid d \in D\}$ are the value domains of C and D, respectively. For any subset $P \subseteq C$, there exists an associated tolerance relation SIM(P):

$SIM(P) = \{(u, v) \in U \times U : \forall a \in P,\ f(u, a) = f(v, a) \lor f(u, a) = * \lor f(v, a) = *\}$   (1)

The family of tolerance blocks generated by SIM(P) is denoted as $\pi_P$ and can be calculated by $\pi_P = \{X \mid X \in U/SIM(P)\}$, where X is a tolerance block depicting a collection of objects which are possibly indiscernible from each other with respect to P. If there does not exist another tolerance block Y such that $X \subset Y$, X is called a maximal tolerance block [5]. Generally, the tolerance class of an object u, denoted as $S_P(u)$, describes the maximal set of objects which are possibly indistinguishable from u with respect to P, namely, $S_P(u) = \{v \in U \mid (u, v) \in SIM(P)\}$. It can also be expressed with the maximal tolerance blocks such that $S_P(u) = \bigcup_{X \in \pi_{mP}(u)} X$ [5], where $\pi_{mP}(u)$ is the family of maximal tolerance blocks containing u. Furthermore, it is easy to prove that $S_P(u) = \bigcup_{X \in \pi_P(u)} X$, where $\pi_P(u)$ is the family of tolerance blocks containing u.

Consider a partition $\pi_D = \{D_i \mid i = 1, 2, \ldots, j\}$ of U determined by D. $D_i$ is called a decision class and can be approximated by a pair of precise concepts known as the lower and upper approximations:

$\underline{P}(D_i) = \{u \in U \mid S_P(u) \subseteq D_i\}$   (2)
$\overline{P}(D_i) = \{u \in U \mid S_P(u) \cap D_i \neq \emptyset\}$   (3)

$\underline{P}(D_i)$ is regarded as the maximal P-definable set contained in $D_i$, whereas $\overline{P}(D_i)$ is considered as the minimal P-definable set containing $D_i$. If $\underline{P}(D_i) = \overline{P}(D_i)$, $D_i$ is an exact set; otherwise it is a rough set. By the dual approximations, the universe of the decision system is partitioned into two mutually exclusive crisp regions, the positive region and the boundary region, defined respectively as

$POS_P(\pi_D) = \bigcup_{i=1}^{j} \underline{P}(D_i)$   (4)
$BND_P(\pi_D) = \bigcup_{i=1}^{j} (\overline{P}(D_i) - \underline{P}(D_i))$   (5)

It is apparent that $POS_P(\pi_D) \cup BND_P(\pi_D) = U$ and $POS_P(\pi_D) \cap BND_P(\pi_D) = \emptyset$. If $BND_P(\pi_D) = \emptyset$ or, equivalently, $POS_P(\pi_D) = U$, we say that S is consistent; otherwise it is inconsistent.
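
The definitions above translate directly into set operations. The following Python is a minimal illustrative sketch, not code from the paper: it computes tolerance classes, the positive region, and the boundary region of a small incomplete decision table. The list-of-dicts table layout, the '*' marker for missing values, and all function names are assumptions made here for illustration.

    # Sketch: tolerance classes (Eq. (1)) and the two regions (Eqs. (4)-(5))
    # of an incomplete decision table. Objects are row indices; '*' is a missing value.
    MISSING = '*'

    def tolerant(table, u, v, attrs):
        # (u, v) is in SIM(attrs) when every attribute agrees or one side is missing.
        return all(table[u][a] == table[v][a]
                   or table[u][a] == MISSING or table[v][a] == MISSING
                   for a in attrs)

    def tolerance_class(table, u, attrs):
        # S_P(u): every object tolerant with u under attrs.
        return {v for v in range(len(table)) if tolerant(table, u, v, attrs)}

    def regions(table, attrs, decision):
        # Returns (positive region, boundary region) of the partition induced by 'decision'.
        universe = set(range(len(table)))
        pos = {u for u in universe
               if len({table[v][decision] for v in tolerance_class(table, u, attrs)}) == 1}
        return pos, universe - pos

    if __name__ == '__main__':
        table = [{'c1': 0, 'c2': 1,       'd': 'yes'},
                 {'c1': 0, 'c2': MISSING, 'd': 'no'},
                 {'c1': 1, 'c2': 1,       'd': 'yes'},
                 {'c1': 1, 'c2': 0,       'd': 'no'}]
        print(regions(table, ['c1', 'c2'], 'd'))

On this toy table the first two objects tolerate each other (the missing value on c2 matches anything) yet carry different decisions, so they form the boundary region, while the remaining two objects fall into the positive region.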
Obvously, πp nc = {X π P λ(x) > 1}, πp con = {X π P λ(x) =1}, πp nc πp con = π P,andπP nc πp con = { }. Defnton 2 Let S =(U, D), P, a P, and X π P. Then the famly of sub-blocks determned by a on X s defned as θ(x, a) ={{u X f(u, a) =b f(u, a) = } b V a}. The nconsstent and consstent sub-famles of θ(x, a) are defned as θ nc (X, a) ={Y θ(x, a) λ(y ) > 1} and θ con (X, a) ={Y θ(x, a) λ(y ) =1}, respectvely. Theorem 1 Let S =(U, D), P, a P,and X πp con.thenanyy θ(x, a) s a T-block. Proof Y θ(x, a) gves Y X. Snce X πp con, λ(x) = 1, whch yelds λ(y ) = 1. Ths means Y s a T-block. The theorem holds. Defnton 3 Let S =(U, D), P, a P,and W a famly of tolerance blocks. Then the famly of sub-blocks determned by a on W s defned as ω(w, a) ={θ(x, a) X W }. The nconsstent and consstent sub-famles of ω(w, a) are defned as ω nc (W, a) ={Y ω(w, a) λ(y ) > 1} and ω con (W, a) ={Y ω(w, a) λ(y ) =1}, respectvely. Theorem 2 Let S =(U, D), P, a P, and W a famly of T-blocks. Then any Y ω(w, a) sa T-block. Proof For any X W, X s a T-block. By Theorem 1, any Z θ(x, a) s also a T-block. ω(w, a) ={θ(x, a) X W }, whch mples that any Y ω(w, a) s a T-block. The theorem holds. Theorem 3 Let S =(U, D), P, a P. Then P {a} = ω nc (πp nc,a). Proof Assume P = {a 1,a 2,,a t}. By Defnton 2 and 3, we have π P = ω( (ω(θ(u, a 1),a 2), ),a t) and

2. UB-tree

A UB-tree is a binary tree in which the right (or left) node keeps growing while the left (or right) node has no children. In this study, our objective is to use RST to build a UB-tree whose growing inconsistent nodes are played by the inconsistent families of an incomplete decision system.

Let $S = (U, C \cup D)$, $C = \{c_1, c_2, \ldots, c_n\}$, $\vec{C} = \langle a_1, a_2, \ldots, a_n \rangle$, and $\vec{C}_i = \langle a_1, a_2, \ldots, a_i \rangle$, where $\vec{C}$ is an arbitrary permutation of C. We say that UBT($\vec{C}$) is a UB-tree of S with respect to $\vec{C}$, constructed according to the following process:

(1) The root of UBT($\vec{C}$) is the family that contains only the universe. Since $|\lambda(U)| > 1$, U is an IT-block. This means that the root is an inconsistent node.

(2) Let $\pi_{\vec{C}_1}^{con}$ and $\pi_{\vec{C}_1}^{inc}$ be the consistent and inconsistent child nodes of the root, respectively. Then $\pi_{\vec{C}_1}^{con} = \omega^{con}(\{U\}, a_1)$ and $\pi_{\vec{C}_1}^{inc} = \omega^{inc}(\{U\}, a_1)$. From Theorem 2, we know that any sub-block derived from $\pi_{\vec{C}_1}^{con}$ is also a CT-block. In other words, inconsistent tolerance sub-blocks are derivable only from $\pi_{\vec{C}_1}^{inc}$. So $\pi_{\vec{C}_1}^{inc}$ is taken as the father node for derivation, while $\pi_{\vec{C}_1}^{con}$ is regarded as a leaf node, called a consistent leaf node.

(3) Let $\pi_{\vec{C}_2}^{con}$ and $\pi_{\vec{C}_2}^{inc}$ be the two children of the node $\pi_{\vec{C}_1}^{inc}$. According to Theorem 3, $\pi_{\vec{C}_2}^{con} = \omega^{con}(\pi_{\vec{C}_1}^{inc}, a_2)$ and $\pi_{\vec{C}_2}^{inc} = \omega^{inc}(\pi_{\vec{C}_1}^{inc}, a_2)$. Similarly, only $\pi_{\vec{C}_2}^{inc}$ is selected as the father node for continued growth.

(4) Likewise, we recursively arrive at $\pi_{\vec{C}_n}^{con}$ and $\pi_{\vec{C}_n}^{inc}$, the two sub-nodes of $\pi_{\vec{C}_{n-1}}^{inc}$, such that $\pi_{\vec{C}_n}^{con} = \omega^{con}(\pi_{\vec{C}_{n-1}}^{inc}, a_n)$ and $\pi_{\vec{C}_n}^{inc} = \omega^{inc}(\pi_{\vec{C}_{n-1}}^{inc}, a_n)$. Since $\pi_{\vec{C}_n}^{inc}$ has no children, it is called the inconsistent leaf node.

Fig. 1 visualizes the construction of UBT($\vec{C}$).

[Fig. 1. UBT($\vec{C}$)]

3. Properties of UB-tree

Theorem 4. Let $S = (U, C \cup D)$, $C = \{c_1, c_2, \ldots, c_n\}$, and $\vec{C} = \langle a_1, a_2, \ldots, a_n \rangle$, where $\vec{C}$ is an arbitrary permutation of C. Then each inconsistent node of UBT($\vec{C}$) determines the boundary region with respect to $\vec{C}_i$ such that $BND_{\vec{C}_i}(\pi_D) = \bigcup\{X \mid X \in \pi_{\vec{C}_i}^{inc}\}$, where $\vec{C}_i = \langle a_1, a_2, \ldots, a_i \rangle$.

Proof. For any $u \in BND_{\vec{C}_i}(\pi_D)$, there does not exist any decision class $D_0$ ($D_0 \in \pi_D$) such that $S_{\vec{C}_i}(u) \subseteq D_0$. Since $S_{\vec{C}_i}(u) = \bigcup_{X \in \pi_{\vec{C}_i}(u)} X$, there must exist at least one IT-block X containing u. This means $X \in \pi_{\vec{C}_i}^{inc}$. In other words, $BND_{\vec{C}_i}(\pi_D) \subseteq \bigcup\{X \mid X \in \pi_{\vec{C}_i}^{inc}\}$. On the other hand, for any $u \in X$ with $X \in \pi_{\vec{C}_i}^{inc}$, $S_{\vec{C}_i}(u) = \bigcup_{X \in \pi_{\vec{C}_i}(u)} X$. Since $|\lambda(X)| > 1$, $|\lambda(S_{\vec{C}_i}(u))| > 1$, which means there does not exist any decision class $D_0$ ($D_0 \in \pi_D$) such that $S_{\vec{C}_i}(u) \subseteq D_0$. So $u \notin POS_{\vec{C}_i}(\pi_D)$, implying $u \in BND_{\vec{C}_i}(\pi_D)$. In other words, $\bigcup\{X \mid X \in \pi_{\vec{C}_i}^{inc}\} \subseteq BND_{\vec{C}_i}(\pi_D)$. According to the discussion above, $BND_{\vec{C}_i}(\pi_D) = \bigcup\{X \mid X \in \pi_{\vec{C}_i}^{inc}\}$. The theorem holds.

Theorem 4 unravels the relationship between an inconsistent node and the boundary region: the boundary region is in fact the union of the IT-blocks in the inconsistent node.

Theorem 5. Let $S = (U, C \cup D)$, $C = \{c_1, c_2, \ldots, c_n\}$, and $\vec{C} = \langle a_1, a_2, \ldots, a_n \rangle$, where $\vec{C}$ is an arbitrary permutation of C. If $i \geq j$, then $BND_{\vec{C}_i}(\pi_D) \subseteq BND_{\vec{C}_j}(\pi_D)$, where $\vec{C}_i = \langle a_1, a_2, \ldots, a_i \rangle$ and $\vec{C}_j = \langle a_1, a_2, \ldots, a_j \rangle$.

Proof. Assume $i = j + k$; then $\pi_{\vec{C}_i}^{inc} = \omega^{inc}(\cdots(\omega^{inc}(\omega^{inc}(\pi_{\vec{C}_j}^{inc}, a_{j+1}), a_{j+2}), \cdots), a_{j+k})$. For any $Y \in \omega^{inc}(\pi_{\vec{C}_j}^{inc}, a_{j+1})$, there must exist $Z \in \pi_{\vec{C}_j}^{inc}$ such that $Y \subseteq Z$. Then $\bigcup\{Y \mid Y \in \omega^{inc}(\pi_{\vec{C}_j}^{inc}, a_{j+1})\} \subseteq \bigcup\{Z \mid Z \in \pi_{\vec{C}_j}^{inc}\}$. Accordingly, we have $\bigcup\{X \mid X \in \pi_{\vec{C}_i}^{inc}\} \subseteq \bigcup\{X \mid X \in \pi_{\vec{C}_j}^{inc}\}$, which yields $BND_{\vec{C}_i}(\pi_D) \subseteq BND_{\vec{C}_j}(\pi_D)$. The theorem holds.

The theorem implies that the deeper an inconsistent node is, the narrower the corresponding boundary region is. Since each inconsistent node corresponds to one boundary region, the UB-tree in fact offers a sequence of gradually reduced boundary regions, which portrays in detail how the boundary region becomes narrower and narrower until it reaches that of the system.
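
Putting Theorems 3 and 4 together, the whole chain of inconsistent nodes, and with it the whole sequence of shrinking boundary regions, can be produced in a single pass over the attribute ordering. The sketch below reuses the hypothetical omega_inc helper from the previous sketch; it illustrates the construction of Fig. 1 and is not the authors' implementation.

    def build_ub_tree(table, ordering, decision):
        # Grow UBT(ordering): split the current inconsistent node by the next attribute
        # and record the boundary region that node determines (Theorem 4).
        inc_node = [set(range(len(table)))]      # the root: the universe as one IT-block
        levels = []
        for a in ordering:
            inc_node = omega_inc(table, inc_node, a, decision)
            boundary = set().union(*inc_node)    # union of IT-blocks = BND (Theorem 4)
            levels.append((a, boundary))
        return levels

By Theorem 5 the recorded boundaries can only shrink along this list, and the last one is the boundary region of the system itself.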

IV. Feature Selection Based on UB-tree

The UB-tree based feature selection algorithm (UBT-FS) is built on the UB-tree, marrying the boundary region based significance with the boundary region based evaluation criterion. The boundary region based significance is defined as $Sig(a, P, D) = (|BND_P(\pi_D)| - |BND_{P \cup \{a\}}(\pi_D)|) / |U|$, which expresses how much the boundary region shrinks if the attribute a is added to the set P. It acts as a router to determine the optimal search path. The boundary region based evaluation criterion determines whether an attribute subset P is a reduct or not by two conditions: (1) $BND_P(\pi_D) = BND_C(\pi_D)$, and (2) $BND_{P'}(\pi_D) \neq BND_C(\pi_D)$ for any $P' \subset P$. It works as the rule for identifying feature subsets.
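
Expressed with the hypothetical regions helper sketched in Section II, the significance measure and the two-part criterion look roughly as follows; the exhaustive subset check in the second condition is for illustration only and is not meant to reflect the complexity of UBT-FS itself.

    from itertools import combinations

    def boundary(table, attrs, decision):
        # Boundary region of the attribute subset, via the regions() sketch above.
        return regions(table, list(attrs), decision)[1]

    def significance(table, a, P, decision):
        # Sig(a, P, D) = (|BND_P| - |BND_{P + {a}}|) / |U|
        return (len(boundary(table, P, decision))
                - len(boundary(table, list(P) + [a], decision))) / len(table)

    def is_reduct(table, P, C, decision):
        full = boundary(table, C, decision)
        if boundary(table, P, decision) != full:            # condition (1)
            return False
        return all(boundary(table, Q, decision) != full     # condition (2): no proper
                   for r in range(len(P))                   # subset preserves BND_C
                   for Q in combinations(P, r))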

Table 1 gives the detailed description of the UBT-FS algorithm. UBT-FS starts with the root and keeps selecting optimal attributes to split the inconsistent node repeatedly. An optimal attribute is obtained as follows: each of the unselected attributes is used to work on the current inconsistent node, the new inconsistent node is generated, and from it the corresponding boundary region is derived. By evaluating the boundary region based significance, the attribute with the biggest significance value is selected as the expected one. The loop terminates when the newly generated boundary region is identical with that of the system. At this point, the set of selected attributes is the desired reduct.

Table 1. UBT-FS algorithm
Input: $S = (U, C \cup D)$, where $C = \{c_1, c_2, \ldots, c_n\}$ and $D = \{d\}$
Output: a reduct R
begin:
  Let $B = \emptyset$, $\pi_B^{inc} = \{U\}$, $BND_B(\pi_D) = U$, and $P = C$
  Compute $BND_C(\pi_D)$
  Repeat until $BND_B(\pi_D) = BND_C(\pi_D)$ or $P = \emptyset$:
    Select the optimal attribute $c_{opt}$ by the function SelectOptAttr($\pi_B^{inc}$, $BND_B(\pi_D)$, $P$)
    Use $c_{opt}$ to derive the inconsistent node $\pi_{B \cup \{c_{opt}\}}^{inc}$ from $\pi_B^{inc}$ such that $\pi_{B \cup \{c_{opt}\}}^{inc} = \omega^{inc}(\pi_B^{inc}, c_{opt})$
    Induce the boundary region from $\pi_{B \cup \{c_{opt}\}}^{inc}$ such that $BND_{B \cup \{c_{opt}\}}(\pi_D) = \bigcup\{X \mid X \in \pi_{B \cup \{c_{opt}\}}^{inc}\}$
    Let $B = B \cup \{c_{opt}\}$, $\pi_B^{inc} = \pi_{B \cup \{c_{opt}\}}^{inc}$, $BND_B(\pi_D) = BND_{B \cup \{c_{opt}\}}(\pi_D)$, and $P = P - \{c_{opt}\}$
  Let $R = B$
end

Function SelectOptAttr($\pi_B^{inc}$, $BND_B(\pi_D)$, $P$)
begin:
  For each attribute $c_i$ in $P$:
    Compute $\pi_{B \cup \{c_i\}}^{inc} = \omega^{inc}(\pi_B^{inc}, c_i)$
    Compute $BND_{B \cup \{c_i\}}(\pi_D) = \bigcup\{X \mid X \in \pi_{B \cup \{c_i\}}^{inc}\}$
    Compute $Sig(c_i, B, D) = (|BND_B(\pi_D)| - |BND_{B \cup \{c_i\}}(\pi_D)|) / |U|$
  Return $c_{opt}$ with $Sig(c_{opt}, B, D) = \max\{Sig(c_i, B, D) \mid c_i \in P\}$
end
(Here B denotes the set of attributes selected so far.)

Several highlights distinguish UBT-FS. First, each inconsistent node, although determined by a certain attribute set, is derivable from its predecessor with a single attribute, which greatly improves the computational efficiency. Second, every inconsistent node corresponds to a certain boundary region, making it easy to create the boundary region. Third, obtaining the series of boundary regions requires traversing the entire condition attribute set only once. Finally, the boundary region is useful for setting up the measure of attribute significance and the criterion for identifying reducts. These advantages allow UBT-FS to achieve a time complexity much less than $O(|C|^2 |U|)$. Compared with existing feature selection algorithms for incomplete decision systems, whose time complexities are no less than $O(|C|^2 |U| \log |U|)$ [12], UBT-FS achieves an obviously lower time complexity.
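
A compact Python rendering of the procedure in Table 1, under the same illustrative assumptions as the earlier sketches (omega_inc and regions are the hypothetical helpers defined there, and ties in significance are broken arbitrarily), might read:

    def ubt_fs(table, C, decision):
        # UBT-FS (Table 1): repeatedly split the current inconsistent node with the
        # attribute whose boundary-region-based significance is largest.
        universe = set(range(len(table)))
        target = regions(table, list(C), decision)[1]      # BND_C, the stopping target
        inc_node, bnd = [universe], universe
        selected, remaining = [], list(C)
        while bnd != target and remaining:
            best = None
            for c in remaining:
                node = omega_inc(table, inc_node, c, decision)   # one-attribute derivation
                cand = set().union(*node)                        # boundary of selected + {c}
                sig = (len(bnd) - len(cand)) / len(table)        # Sig(c, selected, D)
                if best is None or sig > best[0]:
                    best = (sig, c, node, cand)
            _, c_opt, inc_node, bnd = best
            selected.append(c_opt)
            remaining.remove(c_opt)
        return selected

Because every candidate is evaluated by splitting only the current inconsistent node rather than re-partitioning the whole universe, the work per iteration is tied to the shrinking boundary region, which is consistent with the efficiency argument above.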

V. Experimental Evaluation

In this section, we carry out several experiments on a personal computer with Windows XP, a 2.53GHz CPU, and 2.0GB of memory to evaluate the effectiveness of UBT-FS. UBT-FS is compared with three typical algorithms which conduct feature selection by employing the discernibility matrix [4], the positive region [12], and combination entropy [15]. For convenience, they are called the Discernibility matrix based feature selection algorithm (DM-FS), the Positive region based feature selection algorithm (PR-FS), and the Combination entropy based feature selection algorithm (CE-FS), respectively. DM-FS considers the discernibility of the tolerance relation and uses discernible attributes obtained by pairwise comparisons of objects to build the discernibility matrix. It has a time complexity no less than $O(|C|^2 |U|^2)$. PR-FS focuses on the indiscernibility of the tolerance relation and captures indiscernible objects as tolerance classes to construct the positive region of a feature subset. Its time complexity is $O(|C|^2 |U| \log |U|)$. CE-FS introduces combination entropy as the measure of attribute significance. It uses the core attributes as the initial reduct and keeps adding one attribute of high significance at a time until the conditional combination entropy of the current set is equal to that of the whole attribute set. The time complexity of CE-FS is $O(|C|^2 |U|^2)$. The four algorithms (UBT-FS, DM-FS, PR-FS, and CE-FS) are applied to eight publicly accessible datasets from the UCI Repository of machine learning databases [18]. The experimental results, including the size of the selected subset and the running time expressed in seconds, are presented in Table 2.

[Table 2. Experimental results of the four algorithms — for each dataset (Lung cancer, Standardized audiology, Congressional voting, Balance scale weight, Tic-Tac-Toe endgame, Car evaluation, Chess end-game, Nursery) the table lists the numbers of objects and features together with the selected subset size and running time of DM-FS, CE-FS, PR-FS, and UBT-FS.]

Table 2 shows that the size of the subset selected by DM-FS is the smallest of the four algorithms. This indicates that DM-FS can undoubtedly find the minimal reduct, while the other three algorithms cannot guarantee selection of the optimal subset. One can also observe that the results of UBT-FS, PR-FS, and CE-FS are almost the same and very close to those of DM-FS, which implies that UBT-FS, PR-FS, and CE-FS have a similar ability to seek out optimal or suboptimal feature subsets, though they cannot guarantee the best performance every time.

On the other hand, from Table 2 we can observe that when the dataset scale is relatively small, UBT-FS takes about as much time as PR-FS, while DM-FS and CE-FS are more time consuming. As the dataset scale increases, DM-FS and CE-FS consume too much time to be acceptable, PR-FS also costs much time, and only UBT-FS keeps a relatively low time consumption.

Evidently, the running times of DM-FS, CE-FS, and PR-FS increase much more rapidly than that of UBT-FS. The differences can be illustrated by plotting the ratios of DM-FS, CE-FS, and PR-FS to UBT-FS, respectively, as shown in Fig. 2.

[Fig. 2. Ratios of DM-FS, CE-FS, and PR-FS to UBT-FS]

Fig. 2 illustrates that the larger the dataset scale is, the greater the ratio is. For example, the ratio of PR-FS to UBT-FS on Dataset 3 is almost equal to 1, while that on Dataset 8 reaches 62. This means that PR-FS and UBT-FS share almost the same time consumption on the Congressional Voting dataset, whereas the time consumption of PR-FS is 62-fold that of UBT-FS when they work on the Nursery dataset, not to mention DM-FS and CE-FS. One can also observe that although the slope of each curve tends to increase with dataset size, the curves fluctuate significantly. For example, the ratio of DM-FS to UBT-FS on Dataset 2 is higher than that on Dataset 3. This arises from the different attribute numbers of the datasets (e.g., the Standardized Audiology dataset has 69 attributes but the Congressional Voting dataset has 16). In fact, the time consumption of a feature selection algorithm is determined by both the object number and the attribute number of a dataset. To further demonstrate the influence of each individual factor on running time, we design the next two experiments.

To illustrate the influence of the object number on running time, we run the four algorithms on ten sub-datasets of the Nursery dataset, generated according to the following rules. The objects of the Nursery dataset are divided into ten groups of equal size. The first group is considered the first dataset, the combination of the first dataset and the second group is regarded as the second dataset, the combination of the second dataset and the third group is referred to as the third dataset, and so on. Therefore, these ten sub-datasets of the Nursery dataset have the same attributes but different numbers of objects. The corresponding experimental results are presented in Fig. 3.

[Fig. 3. Running time on ten sub-datasets of the Nursery dataset]

Fig. 3 shows that the curves corresponding to DM-FS, CE-FS, and PR-FS are approximately quadratic, while the curve corresponding to UBT-FS is almost linear. This experiment verifies that the time consumption of DM-FS, CE-FS, and PR-FS increases much more rapidly than that of UBT-FS when the attribute number is constant.

The other experiment aims to demonstrate the influence of the attribute number on running time. The four algorithms work on ten new datasets derived from the expanded Standardized Audiology dataset. As the Standardized Audiology dataset contains only 200 objects, randomly generated objects are appended to it until it reaches the desired size. The attributes of the new dataset are segmented into ten parts of equal size. We take the first part as the first dataset, treat the union of the first dataset and the second part as the second dataset, take the union of the second dataset and the third part as the third dataset, and so on. These ten datasets have the same number of objects but different attribute numbers. Fig. 4 presents the corresponding experimental results.

[Fig. 4. Running time on ten new datasets]

Fig. 4 shows that the curves corresponding to DM-FS and CE-FS are approximately quadratic, while those corresponding to PR-FS and UBT-FS are almost linear, with the slope of the PR-FS curve steeper than that of the UBT-FS curve. This means that the time consumption of DM-FS, CE-FS, and PR-FS increases much more rapidly than that of UBT-FS, which verifies that UBT-FS provides the least running time when the object number is constant.
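
For reproducibility, the cumulative splits used in these two experiments amount to taking growing prefixes of the object list or of the attribute list. The sketch below is an assumption about how such sub-datasets could be generated; it is not taken from the authors' experimental scripts.

    def cumulative_object_splits(rows, parts=10):
        # Ten sub-datasets with the same attributes but growing object counts (Fig. 3).
        size = len(rows) // parts
        return [rows[:size * k] for k in range(1, parts + 1)]

    def cumulative_attribute_splits(attrs, parts=10):
        # Ten attribute subsets of growing size over the same objects (Fig. 4).
        size = len(attrs) // parts
        return [attrs[:size * k] for k in range(1, parts + 1)]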

VI. Conclusions

Incomplete data are common in real world applications, and many feature selection algorithms for incomplete decision systems have been proposed. Since their time complexities are not less than $O(|C|^2 |U| \log |U|)$, they become intolerable when dealing with voluminous data. This paper has developed UBT-FS for incomplete decision systems; its time complexity is not more than $O(|C|^2 |U|)$. Unlike other algorithms based on the lower approximation, UBT-FS adopts the UB-tree technique, which specializes in accelerating the calculation of the boundary region. This mechanism, along with the use of the boundary region based significance to determine the optimal search path and the boundary region based evaluation criterion to identify feature subsets, makes UBT-FS capable of finding a reduct effectively.
Both theoretical studies and experimental results demonstrate that UBT-FS indeed outperforms other available algorithms with regard to time efficiency.

References
[1] J. Zhong, et al., A novel feature selection method based on probability latent semantic analysis for Chinese text classification, Chinese Journal of Electronics, Vol.20, No.2.
[2] B. Leng, et al., MATE: A visual based 3D shape descriptor, Chinese Journal of Electronics, Vol.18.
[3] Z. Pawlak, Rough sets, International Journal of Computer and Information Sciences, Vol.11, No.5.
[4] M. Kryszkiewicz, Rough set approach to incomplete information systems, Information Sciences, Vol.112, No.1-4, pp.39-49.
[5] Y. Leung, D.Y. Li, Maximal consistent block technique for rule acquisition in incomplete information systems, Information Sciences, Vol.153, No.1.
[6] Y.M. Chen, et al., A rough set approach to feature selection based on power set tree, Knowledge-Based Systems, Vol.24, No.2.
[7] M.E. ElAlami, A filter model for feature subset selection based on genetic algorithm, Knowledge-Based Systems, Vol.22, No.5.
[8] M. Dash, H. Liu, Consistency-based search in feature selection, Artificial Intelligence, Vol.151, No.1-2.
[9] Y. Qian, et al., Positive approximation: An accelerator for attribute reduction in rough set theory, Artificial Intelligence, Vol.174, No.9-10.
[10] Y.H. Qian, et al., An efficient accelerator for attribute reduction from incomplete data in rough set framework, Pattern Recognition, Vol.44, No.8.
[11] A. Chouchoulas, Q. Shen, Rough set-aided keyword reduction for text categorization, Applied Artificial Intelligence, Vol.15, No.9.
[12] Z.Q. Meng, Z.Z. Shi, A fast approach to attribute reduction in incomplete decision systems with tolerance relation-based rough sets, Information Sciences, Vol.179, No.16.
[13] N.M. Parthaláin, et al., A distance measure approach to exploring the rough set boundary region for attribute reduction, IEEE Transactions on Knowledge and Data Engineering, Vol.22, No.3.
[14] J. Qian, et al., Hybrid approaches to attribute reduction based on indiscernibility and discernibility relation, International Journal of Approximate Reasoning, Vol.52, No.2.
[15] Y.H. Qian, J.Y. Liang, Combination entropy and combination granulation in rough set theory, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol.16, No.2.
[16] L. Sun, et al., Feature selection using rough entropy-based uncertainty measures in incomplete decision systems, Knowledge-Based Systems, Vol.36, No.1.
[17] J. Zhou, et al., Analysis of alternative objective functions for attribute reduction in complete decision tables, Soft Computing, Vol.15, No.8.
[18] A. Asuncion, D.J. Newman, UCI Machine Learning Repository, http://www.ics.uci.edu/~mlearn/MLRepository.html.

LU Zhengcai received B.S. and M.S. degrees from the Academy of Equipment Command and Technology, China, in 1997 and 2000, respectively. He is currently working toward the Ph.D. degree in the Department of Computer Science and Technology, Tsinghua University, China. His research interests include machine learning and data mining. (Email: luzc09@mails.tsinghua.edu.cn)

QIN Zheng was born in Hunan Province, China. He is a professor in the School of Software, Tsinghua University, China. His major research interests include software architecture, data fusion and artificial intelligence. (Email: qingzh@mail.tsinghua.edu.cn)

JIN Qiao received the B.S. degree from the School of Software, Tsinghua University, China.
He is currently an M.S. candidate in the Department of Computer Science and Technology, Tsinghua University, Beijing, China. His research interests include data mining and other artificial intelligence based techniques. (Email: jinq07@mails.tsinghua.edu.cn)

LI Shengnan received the B.S. degree in Software Engineering from the School of Software, Nankai University, China. She is currently a Ph.D. candidate in the Department of Computer Science and Technology, Tsinghua University, Beijing, China. Her research interests include machine learning and data fusion. (Email: xiaoling4329@gmail.com)
