Modify Bayesian Network Structure with Inconsistent Constraints

Yi Sun and Yun Peng
Department of Computer Science and Electrical Engineering
University of Maryland, Baltimore County
Baltimore, MD

Abstract

This paper presents a theoretical framework and related methods for integrating probabilistic knowledge represented as low-dimensional distributions (also called constraints) into an existing Bayesian network (BN), even when these constraints are inconsistent with the structure of the BN because dependencies among relevant variables in the constraints are absent in the BN. Within this framework, a method has been developed to identify structural inconsistencies. Methods have also been developed to overcome such inconsistencies by modifying the structure of the existing BN.

1 Introduction

Knowledge integration involves modifying an existing knowledge base according to newly discovered knowledge. This process can be difficult because pieces of new knowledge coming from different sources may conflict with each other or with the existing knowledge base. In this situation we say the pieces of new knowledge are inconsistent. In this paper we focus on a specific kind of knowledge integration, i.e., discrete probabilistic knowledge integration for uncertain knowledge, where the knowledge base is represented as a discrete joint probability distribution (JPD) and pieces of new knowledge are represented as lower-dimensional distributions, which are also called constraints. We are especially interested in representing the knowledge base using a Bayesian network (BN) for its compact representation of interdependencies (Pearl 1988). Each node in a BN represents a variable in the knowledge base, and arcs between the nodes represent the interdependencies among variables. All the nodes and edges form the directed acyclic graph (DAG) of the BN, which we refer to as the structure of the BN. A conditional probability table (CPT) is attached to each node to represent the strength of the interdependencies.
Using a BN introduces an additional challenge for integration: a constraint may conflict with the structure of the existing BN when the probabilistic dependencies among some of the variables in the constraint are absent in the BN. We call this kind of constraint a structurally inconsistent constraint. When structural inconsistency occurs, one can choose to 1) keep the structure of the BN, integrate as much of the knowledge in the constraint as is consistent, and ignore or reject the inconsistent part; 2) modify the structure of the BN so that the knowledge in the constraint can be completely integrated into the BN; or 3) find a trade-off between the two options above. Existing methods in this area mainly take the first option: the integration of pieces of new knowledge is carried out by updating only the parameters of the existing BN while keeping the network structure intact. The rationale is that in many situations the structure of the existing BN represents the more stable and reliable aspects of the domain knowledge, and it is therefore desirable not to change it frequently. In this paper, we explore the second option: completely integrating structurally inconsistent constraints. The motivation is that when new pieces of knowledge are more up-to-date or come from reliable sources, so that the new dependencies they bring should be accepted, it is necessary and beneficial to modify the structure of the existing BN so that the new constraints, including the new dependencies they contain, can be integrated into the BN in their entirety. The rest of this paper is organized as follows. In Section 2 we introduce some background and formally define the problem we set out to solve. In Section 3 we briefly review existing work related to this problem. In Section 4 we explore how to identify structural inconsistencies between the constraints and the BN. In Section 5 we investigate how to overcome the identified structural inconsistencies during knowledge integration. Finally, we show the effectiveness of the proposed methods through experiments in Section 6 and conclude the paper in Section 7 with directions for future research.

2 Background and the Problem

We follow these conventions in this paper. To name a set of variables, we use capital italic letters such as X, Y, Z, with superscripts such as X^i, Y^j, Z^k for subsets of variables; to name their instantiations, we use the corresponding lower-case italic letters x, y, z. To name an individual variable, we use capital italic letters such as A, B, C, and for their instantiations the corresponding lower-case letters a, b, c. If an individual variable belongs to a set, we use a subscript to indicate its position in the set, such as X_i for the i-th variable in set X, and x_i for its instantiation. P, Q, R are reserved for probability distributions, and bold P, Q, R for sets of distributions.

Knowledge integration refers to the process of incorporating new knowledge into an existing knowledge base. A probabilistic knowledge base can be represented by a JPD P(X), where X is the set of variables in the domain. The new knowledge to be integrated into P(X) usually comes from more up-to-date or more specific observations of a certain aspect of the domain. It can be represented as a lower-dimensional probability distribution over a subset of X, which is called a probabilistic constraint, or constraint for short. We use the distribution R_i(Y^i) to denote the i-th piece of new knowledge, or the i-th constraint, which is a distribution over the set of variables Y^i, where Y^i ⊆ X. A set of m constraints can be represented as R = {R_1(Y^1), ..., R_m(Y^m)}.

Definition 1 (Constraint Satisfaction) Let P(X) be a JPD and R(Y) a constraint, where Y ⊆ X. P(X) is said to satisfy R(Y) if P(Y) = R(Y), i.e., the marginal distribution of P(X) on the variable set Y equals R(Y).

Definition 2 (Consistent Constraints) If there is at least one JPD P(X) that satisfies all the constraints in R, i.e., for every R_i(Y^i) in R, P(Y^i) = R_i(Y^i) holds, then we say the constraints in R are consistent.
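Definition 1 reduces to marginalizing P(X) onto Y and comparing tables entry by entry. A minimal sketch (the dict-based JPD representation and helper names are ours, not the paper's; binary variables for brevity):

```python
from itertools import product

def marginalize(jpd, variables, subset):
    """Sum a JPD (dict: tuple of states -> probability) onto a subset of variables."""
    idx = [variables.index(v) for v in subset]
    marg = {}
    for states, p in jpd.items():
        key = tuple(states[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return marg

def satisfies(jpd, variables, constraint, c_vars, tol=1e-9):
    """Definition 1: P(X) satisfies R(Y) iff the marginal P(Y) equals R(Y)."""
    marg = marginalize(jpd, variables, c_vars)
    return all(abs(marg.get(y, 0.0) - r) <= tol for y, r in constraint.items())

# Example: P(A, B) of two independent fair coins; constraint R(A) = (0.5, 0.5).
P = {(a, b): 0.25 for a, b in product([0, 1], repeat=2)}
R = {(0,): 0.5, (1,): 0.5}
print(satisfies(P, ["A", "B"], R, ["A"]))  # -> True
```

The tolerance parameter avoids spurious failures from floating-point rounding when the marginal is computed by summation.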
Definition 3 (Inconsistent Constraints) If there does not exist a JPD that satisfies all the constraints in R, then we say the constraints in R are inconsistent.

The probabilistic knowledge integration problem is to modify P(X) to satisfy all the constraints in R. Meanwhile, it is desirable to minimize the modification to P(X) during the integration process, which can be measured using metrics such as I-divergence (Csiszar 1975; Kullback and Leibler 1951; Vomlel 1999) or its weighted version, I-aggregate (Vomlel 2003), defined below.

Definition 4 (I-divergence) Given two probability distributions P(X) and Q(X), the I-divergence from P(X) to Q(X) is:

  I(P || Q) = Σ_{x∈X} P(x) log( P(x) / Q(x) )  if P << Q, and ∞ otherwise,   (1)

where P << Q means Q dominates P, i.e., {x | P(x) > 0} ⊆ {x | Q(x) > 0}.

Definition 5 (I-aggregate) Let P be a JPD, Q = {Q_1, ..., Q_m} a set of distributions, and α_i a nonnegative weight for Q_i. The I-aggregate is the weighted sum of the I-divergences for all of the distributions in Q, i.e.,

  I_agg(P, Q) = Σ_i α_i I(P || Q_i),   (2)

where 0 < α_i < 1 and Σ_i α_i = 1.

BNs have been widely used for representing large uncertain knowledge bases. For a given BN G, we use G_S to refer to the DAG, i.e., the network structure of G, and G_P to refer to the set of CPTs. A BN can then be represented as G = (G_S, G_P), where G_S = {(X_i, π_i)}. Here π_i is the set of parents of X_i, and G_P = {P(X_i | π_i)}. The following are some graph definitions based on the DAG of the BN (Geiger, Verma and Pearl 1990).

Definition 6 (I-Map) Given a DAG G_S and a JPD P over the same set of variables as G_S, if every independency implied by G_S holds in P, then we say G_S is an I-Map of P (Geiger, Verma and Pearl 1990). A complete DAG is an I-Map of any distribution.

Definition 7 (Minimal I-Map) A minimal I-Map of P is a DAG G_S such that removing any arc from G_S introduces independencies that do not hold in P (Geiger, Verma and Pearl 1990). From the definition of a minimal I-Map we know that, to represent P, the BN structure G_S needs to be an I-Map of P, even if it is not a minimal I-Map of P.
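Definitions 4 and 5 translate directly into code. The following sketch (our own helper names; outcomes as dict keys) computes Eqs. (1) and (2), returning infinity when the dominance condition fails:

```python
import math

def i_divergence(p, q):
    """Eq. (1): I(P||Q) = sum_x P(x) log(P(x)/Q(x)); infinite unless Q dominates P."""
    total = 0.0
    for x, px in p.items():
        if px == 0.0:
            continue  # 0 * log(0/q) contributes 0 by convention
        qx = q.get(x, 0.0)
        if qx == 0.0:
            return math.inf  # Q does not dominate P
        total += px * math.log(px / qx)
    return total

def i_aggregate(p, qs, weights):
    """Eq. (2): weighted sum of I-divergences, with weights in (0,1) summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * i_divergence(p, q) for w, q in zip(weights, qs))

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}
print(round(i_divergence(p, p), 6))  # -> 0.0
print(i_divergence(p, q) > 0)        # -> True
```

Note that I-divergence is asymmetric, which is why Eq. (3) below projects the current distribution onto each constraint rather than the reverse.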
That means G_S can have more dependencies than P, but not fewer (Geiger, Verma and Pearl 1990). This leads to our definition of a structurally inconsistent constraint.

Definition 8 (Structurally Inconsistent Constraint) Given a DAG G_S over a set of variables X and a constraint R over a set of variables Y ⊆ X, if every independency implied by G_S involving variables in Y holds in R, then we say R is structurally consistent with G_S. Otherwise, we say R is structurally inconsistent with G_S, or that R is a structurally inconsistent constraint for G_S.

Structurally inconsistent constraints (Definition 8) and inconsistent constraints (Definition 3) refer to different types of inconsistencies, which we call Type II and Type I inconsistency, respectively, in the rest of this paper. Type I inconsistency (Definition 3) is defined for a set of constraints; it can arise whether or not the knowledge base is represented as a BN. Type II inconsistency (Definition 8) is defined for an individual constraint with respect to a particular BN; it can occur only when the knowledge base is represented as a BN and some dependency relations in the constraint are absent in the BN. Figure 2 below shows some examples of inconsistent constraints for the BN in Figure 1. The constraints in Figure 2(a) have Type I inconsistency because R_1(A) ≠ R_2(A). The constraint in Figure 2(b) has Type II inconsistency because it contradicts the dependency relation in the BN structure of Figure 1, where B and C are independent given A, while R_3(B, C | A) ≠ R_3(B | A) R_3(C | A).

Figure 1: A 3-Node BN

Figure 2: Inconsistent Constraints. (a) Constraints with Type I inconsistency. (b) Constraint with Type II inconsistency.

With the definitions of Type I and Type II inconsistencies, the problem we set out to solve is formally defined as follows. Given a BN G = (G_S, G_P) with JPD P(X), and a set of constraints R = {R_1(Y^1), ..., R_m(Y^m)} with at least one constraint having Type II inconsistency, construct a new BN G' = (G'_S, G'_P) with probability distribution P' that meets the following conditions:

C1 (Constraint satisfaction): for every R_i(Y^i) in R, P'(Y^i) = R_i(Y^i);
C2 (Minimality): I(P'(X) || P(X)) is as small as possible.

3 Related Work

In this section we briefly review existing work related to the problem. IPFP (Iterative Proportional Fitting Procedure) (Kruithof 1937) and the virtual evidence method (Pearl 1990) are the two basic methods that underlie the other methods in this section. IPFP is a mathematical procedure that iteratively modifies a JPD to satisfy a set of constraints while maintaining minimum I-divergence to the original distribution. Vomlel applied IPFP to knowledge integration problems with consistent constraints (Vomlel 1999). The core of IPFP is I-projection (Valtorta, Kim and Vomlel 2002).
Specifically, for a set of constraints R and an initial JPD Q_0(X), the IPFP procedure iteratively modifies the current JPD Q_{k-1}(X) according to the following formula, using one constraint R_i(Y^i) in R at a time:

  Q_k(X) = Q_{k-1}(X) · R_i(Y^i) / Q_{k-1}(Y^i),   (3)

where the index i of the constraint used at iteration k is determined by k mod m (the constraints are applied cyclically), and m is the number of constraints in R. Here Q_k is said to be an I-projection of Q_{k-1}(X) on the set of all JPDs that satisfy R_i(Y^i), and as such, Q_k has the minimum I-divergence from Q_{k-1}(X) among all the JPDs that satisfy R_i(Y^i). One limitation of IPFP-based integration methods is that they work on JPDs, not on BNs. To address this, Peng and Ding proposed E-IPFP (Peng and Ding 2005), which extends IPFP by adding a structure constraint:

  Q_k(X) = Π_{i=1}^{n} Q_{k-1}(X_i | π_i),   (4)

where (X_i, π_i) ∈ G_S. Q_k is the JPD of the BN whose CPTs are extracted from Q_{k-1} according to G_S (Peng et al. 2012). When the constraints in R are consistent, E-IPFP converges to a new BN which 1) has the same structure as the given BN, 2) satisfies all the constraints in R, and 3) minimizes the changes to the given BN. Another limitation of IPFP is that it does not converge but oscillates when constraints are inconsistent (Csiszar 1975; Peng and Ding 2005; Vomlel 2003). To deal with inconsistent (i.e., Type I) constraints, Vomlel proposed CC-IPFP (Vomlel 1999; 2003) and GEMA (Vomlel 2003) by extending IPFP. Later, Peng et al. proposed SMOOTH (Zhang and Peng 2008), which performs better than CC-IPFP and GEMA. SMOOTH circumvents the inconsistency by making bi-directional modifications during the integration process. When the knowledge base is represented as a BN, SMOOTH can be extended to E-IPFP-SMOOTH (Peng and Zhang 2010), which can deal with both Type I and Type II inconsistencies by modifying the constraints. This is acceptable when the constraints are considered to contain some noise. But when the new dependency relations are considered reliable, E-IPFP-SMOOTH cannot integrate this information into the revised BN, since the structure of the BN is not changed.
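The I-projection step of Eq. (3) can be sketched as repeated rescalings of a joint table. The representation below (a dict from state tuples to probabilities, with helper names of our own) is an illustration of the procedure, not the paper's implementation:

```python
from itertools import product

def marginal(jpd, variables, subset):
    """Q(Y): sum the joint table onto the variables in `subset`."""
    idx = [variables.index(v) for v in subset]
    m = {}
    for states, p in jpd.items():
        key = tuple(states[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return m

def i_projection(jpd, variables, constraint, c_vars):
    """Eq. (3): Q_k(X) = Q_{k-1}(X) * R(Y) / Q_{k-1}(Y)."""
    idx = [variables.index(v) for v in c_vars]
    m = marginal(jpd, variables, c_vars)
    out = {}
    for states, p in jpd.items():
        key = tuple(states[i] for i in idx)
        out[states] = p * constraint[key] / m[key] if m[key] > 0 else 0.0
    return out

def ipfp(jpd, variables, constraints, sweeps=50):
    """Cycle through the constraints, I-projecting onto each in turn."""
    for _ in range(sweeps):
        for c_vars, r in constraints:
            jpd = i_projection(jpd, variables, r, c_vars)
    return jpd

# Uniform Q_0(A, B) and a single constraint R(A) = (0.7, 0.3).
P = {s: 0.25 for s in product([0, 1], repeat=2)}
P = ipfp(P, ["A", "B"], [(["A"], {(0,): 0.7, (1,): 0.3})])
print(marginal(P, ["A", "B"], ["A"]))  # marginal on A is now (0.7, 0.3)
```

With a single constraint the very first projection already satisfies it; with several consistent constraints the cyclic sweeps converge to the limit distribution described above.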

Next we briefly discuss the virtual evidence method, not only because it plays an important role in BN belief update with uncertain evidence, but also because it serves as a basis for our proposed methods in Section 5. Researchers have identified three kinds of evidence in BN belief update: hard evidence, virtual evidence, and soft evidence (Pan, Peng, and Ding 2006). Hard evidence specifies the particular state a variable is in; soft evidence specifies a probability distribution over the states of a variable (Valtorta, Kim, and Vomlel 2002); and virtual evidence specifies a likelihood ratio for the uncertainty of the observations of a variable's states (Pearl 1990). For example, if one believes that the event represented by variable A occurs with probability p, and does not occur with probability 1 − p, then the likelihood ratio can be represented as L(A) = p : (1 − p), which does not necessarily consist of specific probabilities. For belief update with virtual evidence, Pearl proposed the virtual evidence method (Pearl 1990). For virtual evidence on variable A with likelihood ratio L(A), this method extends the given BN by creating a binary virtual node V as the child of A, with state v standing for the event that A = a has occurred, and the CPT of V satisfying the following equation:

  P(v | A = a) : P(v | A ≠ a) = L(A).   (5)

It then treats v as hard evidence by instantiating V to v. This updates the beliefs in the given BN, and the updated BN satisfies the given virtual evidence. For belief update with soft evidence, Chan and Darwiche extended the virtual evidence method by first converting each soft evidence to a virtual evidence (Chan and Darwiche 2005). Let P(X) be a distribution and R(Y) a soft evidence, where Y ⊆ X. We use y_(1), ..., y_(l) to represent all the possible instantiations of Y, which form a mutually exclusive and exhaustive set of events. R(Y) can then be converted to a virtual evidence with the following likelihood ratio:

  L(Y) = R(y_(1))/P(y_(1)) : R(y_(2))/P(y_(2)) : ... : R(y_(l))/P(y_(l)).   (6)

For multiple virtual evidences, the belief update for one virtual evidence does not affect the belief update for the others (Pan, Peng, and Ding 2006). However, this is not the case for multiple soft evidences. For BN belief update with multiple soft evidences, Peng et al. proposed BN-IPFP (Peng, Zhang and Pan 2010), which combines Pearl's virtual evidence method with IPFP. It preserves the marginal distributions specified in all soft evidences after converting them to virtual evidences. BN-IPFP converges iteratively, and the resulting BN satisfies multiple soft evidences at the same time.

4 Identify Structural Inconsistencies

In this section we examine how to identify structural inconsistencies between a given constraint and a BN. This is done by checking whether every dependency relation in the constraint also holds in the BN. First we discover the dependency relations in the constraint and in the BN. We can use the d-separation method (Pearl 1988) to discover dependency relations in the BN. This method tells whether two variables are conditionally independent given another set of variables by examining the connection types between the two variables in the BN. We can use independence tests to find the dependency relations in the constraint. The zero-order (unconditional) independence test checks whether R(A, B) = R(A) R(B) holds for each pair of variables A and B in the variable set of R. Similarly, the i-th-order independence test for R checks whether Equation 7 holds for each pair of variables A and B, where Z is a set of i variables chosen from the variable set of R other than A and B:

  R(A, B | Z) = R(A | Z) R(B | Z).   (7)

The following is the algorithm for identifying structural inconsistencies between the constraints and the BN.

Algorithm INCIDENT
Input: BN G = (G_S, G_P) with an ordering of the nodes in G_S, and constraints R = {R_1, R_2, ..., R_m}.
Output: Structurally consistent constraint set R_con, structurally inconsistent constraint set R_inc, and dependency list DL.
Steps:
1. Create empty sets R_con and R_inc, and an empty list DL;
2. Perform the following steps for each constraint R_i in R:
   2.1 Create an empty list DL_i;
   2.2 For each pair of variables A and B in R_i, perform the zero-order through (|R_i| − 2)-order independence tests on them, where |R_i| is the number of variables in R_i. If a test fails, add ⟨R_i, A, B, Z⟩ to DL_i;
   2.3 For each entry ⟨R_i, A, B, Z⟩ in DL_i, use the d-separation method to test whether A and B are independent given Z in the BN, i.e., A ⊥ B | Z. If the test fails, remove this entry from DL_i;
   2.4 If DL_i is empty, add R_i to R_con. Otherwise, add R_i to R_inc and merge DL_i into DL;
3. Return R_con, R_inc, and DL.
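Step 2.2's zero-order test checks whether R(A, B) = R(A) R(B) holds over the constraint's table. A minimal sketch of the unconditional case (helper names and the tolerance are ours; the table values are R_2(B, C) from Section 6, where B and C are dependent):

```python
def marg(r, variables, subset):
    """Marginalize a constraint table (dict: state tuple -> prob) onto `subset`."""
    idx = [variables.index(v) for v in subset]
    out = {}
    for states, p in r.items():
        k = tuple(states[i] for i in idx)
        out[k] = out.get(k, 0.0) + p
    return out

def zero_order_independent(r, variables, a, b, tol=1e-9):
    """Zero-order test: does R(A,B) = R(A) R(B) hold for every pair of states?"""
    rab = marg(r, variables, [a, b])
    ra, rb = marg(r, variables, [a]), marg(r, variables, [b])
    return all(abs(rab[(x, y)] - ra[(x,)] * rb[(y,)]) <= tol for (x, y) in rab)

# R_2(B, C): the test fails, so <R_2, B, C, {}> would be added to DL_2.
R2 = {(0, 0): 0.6, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.1}
print(zero_order_independent(R2, ["B", "C"], "B", "C"))  # -> False
```

The higher-order tests of Eq. (7) follow the same pattern with the table first conditioned on each instantiation of Z.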

5 Overcome Structural Inconsistencies

In this section we focus on modifying the structure of the existing BN to accommodate the identified structural inconsistencies, in a way similar to adding a virtual evidence node in Pearl's virtual evidence method. We add one node to the BN for each structurally inconsistent constraint and make this node the child of all variables of that constraint. We then set the CPT of the node using the likelihood ratio calculated from the constraint via (6), and set the state of the node to true. This guarantees that the variables covered by the constraint are dependent, which overcomes any structural inconsistencies for that constraint. This forms the core of our AddNode method, which can also be combined with E-IPFP to integrate a set of constraints that may contain both structurally consistent and structurally inconsistent constraints. The following algorithm adds nodes for the structurally inconsistent constraints in R_inc and updates the BN with the constraints in R using E-IPFP. The input constraints are applied iteratively during the process. When the algorithm converges, all constraints are satisfied (C1). Moreover, the IPFP-style computation keeps the JPD of the final BN as close to the JPD of the original BN as possible (C2). We are working to formally prove these claims.

Algorithm AddNode+E-IPFP
Input: BN G = (G_S, G_P) with an ordering of the nodes in G_S, and constraint set R = {R_1, R_2, ..., R_m}.
Output: BN G' = (G'_S, G'_P) that satisfies R, with a JPD as close to that of G as possible.
Steps:
1. Run INCIDENT to partition R into R_con and R_inc;
2. For each constraint R_i in R_inc, add a new node V_i to G_S with the variables in R_i as its parents;
3. /* this step is the same as the AddNode method */
   For each constraint R_i in R_inc:
   3.1 Calculate the likelihood ratio for R_i using
       L(Y^i) = R_i(y_(1))/P(y_(1)) : R_i(y_(2))/P(y_(2)) : ... : R_i(y_(l))/P(y_(l)),
       where Y^i is the variable set of R_i and l is the number of distinct instantiations of Y^i;
   3.2 Construct the CPT of V_i from the likelihood ratio L(Y^i);
   3.3 Set the state of V_i to true;
4. Apply one iteration of E-IPFP with R and the updated BN as input, i.e.,
   4.1 Do an I-projection of the current probability distribution P_{k-1}(X) on each constraint R_i in R:
       P_k(X) = P_{k-1}(X) · R_i(Y^i) / P_{k-1}(Y^i),
       where Y^i is the variable set of R_i, and X is the set of all variables in G;
   4.2 Form and apply the structure constraint:
       P_k(X) = Π_{i=1}^{n} P_{k-1}(X_i | π_i), where (X_i, π_i) ∈ G_S;
5. Repeat steps 3 and 4 until the JPD of the updated BN no longer changes;
6. Return G' = (G'_S, G'_P), where G'_S is the updated structure after adding nodes to G_S, and G'_P is the set of CPTs for the nodes in G'_S.

6 Experiments and Results

Our experiments are based on the BN shown in Figure 3 and the constraint set R = {R_1, R_2, R_3}, where R_1(A, C, E) = (0.1368, 0.2232, 0.03, 0.03, 0.1672, 0.2728, 0.07, 0.07), R_2(B, C) = (0.6, 0.1, 0.2, 0.1), and R_3(C, D, E) = (0.228, 0.372, 0.076, 0.124, 0.04, 0.04, 0.06, 0.06).

Figure 3: A 5-Node BN

First we run the INCIDENT algorithm to discover the structural inconsistencies. R_inc = {R_1, R_2}, R_con = {R_3}, and DL = {⟨R_1, A, C, ∅⟩, ⟨R_1, A, E, ∅⟩, ⟨R_1, A, C, {E}⟩, ⟨R_2, B, C, ∅⟩} are returned as the output of INCIDENT. Then we run AddNode+E-IPFP to integrate the constraints in R. The BN structure after adding nodes for the constraints in R_inc is shown in Figure 4.

Figure 4: Modified BN Structure
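Steps 3.1 and 3.2 of AddNode+E-IPFP compute a likelihood ratio via Eq. (6) and install it as the CPT of the new virtual node. A minimal sketch of that conversion (our own helper names; the current marginal P(B, C) below is illustrative, and scaling the largest ratio to 1 is one common convention consistent with Eq. (5), since only the ratios matter):

```python
def likelihood_ratio(constraint, current_marginal):
    """Eq. (6): L(y) = R(y) / P(y) for each instantiation y of the constraint's variables."""
    return {y: constraint[y] / current_marginal[y] for y in constraint}

def virtual_node_cpt(ratio):
    """CPT column P(V = true | y): scale the ratios into [0, 1] (the scale is arbitrary)."""
    top = max(ratio.values())
    return {y: r / top for y, r in ratio.items()}

# Constraint R_2(B, C) from this section against an illustrative current marginal P(B, C).
R2 = {(0, 0): 0.6, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.1}
P_BC = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.2}
L = likelihood_ratio(R2, P_BC)
print(virtual_node_cpt(L))  # P(V = true | b, c), with the largest entry scaled to 1.0
```

Instantiating V to true then biases inference toward joint states of (B, C) that the constraint favors over the current marginal, which is exactly the effect step 3.3 relies on.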

Table 1 gives a performance comparison of the different methods. From the table we can see that E-IPFP-SMOOTH does not add any extra nodes to the existing BN, but its I-aggregate is not 0. AddNode+E-IPFP adds extra nodes to the existing BN, but its I-aggregate is 0. Table 1 also includes results for two more efficient variations of AddNode+E-IPFP. AddNode+Merge first merges the constraints in R_inc into a single constraint using the IPFP algorithm, and then adds a single node to the existing BN for the merged constraint. AddNode+Factorization first decomposes each constraint in R_inc into sub-constraints according to the independencies among its variables, then merges the sub-constraints that are structurally inconsistent with the existing BN, and finally adds a single node to the existing BN for the merged constraint.

Table 1: Experiment Results with the 5-Node BN

To further compare the performance of the different versions of our methods and get a sense of their scalability, we conducted experiments with an artificially composed BN of 10 discrete variables. The input constraint set consists of 3 consistent constraints and 3 structurally inconsistent constraints. The results are given in Table 2 below. AddNode+Merge and AddNode+Factorization outperform AddNode+E-IPFP on the larger BN because these two methods add fewer nodes to the existing BN.

Table 2: Experiment Results with the 10-Node BN

7 Conclusions

In this paper we propose several methods to integrate structurally inconsistent constraints into a BN. The methods first identify the structural inconsistencies between the constraints and the existing BN, and then overcome them by adding nodes to the existing BN in a way similar to the virtual evidence method. Experiments show that the proposed methods have an advantage in incorporating new structural information. We look forward to applying the framework and related methods to update a BN built for a demand-supply network with uncertain demand-supply relations.
The information used to update this BN may require adding new demand-supply relations to the existing BN. This is important in supply chain management for better satisfying customers' needs. Our future work also includes improving the performance of our methods so they can be readily applied to industry problems modeled with large BNs.

References

[1] H. Chan and A. Darwiche. 2005. On the Revision of Probabilistic Beliefs using Uncertain Evidence. Artificial Intelligence.
[2] I. Csiszar. 1975. I-divergence Geometry of Probability Distributions and Minimization Problems. The Annals of Probability, 3(1).
[3] D. Geiger, T. Verma and J. Pearl. 1990. Identifying Independence in Bayesian Networks. Networks, 20.
[4] R. Kruithof. 1937. Telefoonverkeersrekening. De Ingenieur, 52.
[5] S. Kullback and R. A. Leibler. 1951. On Information and Sufficiency. Ann. Math. Stats, 22.
[6] R. Pan, Y. Peng and Z. Ding. 2006. Belief Update in Bayesian Networks Using Uncertain Evidence. In Proceedings of the IEEE International Conference on Tools with Artificial Intelligence.
[7] J. Pearl. 1988. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. San Mateo: Morgan Kaufmann.
[8] J. Pearl. 1990. Jeffrey's Rule, Passage of Experience, and Neo-Bayesianism. In Knowledge Representation and Defeasible Reasoning, H. E. Kyburg, Jr., R. P. Loui, and G. N. Carlson, Eds. Boston: Kluwer Academic Publishers.
[9] Y. Peng and Z. Ding. 2005. Modifying Bayesian Networks by Probability Constraints. In Proc. 21st Conference on Uncertainty in Artificial Intelligence, Edinburgh.
[10] Y. Peng, Z. Ding, S. Zhang and R. Pan. 2012. Bayesian Network Revision with Probabilistic Constraints. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems.
[11] Y. Peng and S. Zhang. 2010. Integrating Probability Constraints into Bayesian Nets. In Proceedings of the 19th European Conference on Artificial Intelligence (ECAI 2010), Lisbon, Portugal.
[12] Y. Peng, S. Zhang and R. Pan. 2010. Bayesian Network Reasoning with Uncertain Evidences. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 18(5).
[13] M. Valtorta, Y. Kim and J. Vomlel. 2002. Soft Evidential Update for Probabilistic Multiagent Systems. International Journal of Approximate Reasoning, 29(1).
[14] J. Vomlel. 1999. Methods of Probabilistic Knowledge Integration. Ph.D. diss., Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University.
[15] J. Vomlel. 2003. Integrating Inconsistent Data in a Probabilistic Model. Journal of Applied Non-Classical Logics, 14(3).
[16] S. Zhang and Y. Peng. 2008. An Efficient Method for Probabilistic Knowledge Integration. In Proceedings of the 20th IEEE International Conference on Tools with Artificial Intelligence (ICTAI-2008), Dayton, Ohio.


More information

A Bayesian Network Approach to Ontology Mapping

A Bayesian Network Approach to Ontology Mapping A Bayesan Network Approach to Ontology Mappng Rong Pan, Zhongl Dng, Yang Yu, and Yun Peng Department of Computer Scence and Electrcal Engneerng Unversty of Maryland, Baltmore County Baltmore, Maryland,

More information

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement Markov Chan Monte Carlo MCMC, Gbbs Samplng, Metropols Algorthms, and Smulated Annealng 2001 Bonformatcs Course Supplement SNU Bontellgence Lab http://bsnuackr/ Outlne! Markov Chan Monte Carlo MCMC! Metropols-Hastngs

More information

Graph Reconstruction by Permutations

Graph Reconstruction by Permutations Graph Reconstructon by Permutatons Perre Ille and Wllam Kocay* Insttut de Mathémathques de Lumny CNRS UMR 6206 163 avenue de Lumny, Case 907 13288 Marselle Cedex 9, France e-mal: lle@ml.unv-mrs.fr Computer

More information

Department of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING

Department of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING MACHINE LEANING Vasant Honavar Bonformatcs and Computatonal Bology rogram Center for Computatonal Intellgence, Learnng, & Dscovery Iowa State Unversty honavar@cs.astate.edu www.cs.astate.edu/~honavar/

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

Volume 18 Figure 1. Notation 1. Notation 2. Observation 1. Remark 1. Remark 2. Remark 3. Remark 4. Remark 5. Remark 6. Theorem A [2]. Theorem B [2].

Volume 18 Figure 1. Notation 1. Notation 2. Observation 1. Remark 1. Remark 2. Remark 3. Remark 4. Remark 5. Remark 6. Theorem A [2]. Theorem B [2]. Bulletn of Mathematcal Scences and Applcatons Submtted: 016-04-07 ISSN: 78-9634, Vol. 18, pp 1-10 Revsed: 016-09-08 do:10.1805/www.scpress.com/bmsa.18.1 Accepted: 016-10-13 017 ScPress Ltd., Swtzerland

More information

The Study of Teaching-learning-based Optimization Algorithm

The Study of Teaching-learning-based Optimization Algorithm Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute

More information

x = , so that calculated

x = , so that calculated Stat 4, secton Sngle Factor ANOVA notes by Tm Plachowsk n chapter 8 we conducted hypothess tests n whch we compared a sngle sample s mean or proporton to some hypotheszed value Chapter 9 expanded ths to

More information

A Note on Bound for Jensen-Shannon Divergence by Jeffreys

A Note on Bound for Jensen-Shannon Divergence by Jeffreys OPEN ACCESS Conference Proceedngs Paper Entropy www.scforum.net/conference/ecea- A Note on Bound for Jensen-Shannon Dvergence by Jeffreys Takuya Yamano, * Department of Mathematcs and Physcs, Faculty of

More information

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models Computaton of Hgher Order Moments from Two Multnomal Overdsperson Lkelhood Models BY J. T. NEWCOMER, N. K. NEERCHAL Department of Mathematcs and Statstcs, Unversty of Maryland, Baltmore County, Baltmore,

More information

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 30 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 2 Remedes for multcollnearty Varous technques have

More information

Chapter - 2. Distribution System Power Flow Analysis

Chapter - 2. Distribution System Power Flow Analysis Chapter - 2 Dstrbuton System Power Flow Analyss CHAPTER - 2 Radal Dstrbuton System Load Flow 2.1 Introducton Load flow s an mportant tool [66] for analyzng electrcal power system network performance. Load

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Chapter 13: Multiple Regression

Chapter 13: Multiple Regression Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to

More information

= z 20 z n. (k 20) + 4 z k = 4

= z 20 z n. (k 20) + 4 z k = 4 Problem Set #7 solutons 7.2.. (a Fnd the coeffcent of z k n (z + z 5 + z 6 + z 7 + 5, k 20. We use the known seres expanson ( n+l ( z l l z n below: (z + z 5 + z 6 + z 7 + 5 (z 5 ( + z + z 2 + z + 5 5

More information

1 Motivation and Introduction

1 Motivation and Introduction Instructor: Dr. Volkan Cevher EXPECTATION PROPAGATION September 30, 2008 Rce Unversty STAT 63 / ELEC 633: Graphcal Models Scrbes: Ahmad Beram Andrew Waters Matthew Nokleby Index terms: Approxmate nference,

More information

Maximizing Overlap of Large Primary Sampling Units in Repeated Sampling: A comparison of Ernst s Method with Ohlsson s Method

Maximizing Overlap of Large Primary Sampling Units in Repeated Sampling: A comparison of Ernst s Method with Ohlsson s Method Maxmzng Overlap of Large Prmary Samplng Unts n Repeated Samplng: A comparson of Ernst s Method wth Ohlsson s Method Red Rottach and Padrac Murphy 1 U.S. Census Bureau 4600 Slver Hll Road, Washngton DC

More information

Chapter Newton s Method

Chapter Newton s Method Chapter 9. Newton s Method After readng ths chapter, you should be able to:. Understand how Newton s method s dfferent from the Golden Secton Search method. Understand how Newton s method works 3. Solve

More information

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,

More information

Finite Element Modelling of truss/cable structures

Finite Element Modelling of truss/cable structures Pet Schreurs Endhoven Unversty of echnology Department of Mechancal Engneerng Materals echnology November 3, 214 Fnte Element Modellng of truss/cable structures 1 Fnte Element Analyss of prestressed structures

More information

arxiv:cs.cv/ Jun 2000

arxiv:cs.cv/ Jun 2000 Correlaton over Decomposed Sgnals: A Non-Lnear Approach to Fast and Effectve Sequences Comparson Lucano da Fontoura Costa arxv:cs.cv/0006040 28 Jun 2000 Cybernetc Vson Research Group IFSC Unversty of São

More information

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem H.K. Pathak et. al. / (IJCSE) Internatonal Journal on Computer Scence and Engneerng Speedng up Computaton of Scalar Multplcaton n Ellptc Curve Cryptosystem H. K. Pathak Manju Sangh S.o.S n Computer scence

More information

Bayesian predictive Configural Frequency Analysis

Bayesian predictive Configural Frequency Analysis Psychologcal Test and Assessment Modelng, Volume 54, 2012 (3), 285-292 Bayesan predctve Confgural Frequency Analyss Eduardo Gutérrez-Peña 1 Abstract Confgural Frequency Analyss s a method for cell-wse

More information

Lecture 4: November 17, Part 1 Single Buffer Management

Lecture 4: November 17, Part 1 Single Buffer Management Lecturer: Ad Rosén Algorthms for the anagement of Networs Fall 2003-2004 Lecture 4: November 7, 2003 Scrbe: Guy Grebla Part Sngle Buffer anagement In the prevous lecture we taled about the Combned Input

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

Finding Dense Subgraphs in G(n, 1/2)

Finding Dense Subgraphs in G(n, 1/2) Fndng Dense Subgraphs n Gn, 1/ Atsh Das Sarma 1, Amt Deshpande, and Rav Kannan 1 Georga Insttute of Technology,atsh@cc.gatech.edu Mcrosoft Research-Bangalore,amtdesh,annan@mcrosoft.com Abstract. Fndng

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Module 9. Lecture 6. Duality in Assignment Problems

Module 9. Lecture 6. Duality in Assignment Problems Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept

More information

Comparative Studies of Law of Conservation of Energy. and Law Clusters of Conservation of Generalized Energy

Comparative Studies of Law of Conservation of Energy. and Law Clusters of Conservation of Generalized Energy Comparatve Studes of Law of Conservaton of Energy and Law Clusters of Conservaton of Generalzed Energy No.3 of Comparatve Physcs Seres Papers Fu Yuhua (CNOOC Research Insttute, E-mal:fuyh1945@sna.com)

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Games of Threats. Elon Kohlberg Abraham Neyman. Working Paper

Games of Threats. Elon Kohlberg Abraham Neyman. Working Paper Games of Threats Elon Kohlberg Abraham Neyman Workng Paper 18-023 Games of Threats Elon Kohlberg Harvard Busness School Abraham Neyman The Hebrew Unversty of Jerusalem Workng Paper 18-023 Copyrght 2017

More information

EGR 544 Communication Theory

EGR 544 Communication Theory EGR 544 Communcaton Theory. Informaton Sources Z. Alyazcoglu Electrcal and Computer Engneerng Department Cal Poly Pomona Introducton Informaton Source x n Informaton sources Analog sources Dscrete sources

More information

HMMT February 2016 February 20, 2016

HMMT February 2016 February 20, 2016 HMMT February 016 February 0, 016 Combnatorcs 1. For postve ntegers n, let S n be the set of ntegers x such that n dstnct lnes, no three concurrent, can dvde a plane nto x regons (for example, S = {3,

More information

On the Repeating Group Finding Problem

On the Repeating Group Finding Problem The 9th Workshop on Combnatoral Mathematcs and Computaton Theory On the Repeatng Group Fndng Problem Bo-Ren Kung, Wen-Hsen Chen, R.C.T Lee Graduate Insttute of Informaton Technology and Management Takmng

More information

Online Appendix to: Axiomatization and measurement of Quasi-hyperbolic Discounting

Online Appendix to: Axiomatization and measurement of Quasi-hyperbolic Discounting Onlne Appendx to: Axomatzaton and measurement of Quas-hyperbolc Dscountng José Lus Montel Olea Tomasz Strzaleck 1 Sample Selecton As dscussed before our ntal sample conssts of two groups of subjects. Group

More information

On a direct solver for linear least squares problems

On a direct solver for linear least squares problems ISSN 2066-6594 Ann. Acad. Rom. Sc. Ser. Math. Appl. Vol. 8, No. 2/2016 On a drect solver for lnear least squares problems Constantn Popa Abstract The Null Space (NS) algorthm s a drect solver for lnear

More information

Bayesian Networks. Course: CS40022 Instructor: Dr. Pallab Dasgupta

Bayesian Networks. Course: CS40022 Instructor: Dr. Pallab Dasgupta Bayesan Networks Course: CS40022 Instructor: Dr. Pallab Dasgupta Department of Computer Scence & Engneerng Indan Insttute of Technology Kharagpur Example Burglar alarm at home Farly relable at detectng

More information

Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive Paul A. Jensen Copyright July 20, 2003 Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.

More information

MODELING TRAFFIC LIGHTS IN INTERSECTION USING PETRI NETS

MODELING TRAFFIC LIGHTS IN INTERSECTION USING PETRI NETS The 3 rd Internatonal Conference on Mathematcs and Statstcs (ICoMS-3) Insttut Pertanan Bogor, Indonesa, 5-6 August 28 MODELING TRAFFIC LIGHTS IN INTERSECTION USING PETRI NETS 1 Deky Adzkya and 2 Subono

More information

A new Approach for Solving Linear Ordinary Differential Equations

A new Approach for Solving Linear Ordinary Differential Equations , ISSN 974-57X (Onlne), ISSN 974-5718 (Prnt), Vol. ; Issue No. 1; Year 14, Copyrght 13-14 by CESER PUBLICATIONS A new Approach for Solvng Lnear Ordnary Dfferental Equatons Fawz Abdelwahd Department of

More information

Foundations of Arithmetic

Foundations of Arithmetic Foundatons of Arthmetc Notaton We shall denote the sum and product of numbers n the usual notaton as a 2 + a 2 + a 3 + + a = a, a 1 a 2 a 3 a = a The notaton a b means a dvdes b,.e. ac = b where c s an

More information

Why BP Works STAT 232B

Why BP Works STAT 232B Why BP Works STAT 232B Free Energes Helmholz & Gbbs Free Energes 1 Dstance between Probablstc Models - K-L dvergence b{ KL b{ p{ = b{ ln { } p{ Here, p{ s the eact ont prob. b{ s the appromaton, called

More information

This article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and

This article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and Ths artcle appeared n a journal publshed by Elsevr. The attached copy s furnshed to the author for nternal non-commercal research and educaton use, ncludng for nstructon at the authors nsttuton and sharng

More information

Gaussian Mixture Models

Gaussian Mixture Models Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous

More information

A Bayesian Network Approach to Ontology Mapping

A Bayesian Network Approach to Ontology Mapping A Bayesan Network Approach to Ontology Mappng Rong Pan, Zhongl Dng, Yang Yu, and Yun Peng Department of Computer Scence and Electrcal Engneerng Unversty of Maryland, Baltmore County Baltmore, Maryland,

More information

Semi-supervised Classification with Active Query Selection

Semi-supervised Classification with Active Query Selection Sem-supervsed Classfcaton wth Actve Query Selecton Jao Wang and Swe Luo School of Computer and Informaton Technology, Beng Jaotong Unversty, Beng 00044, Chna Wangjao088@63.com Abstract. Labeled samples

More information

Engineering Risk Benefit Analysis

Engineering Risk Benefit Analysis Engneerng Rsk Beneft Analyss.55, 2.943, 3.577, 6.938, 0.86, 3.62, 6.862, 22.82, ESD.72, ESD.72 RPRA 2. Elements of Probablty Theory George E. Apostolaks Massachusetts Insttute of Technology Sprng 2007

More information

Identifying Dynamic Sequential Plans

Identifying Dynamic Sequential Plans Identfyng Dynamc Sequental Plans Jn Tan Department of Computer Scence Iowa State Unversty Ames, IA 50011 jtan@cs.astate.edu Abstract We address the problem of dentfyng dynamc sequental plans n the framework

More information

18.1 Introduction and Recap

18.1 Introduction and Recap CS787: Advanced Algorthms Scrbe: Pryananda Shenoy and Shjn Kong Lecturer: Shuch Chawla Topc: Streamng Algorthmscontnued) Date: 0/26/2007 We contnue talng about streamng algorthms n ths lecture, ncludng

More information

Expected Value and Variance

Expected Value and Variance MATH 38 Expected Value and Varance Dr. Neal, WKU We now shall dscuss how to fnd the average and standard devaton of a random varable X. Expected Value Defnton. The expected value (or average value, or

More information

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant

More information

A Bayesian Approach to Uncertainty Modeling in OWL Ontology

A Bayesian Approach to Uncertainty Modeling in OWL Ontology 137-04 1 A Bayesan Approach to Uncertanty Modelng n OWL Ontology Zhongl DING, Yun PENG, ong PAN Abstract-- Dealng wth uncertanty s crucal n ontology engneerng tasks such as doman modelng, ontology reasonng,

More information

Grover s Algorithm + Quantum Zeno Effect + Vaidman

Grover s Algorithm + Quantum Zeno Effect + Vaidman Grover s Algorthm + Quantum Zeno Effect + Vadman CS 294-2 Bomb 10/12/04 Fall 2004 Lecture 11 Grover s algorthm Recall that Grover s algorthm for searchng over a space of sze wors as follows: consder the

More information

Optimal Solution to the Problem of Balanced Academic Curriculum Problem Using Tabu Search

Optimal Solution to the Problem of Balanced Academic Curriculum Problem Using Tabu Search Optmal Soluton to the Problem of Balanced Academc Currculum Problem Usng Tabu Search Lorna V. Rosas-Téllez 1, José L. Martínez-Flores 2, and Vttoro Zanella-Palacos 1 1 Engneerng Department,Unversdad Popular

More information

6. Stochastic processes (2)

6. Stochastic processes (2) Contents Markov processes Brth-death processes Lect6.ppt S-38.45 - Introducton to Teletraffc Theory Sprng 5 Markov process Consder a contnuous-tme and dscrete-state stochastc process X(t) wth state space

More information

Estimation: Part 2. Chapter GREG estimation

Estimation: Part 2. Chapter GREG estimation Chapter 9 Estmaton: Part 2 9. GREG estmaton In Chapter 8, we have seen that the regresson estmator s an effcent estmator when there s a lnear relatonshp between y and x. In ths chapter, we generalzed the

More information

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations Physcs 171/271 -Davd Klenfeld - Fall 2005 (revsed Wnter 2011) 1 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys

More information

The Expectation-Maximization Algorithm

The Expectation-Maximization Algorithm The Expectaton-Maxmaton Algorthm Charles Elan elan@cs.ucsd.edu November 16, 2007 Ths chapter explans the EM algorthm at multple levels of generalty. Secton 1 gves the standard hgh-level verson of the algorthm.

More information

Limited Dependent Variables

Limited Dependent Variables Lmted Dependent Varables. What f the left-hand sde varable s not a contnuous thng spread from mnus nfnty to plus nfnty? That s, gven a model = f (, β, ε, where a. s bounded below at zero, such as wages

More information

6. Stochastic processes (2)

6. Stochastic processes (2) 6. Stochastc processes () Lect6.ppt S-38.45 - Introducton to Teletraffc Theory Sprng 5 6. Stochastc processes () Contents Markov processes Brth-death processes 6. Stochastc processes () Markov process

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

Global Sensitivity. Tuesday 20 th February, 2018

Global Sensitivity. Tuesday 20 th February, 2018 Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values

More information

Hongyi Miao, College of Science, Nanjing Forestry University, Nanjing ,China. (Received 20 June 2013, accepted 11 March 2014) I)ϕ (k)

Hongyi Miao, College of Science, Nanjing Forestry University, Nanjing ,China. (Received 20 June 2013, accepted 11 March 2014) I)ϕ (k) ISSN 1749-3889 (prnt), 1749-3897 (onlne) Internatonal Journal of Nonlnear Scence Vol.17(2014) No.2,pp.188-192 Modfed Block Jacob-Davdson Method for Solvng Large Sparse Egenproblems Hongy Mao, College of

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Singular Value Decomposition: Theory and Applications

Singular Value Decomposition: Theory and Applications Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

ECE 534: Elements of Information Theory. Solutions to Midterm Exam (Spring 2006)

ECE 534: Elements of Information Theory. Solutions to Midterm Exam (Spring 2006) ECE 534: Elements of Informaton Theory Solutons to Mdterm Eam (Sprng 6) Problem [ pts.] A dscrete memoryless source has an alphabet of three letters,, =,, 3, wth probabltes.4,.4, and., respectvely. (a)

More information

Fundamental loop-current method using virtual voltage sources technique for special cases

Fundamental loop-current method using virtual voltage sources technique for special cases Fundamental loop-current method usng vrtual voltage sources technque for specal cases George E. Chatzaraks, 1 Marna D. Tortorel 1 and Anastasos D. Tzolas 1 Electrcal and Electroncs Engneerng Departments,

More information