Sensor localization using nonparametric generalized belief propagation in network with loops


Sensor localization using nonparametric generalized belief propagation in network with loops

Vladimir Savic and Santiago Zazo

Post Print

N.B.: When citing this work, cite the original article.

©2009 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

Vladimir Savic and Santiago Zazo, Sensor localization using nonparametric generalized belief propagation in network with loops, 2009, IEEE Proc. of Intl. Conf. on Information Fusion (FUSION).

Postprint available at: Linköping University Electronic Press

Sensor Localization using Nonparametric Generalized Belief Propagation in Network with Loops

Vladimir Savic
Signal Processing Application Group, Polytechnic University of Madrid
Ciudad Universitaria S/N, 28040 Madrid
vladimir@gaps.ssr.upm.es

Abstract - Belief propagation (BP) is one of the best-known graphical-model methods for inference in statistical physics, artificial intelligence, computer vision, etc. Furthermore, recent research in distributed sensor network localization has shown that BP is an efficient way to obtain sensor locations as well as appropriate uncertainty estimates. However, BP convergence is not guaranteed in a network with loops. In this paper, we propose localization using generalized belief propagation based on the junction tree method (GBP-JT) and a nonparametric (particle-based) approximation of this algorithm (NGBP-JT). We illustrate it in a network with loops where BP shows poor performance. In particular, we compare the estimated locations with the Nonparametric Belief Propagation (NBP) algorithm. According to our simulation results, GBP-JT resolves the problems with loops, but the price for this is an unacceptably large computational cost. Our approximated version of this algorithm, NGBP-JT, reduces this cost significantly, with little effect on accuracy.

Keywords: Localization, generalized belief propagation, junction tree, particle filters, loops.

1 Introduction

Localization consists in obtaining the relative or absolute position of a sensor node together with the uncertainty of its estimate. Equipping every sensor with a GPS receiver or equivalent technology may be expensive, energy prohibitive and limited to outdoor applications. Therefore, we consider the problem in which a small number of sensors, called anchor nodes, obtain their coordinates via GPS or by being installed at points with known coordinates, and the rest, the unknown nodes, must determine their own coordinates. If the unknown nodes were capable of high-power transmission, they would be able to make measurements with all anchor nodes (single-hop technique).
However, we prefer to use energy-conserving devices without power amplifiers, which lack the energy necessary for long-range communication. In this multi-hop case, each sensor has noisy measurements available only to several neighboring sensors.

Santiago Zazo
Signal Processing Application Group, Polytechnic University of Madrid
Ciudad Universitaria S/N, 28040 Madrid
santiago@gaps.ssr.upm.es

A recent direction of research in distributed sensor network localization is the use of particle filters [1, 2]. In [3], Ihler et al. formulated the sensor network localization problem as an inference problem on a graphical model and applied a particle-based variant of belief propagation (BP) methods [4], the so-called nonparametric belief propagation (NBP) algorithm, to obtain an approximate solution for the sensor locations. Compared with deterministic algorithms [5, 6, 7], the main advantages of this statistical approach are its easy implementation in a distributed fashion and the sufficiency of a small number of iterations to converge. Furthermore, NBP is capable of providing information about location estimation uncertainties and accommodating non-Gaussian distance measurement errors. However, NBP convergence is not guaranteed in a network with loops [4, 8], and even if NBP converges, it could provide less accurate estimates. Therefore, in this paper, we present a new variant of the NBP method which solves the problem with loops. We propose localization using generalized belief propagation based on the junction tree (GBP-JT) and a nonparametric (particle-based) approximation of this algorithm (NGBP-JT). The junction tree model is a generalization of belief propagation (BP) that is correct for arbitrary graphs; Jordan proved this using the elimination procedure [9]. Compared with Ihler's Nonparametric Belief Propagation (NBP) algorithm, GBP-JT converges well in networks with loops, but the price for this is an unacceptably large computational cost. Therefore, we implemented an approximated version of this algorithm, NGBP-JT, by drawing high-dimensional particles from the appropriate cliques in the network.
Moreover, in order to draw samples in high-probability areas, we used an improved sampling procedure which utilizes information from the first phase of the algorithm as a prior. This version reduces the computational cost significantly (over 100 times), with little effect on accuracy.

The remainder of this paper is organized as follows. In Section 2, we review standard BP and the condition for its convergence. In Sections 3 and 4, we propose the GBP-JT and NGBP-JT algorithms, respectively. Simulation results are presented in Section 5. Finally, Section 6 provides some conclusions and future work perspectives.

2 Convergence of Belief Propagation

In the standard BP algorithm, the belief at a node i is proportional to the product of the local evidence at that node, ψ_i(x_i), and all the messages coming into node i:

M_i(x_i) = k ψ_i(x_i) ∏_{j∈N(i)} m_ji(x_i)   (1)

where k is a normalization constant and N(i) denotes the neighbors of node i. The messages are determined by the message update rule:

m_ij(x_j) = k Σ_{x_i} ψ_i(x_i) ψ_ij(x_i, x_j) ∏_{k∈N(i)\j} m_ki(x_i)   (2)

where ψ_ij(x_i, x_j) is the pairwise potential between nodes i and j. On the right-hand side, there is a product over all messages going into node i except for the one coming from node j. In practical computation, one starts with the nodes at the edge of the graph, and only computes a message when all the required messages are available. It is easy to see [4] that each message needs to be computed only once for singly connected graphs. That means that the whole computation takes a time proportional to the number of links in the graph, which is dramatically less than the exponentially large time that would be required to compute the marginal probabilities naively. In other words, BP is a way of organizing the "global" computation of marginal beliefs in terms of smaller local computations.

The BP algorithm, defined by equations (1) and (2), makes no reference to the topology of the graph that it is running on. Thus, there is nothing to stop us from implementing it on a graph that has loops. One starts with some initial set of messages, simply iterates the message update rule (2) until the messages eventually converge, and then reads off the approximate beliefs from the belief equation (1). But if we ignore the existence of loops and permit the nodes to continue communicating with each other, messages may circulate indefinitely around these loops, and the process may not converge to a stable equilibrium. One can indeed find examples of graphical models with loops where, for certain parameter values, the BP algorithm fails to converge or predicts beliefs that are inaccurate. On the other hand, the BP algorithm can be successful in graphs with loops, e.g. for error-correcting codes defined on Tanner graphs that have loops [10].
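The update rules (1)-(2) can be sketched for a small discrete model. The 3-node chain and all the potential tables below are illustrative assumptions (not the localization potentials of Section 3); on such a singly connected graph, the BP marginals match the exact brute-force ones:

```python
import numpy as np

# Hedged sketch of eqs. (1)-(2) on a 3-node chain 0-1-2 with binary states.
psi = {i: np.array([0.6, 0.4]) for i in range(3)}          # node potentials
psi_pair = {(0, 1): np.array([[0.9, 0.1], [0.2, 0.8]]),    # pairwise potentials
            (1, 2): np.array([[0.7, 0.3], [0.4, 0.6]])}
neighbors = {0: [1], 1: [0, 2], 2: [1]}

def edge_pot(i, j):
    # psi_ij indexed as [x_i, x_j]
    return psi_pair[(i, j)] if (i, j) in psi_pair else psi_pair[(j, i)].T

msgs = {(i, j): np.ones(2) for i in neighbors for j in neighbors[i]}
for _ in range(5):                         # a tree needs only one sweep each way
    new = {}
    for (i, j) in msgs:
        prod = psi[i].copy()
        for k in neighbors[i]:
            if k != j:
                prod = prod * msgs[(k, i)]
        m = edge_pot(i, j).T @ prod        # eq. (2): sum over x_i
        new[(i, j)] = m / m.sum()          # normalization constant k
    msgs = new

def belief(i):
    b = psi[i].copy()
    for k in neighbors[i]:
        b = b * msgs[(k, i)]
    return b / b.sum()                     # eq. (1)

# Exact marginal of node 0 by brute force, for comparison.
joint = np.einsum('a,b,c,ab,bc->abc', psi[0], psi[1], psi[2],
                  psi_pair[(0, 1)], psi_pair[(1, 2)])
exact0 = joint.sum(axis=(1, 2))
exact0 = exact0 / exact0.sum()
```

On a graph with a loop, the same update loop would simply keep iterating, with the convergence caveats discussed above.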
This can be explained using the Bethe approximation to the "free energy" [4, 8]: the fixed points of the BP algorithm correspond to the stationary points of the Bethe "free energy". To make this more clear, let us define, for a given graphical model, a joint probability function p({x}). If we have some other, approximate joint probability function b({x}), we can define a "distance" between p({x}) and b({x}), called the Kullback-Leibler (KL) distance, by:

D(b({x}) || p({x})) = Σ_{x} b({x}) ln [b({x}) / p({x})]   (3)

The KL distance is useful because it is always non-negative and is zero if and only if the two probability functions p({x}) and b({x}) are equal. Statistical physicists generally assume that Boltzmann's law holds:

p({x}) = (1/Z) e^{−E({x})/T}   (4)

where Z is a normalization constant, and the "temperature" T is just a parameter that defines a scale of units for the "energy" E. For simplicity, we can choose T = 1. Using eqs. (3) and (4), we find for the KL distance:

D(b({x}) || p({x})) = Σ_{x} b({x}) E({x}) + Σ_{x} b({x}) ln b({x}) + ln Z   (5)

So we see that this KL distance will be zero when the approximate probability function b({x}) equals the exact probability function p({x}). The Bethe approximation is the case when the joint belief b({x}) is a function of the single-node beliefs b_i(x_i) and the two-node beliefs b_ij(x_i, x_j). Yedidia et al. proved [4] that for a singly-connected graph, the values of these beliefs that minimize the Bethe free energy correspond to the exact marginal probabilities. For graphs with loops, these beliefs will only be approximations, although many of them are quite good.

3 Localization using Generalized Belief Propagation

Our goal in this section is to develop a new localization algorithm using generalized belief propagation based on the junction tree method (GBP-JT). The junction tree algorithm is a standard method for exact inference in graphical models [9]. The graph is first triangulated (virtual edges are added) so that every loop of length more than three has a chord.
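The two basic properties of the KL distance (3) used above are easy to check numerically; the distributions below are arbitrary illustrative values:

```python
import numpy as np

def kl(b, p):
    """Kullback-Leibler distance of eq. (3), with the convention 0*ln 0 = 0."""
    b = np.asarray(b, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = b > 0
    return float(np.sum(b[mask] * np.log(b[mask] / p[mask])))

b = np.array([0.5, 0.3, 0.2])   # approximate distribution (illustrative)
p = np.array([0.4, 0.4, 0.2])   # exact distribution (illustrative)
```

Here `kl(b, b)` is exactly zero, and `kl(b, p)` is strictly positive whenever the two distributions differ.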
Given a triangulated graph, with cliques C_i and clique potentials ψ_{C_i}(x_{C_i}), and given the corresponding junction tree which defines the links between the cliques, we send the following message from clique C_i to clique C_j using the message update rule:

m_ij(x_{S_ij}) = k Σ_{x_{C_i \ S_ij}} ψ_{C_i}(x_{C_i}) ∏_{k∈N(i)\j} m_ki(x_{S_ki})   (6)

where S_ij = C_i ∩ C_j, and where N(i) are the neighbors of clique C_i in the junction tree. The belief at clique C_i is proportional to the product of the local evidence at that clique and all the messages coming into clique C_i:

Figure 1. Example of a 10-node network with loops

M_{C_i}(x_{C_i}) = k ψ_{C_i}(x_{C_i}) ∏_{j∈N(i)} m_ji(x_{S_ji})   (7)

Beliefs for single nodes can be obtained via further marginalization:

M_i(x_i) = Σ_{x_C \ x_i} M_C(x_C)  for x_i ∈ C   (8)

Equations (6), (7), and (8) represent the generalized belief propagation algorithm, which is valid for arbitrary graphs. The BP algorithm defined by (1) and (2) is a special case of GBP-JT, obtained by noting that the original tree is already triangulated and has only pairs of nodes as cliques. In this case, the sets S_ij are single nodes, and marginalization using eq. (8) is unnecessary.

Let us show how it works in our example in Figure 1. The network has 10 nodes, 5 anchors (nodes 6-10) and 5 unknowns (nodes 1-5). There is a loop 1-2-3-4-5-1, so we have to triangulate it by adding two more edges (1-3 and 1-4). Then we can define 8 cliques in the graph: C_1 = {x_1, x_2, x_3}, C_2 = {x_1, x_3, x_4}, C_3 = {x_1, x_4, x_5}, C_4 = {x_4, x_9}, C_5 = {x_5, x_10}, C_6 = {x_1, x_6}, C_7 = {x_2, x_7}, C_8 = {x_3, x_8}. The appropriate potentials of the 3-node cliques are given by:

ψ_{C_1}(x_1, x_2, x_3) = ψ_12(x_1, x_2) ψ_23(x_2, x_3),
ψ_{C_2}(x_1, x_3, x_4) = ψ_34(x_3, x_4),
ψ_{C_3}(x_1, x_4, x_5) = ψ_45(x_4, x_5) ψ_15(x_1, x_5)   (9)

Note that the virtual edges do not appear in these equations since they are used only to define the cliques. The other cliques, defined over pairs of nodes, are nothing else than the potential functions between two nodes already known from standard BP:

ψ_{C_4}(x_4, x_9) = ψ_49(x_4, x_9), ψ_{C_5}(x_5, x_10) = ψ_{5,10}(x_5, x_10),
ψ_{C_6}(x_1, x_6) = ψ_16(x_1, x_6), ψ_{C_7}(x_2, x_7) = ψ_27(x_2, x_7),
ψ_{C_8}(x_3, x_8) = ψ_38(x_3, x_8)   (10)

The junction tree corresponding to the network in Figure 1 is shown in Figure 2. As we can see, the anchor cliques (C_4 - C_8) do not receive messages, so this graph does not contain loops. Actually, each of these anchor cliques also includes one unknown node, so we could send them messages, but this node can also be located by marginalizing the belief of some other clique. In the next step, we can compute all messages using equation (6).
The complete set of messages is given by:

m_61(x_1) = ψ_16(x_1, x_6*), m_71(x_2) = ψ_27(x_2, x_7*), m_81(x_3) = ψ_38(x_3, x_8*),
m_42(x_4) = ψ_49(x_4, x_9*), m_53(x_5) = ψ_{5,10}(x_5, x_10*),
m_12(x_1, x_3) = Σ_{x_2} ψ_{C_1}(x_1, x_2, x_3) ψ_16(x_1, x_6*) ψ_27(x_2, x_7*) ψ_38(x_3, x_8*),
m_32(x_1, x_4) = Σ_{x_5} ψ_{C_3}(x_1, x_4, x_5) ψ_{5,10}(x_5, x_10*),
m_23(x_1, x_4) = Σ_{x_3} ψ_{C_2}(x_1, x_3, x_4) ψ_49(x_4, x_9*) m_12(x_1, x_3),
m_21(x_1, x_3) = Σ_{x_4} ψ_{C_2}(x_1, x_3, x_4) ψ_49(x_4, x_9*) m_32(x_1, x_4)   (11)

where the asterisk denotes the known location of an anchor node, and the messages from the "anchor cliques" are directly replaced by the appropriate potential functions. The beliefs of the cliques are computed using equation (7):

M_{C_1}(x_1, x_2, x_3) = k ψ_{C_1}(x_1, x_2, x_3) ψ_16(x_1, x_6*) ψ_27(x_2, x_7*) ψ_38(x_3, x_8*) m_21(x_1, x_3),
M_{C_2}(x_1, x_3, x_4) = k ψ_{C_2}(x_1, x_3, x_4) ψ_49(x_4, x_9*) m_12(x_1, x_3) m_32(x_1, x_4),
M_{C_3}(x_1, x_4, x_5) = k ψ_{C_3}(x_1, x_4, x_5) ψ_{5,10}(x_5, x_10*) m_23(x_1, x_4)   (12)

Now it is easy to compute the beliefs of single nodes by marginalizing the beliefs of the cliques using eq. (8). Obviously, it is sufficient to know the beliefs of C_1 and C_3 since these cliques include all the unknown nodes. Marginalization of C_2 provides a degree of freedom and could be used to check the estimated positions of some nodes (in our case, nodes 1, 3 and 4).
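Equations (6)-(8) can be checked numerically on the triangulated loop 1-2-3-4-5-1, using binary states instead of 2D positions and omitting the anchor potentials (both are illustrative simplifications). The junction tree beliefs then reproduce the brute-force marginals exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
def pot():
    # random positive 2x2 pairwise potential (illustrative values)
    return rng.uniform(0.1, 1.0, size=(2, 2))

p12, p23, p34, p45, p15 = pot(), pot(), pot(), pot(), pot()

# Clique potentials of eq. (9); the virtual edges 1-3 and 1-4 carry no potential.
C1 = np.einsum('ab,bc->abc', p12, p23)   # psi_C1(x1,x2,x3), axes (a,b,c)
C3 = np.einsum('de,ae->ade', p45, p15)   # psi_C3(x1,x4,x5), axes (a,d,e)
# psi_C2(x1,x3,x4) = psi_34(x3,x4) is just p34, constant in x1.

# Messages along the junction tree C1 - C2 - C3, eq. (6):
m12 = C1.sum(axis=1)                     # sum over x2 -> m(x1,x3)
m32 = C3.sum(axis=2)                     # sum over x5 -> m(x1,x4)
m21 = np.einsum('cd,ad->ac', p34, m32)   # sum over x4 -> m(x1,x3)

# Belief of C1, eq. (7), and the single-node marginal of x2, eq. (8):
M1 = np.einsum('abc,ac->abc', C1, m21)
bx2 = M1.sum(axis=(0, 2))
bx2 = bx2 / bx2.sum()

# Brute force over the full 5-node joint, for verification:
joint = np.einsum('ab,bc,cd,de,ae->abcde', p12, p23, p34, p45, p15)
ex2 = joint.sum(axis=(0, 2, 3, 4))
ex2 = ex2 / ex2.sum()
```

The agreement is exact (up to floating point), which is precisely the property that plain BP loses on this loop.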

Figure 2. The junction tree corresponding to the network in Figure 1

Finally, in order to use this method for localization, we have to define the potential functions. In our case, we assumed that no a priori information about the node positions was available, so the single-node potentials are equal to 1 (otherwise, the beliefs computed using eq. (7) have to be multiplied by their own potentials). The pairwise potential between nodes t and u is given by [3]:

ψ_tu(x_t, x_u) = P_d(x_t, x_u) p_v(d_tu − ||x_t − x_u||),  if o_tu = 1,
ψ_tu(x_t, x_u) = 1 − P_d(x_t, x_u),  otherwise   (13)

where P_d is the probability of detecting nearby sensors; in our case we used an improved model which assumes that the probability of detecting nearby sensors falls off exponentially with the squared distance:

P_d(x_t, x_u) = exp(−||x_t − x_u||² / (2R²))   (14)

where R is the transmission radius. The binary variable o_tu indicates whether this observation is available (o_tu = 1) or not (o_tu = 0). The last remaining parameter is the measured distance. The unknown node t obtains a noisy measurement d_tu of its distance from a detected node u:

d_tu = ||x_t − x_u|| + v_tu,  v_tu ~ p_v(·)   (15)

In our case, we used a Gaussian distribution for p_v, but, as we can see, it is very easy to change it to any desired distribution (e.g. one obtained by running a training experiment in the deployment area).

The proposed GBP-JT algorithm is not unique. There are many variations of this method; the best known is the cluster variation method [8]. However, it can be shown that they are quite similar; for example, [8] describes the relationship between different region-based approximations. The main goal is achieved in all of them: the estimated beliefs are correct in networks with loops. However, the price for this is an unacceptably large computational cost, so we are going to implement an approximated version of the GBP-JT algorithm.

4 Nonparametric Generalized Belief Propagation

In order to obtain an acceptable spatial resolution for the unknown nodes, the number of discrete points in the deployment area (e.g. N_x × N_y for a 2D grid) becomes too large for GBP to be computationally feasible [3].
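A direct transcription of the pairwise potential in eqs. (13)-(15) might look as follows; the Gaussian p_v and the default values of R and sigma are assumptions for illustration:

```python
import numpy as np

def pairwise_potential(xt, xu, d_tu, o_tu, R=0.71, sigma=0.1):
    """Sketch of the pairwise potential of eq. (13).

    xt, xu : candidate 2D positions of nodes t and u
    d_tu   : measured distance (only used when o_tu is True)
    o_tu   : whether the observation is available
    R, sigma : assumed transmission radius and noise deviation
    """
    dist = np.linalg.norm(np.asarray(xt, float) - np.asarray(xu, float))
    p_detect = np.exp(-dist**2 / (2.0 * R**2))                 # eq. (14)
    if o_tu:
        # Gaussian measurement noise p_v, eq. (15)
        p_v = np.exp(-(d_tu - dist)**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
        return p_detect * p_v
    return 1.0 - p_detect
```

The observed branch rewards position pairs whose distance matches the measurement, while the unobserved branch penalizes pairs that are close enough that a measurement should have been made.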
Besides, the presence of nonlinear relationships and potentially highly non-Gaussian uncertainties in sensor localization makes GBP undesirable. However, using particle-based representations via nonparametric generalized belief propagation (NGBP) enables the application of GBP to inference in sensor networks. In this section we propose NGBP-JT, a particle-based approximation of the GBP-JT method, for the same example network from the previous section (Figures 1 and 2).

4.1 Drawing initial particles

Let us draw i = 1, ..., N weighted particles from cliques C_1 and C_3:

{W_1^i, X_1^i} = {W_1^i, [X_{1,1}^i, X_{1,2}^i, X_{1,3}^i]},
{W_3^i, X_3^i} = {W_3^i, [X_{3,1}^i, X_{3,4}^i, X_{3,5}^i]}   (16)

where W_m^i represents the weight of the 6-dimensional particle X_m^i from clique C_m, which consists of three 2-dimensional particles X_{m,t}^i, one per node t. For now, we do not need particles from clique C_2. There are many ways to draw these particles. In general, we can draw all particles uniformly within the deployment area, but this requires

a significantly larger number of particles (e.g. 100 particles drawn for each node corresponds to 100³ = 10^6 particles for its clique). Therefore, we immediately include all the information available within the clique: the potential functions given by (9) and (13), which represent our information about the distances between the nodes within the clique. First, we draw the particles of a node t uniformly within the deployment area. To draw a particle of any neighboring node u, we shift the particle of node t in a random direction by an amount which represents the observed distance between these two nodes:

X_{m,u}^i = X_{m,t}^i + (d_tu + v^i) [cos(θ^i), sin(θ^i)],  θ^i ~ Unif[0, 2π),  i = 1, ..., N   (17)

We will use a simplified notation for the above equation:

X_{m,u}^i = shift(X_{m,t}^i, d_tu)   (18)

Assuming that we have already drawn samples, e.g. for nodes 1 and 5, we can compute the particles of the other nodes:

X_{1,2}^i = shift(X_{1,1}^i, d_12), X_{1,3}^i = shift(X_{1,2}^i, d_23),
X_{3,4}^i = shift(X_{3,5}^i, d_45), X_{3,1}^i = shift(X_{3,5}^i, d_15)   (19)

4.2 Computing messages

Having drawn all the particles, we can now compute all the messages. Messages m_23 and m_21 are functions of m_12 and m_32, respectively (see eqs. (11)), so they will be computed later. Also, the messages from the anchor cliques are directly replaced with the appropriate potential functions. So we start with the messages m_12 and m_32, which depend on ψ_{C_1} and ψ_{C_3}, from which we have already drawn particles. Let us represent these two messages in a slightly different form:

m_12(x_1, x_3) = Σ_{x_2} M_12(x_1, x_2, x_3),
M_12(x_1, x_2, x_3) = ψ_{C_1}(x_1, x_2, x_3) ψ_16(x_1, x_6*) ψ_27(x_2, x_7*) ψ_38(x_3, x_8*)   (20)

m_32(x_1, x_4) = Σ_{x_5} M_32(x_1, x_4, x_5),
M_32(x_1, x_4, x_5) = ψ_{C_3}(x_1, x_4, x_5) ψ_{5,10}(x_5, x_10*)   (21)

The factors M_12 and M_32 are a kind of unmarginalized message, so we will call them joint messages. Now it is very easy to compute weighted particles from these joint messages:

X_1^i = [X_{1,1}^i, X_{1,2}^i, X_{1,3}^i],
W_1^i = ψ_16(X_{1,1}^i, x_6*) ψ_27(X_{1,2}^i, x_7*) ψ_38(X_{1,3}^i, x_8*) W^i   (22)

Since these particles are drawn from ψ_{C_1} and ψ_{C_3}, respectively, and we have already included all the information which places them in high-probability regions with respect to X_{1,1}^i and X_{3,5}^i (see eqs. (17) and (19)), the draw weights can all be approximated with the same value:

W^i = 1/N,  i = 1, ..., N   (23)

Note that all the particles of the nodes within a clique share one common weight, e.g.
{W_1^i, [X_{1,1}^i, X_{1,2}^i, X_{1,3}^i]}. Our initial set of particles for clique C_1 is illustrated in Figure 3. Analogously, for clique C_3:

X_3^i = [X_{3,1}^i, X_{3,4}^i, X_{3,5}^i],
W_3^i = ψ_{5,10}(X_{3,5}^i, x_10*) W^i   (24)

Before computing the final messages, we noticed the usual problem of sample depletion [1]: one, or a few, of the weights are much larger than the rest. This means that any sample-based estimate will be unduly dominated by the influence of just a few particles. In our case this is expected, because we are working in a 6-dimensional space where it is very hard to draw a good sample (a clique with position and shape similar to the right one; see Figure 3). Therefore, we resample with replacement [1, 2], which produces N equal-weight particles (W_m^i = 1/N). In our case, we have to resample from cliques, so the easiest way is to resample from single nodes using the standard resampling procedure [1], and then to synchronize the indexes in order to keep the original shapes of the particles. This procedure is illustrated for M_12 by the following pseudocode:

Figure 3. Initial set of particles for clique C_1

[W_1, X_{1,1}, index] = resample(W_1, X_{1,1});
for i = 1 : N
    X_{1,2}^i = X_{1,2}^{index(i)};  X_{1,3}^i = X_{1,3}^{index(i)};
end   (25)

Note the difference between X_1^i and X_{1,1}^i!
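The shift operation of eqs. (17)-(19) can be sketched as follows; the distances, area size, and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def shift(x, d, sigma=0.1):
    """Eqs. (17)-(18): move each particle in x by the measured distance d
    (perturbed by assumed Gaussian noise, cf. eq. (15)) in a uniformly
    random direction."""
    n = x.shape[0]
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    r = d + sigma * rng.standard_normal(n)
    return x + np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

# Clique C1: draw node 1 uniformly, then nodes 2 and 3 via eq. (19).
N = 1000
X11 = rng.uniform(0.0, 2.0, size=(N, 2))   # assumed square deployment area
X12 = shift(X11, d=0.4)                    # d_12 = 0.4 (illustrative)
X13 = shift(X12, d=0.3)                    # d_23 = 0.3 (illustrative)
```

Each row of `X11`, `X12`, `X13` taken together is one 6-dimensional clique particle whose node-to-node distances already match the measurements up to noise.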

where {w_12^i, X_{1,1}^i} is the vector of N particles for node 1 (part of the joint message), and index is the vector of the old (pre-resampling) indexes of the new particles. Now we are ready to compute the particles of the messages, {w_mn^i, x_mn^i}. The marginalization of the joint messages is very easy since we already have weighted particles from them: we just need to discard one component and keep the same weights. Thus, they are given by:

x_12^i = [X_{1,1}^i, X_{1,3}^i],  w_12^i = W_1^i = 1/N,
x_32^i = [X_{3,1}^i, X_{3,4}^i],  w_32^i = W_3^i = 1/N   (26)

Finally, we can compute the particles of the other two messages, m_23 and m_21. According to eqs. (11), they are functions of ψ_{C_2} m_12 and ψ_{C_2} m_32, respectively, so we will draw particles from these products and then re-weight by the remainder of eq. (11). Actually, two of the single-node particles of messages m_23 and m_21 are already computed (see eqs. (26)), so we just have to draw the missing particle using the information from ψ_{C_2}, the observed distance between nodes 3 and 4. The result of this procedure are the particles of the joint messages M_23 and M_21. Marginalizing them, we obtain the final messages m_23 and m_21. The complete procedure is given as follows:

X^i = [x_12^i(1), x_12^i(2), shift(x_12^i(2), d_34)];
W^i = ψ_49(X^i(3), x_9*) w_12^i;
resample and synchronize;
x_23^i = [X^i(1), X^i(3)],  w_23^i = W^i = 1/N   (27)

X^i = [x_32^i(1), shift(x_32^i(2), d_34), x_32^i(2)];
W^i = ψ_49(x_32^i(2), x_9*) w_32^i;
resample and synchronize;
x_21^i = [X^i(1), X^i(2)],  w_21^i = W^i = 1/N   (28)

4.3 Computing final beliefs

To estimate the beliefs of the unknown nodes, we compute the beliefs of the cliques using the already computed particles of the messages. According to eqs. (12), the beliefs M_{C_1}, M_{C_2} and M_{C_3} are functions of ψ_{C_1} m_21, ψ_{C_2} m_12 m_32 and ψ_{C_3} m_23, respectively, so we will draw particles from these products and then re-weight by the remainder of eqs. (12). Let us start with M_{C_1} and its corresponding product ψ_{C_1} m_21. As we can see in eqs. (9), ψ_{C_1} includes information about the distance between nodes 1 and 2, as well as between nodes 2 and 3. Besides, the message m_21 includes information about the positions of nodes 1 and 3.
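The resampling-and-synchronization step of eq. (25) above can be sketched as follows (multinomial resampling is one standard choice [1]; the random weights and positions are placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)

def resample(w, x):
    """Resample with replacement: returns equal weights, the resampled
    particles, and the chosen old indexes for synchronization."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    idx = rng.choice(len(w), size=len(w), p=w)
    return np.full(len(w), 1.0 / len(w)), x[idx], idx

# Resample node 1's particles, then synchronize nodes 2 and 3 so that
# each clique particle keeps its original shape (eq. (25)).
N = 500
W1 = rng.uniform(size=N)
X11, X12, X13 = (rng.uniform(size=(N, 2)) for _ in range(3))
W1, X11, idx = resample(W1, X11)
X12, X13 = X12[idx], X13[idx]
```

Reusing the same `idx` for the other two nodes is exactly what keeps the 6-dimensional clique particles intact.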
So we just need to locate node 2, using the available positions and distances. This could be done geometrically by intersecting circles, but we prefer a statistical approach, which is considerably faster. It is done by the following pseudocode:

for i = 1 : N
    X_1^i = x_21^i(1);  X_3^i = x_21^i(2);  k = 1;
    while (k < k_max)
        X_2^i = shift(X_1^i, d_12);
        if abs(X_2^i − X_3^i) ∈ (d_23 − ε, d_23 + ε), break; end
        k = k + 1;
    end
    W^i = ψ_16(X_1^i, x_6*) ψ_27(X_2^i, x_7*) ψ_38(X_3^i, x_8*) w_21^i;
end   (29)

where abs(X_2^i − X_3^i) is the estimated distance between these two particles, and ε is a predefined tolerance. If we do not obtain a good particle after k_max iterations, it means that the two circles cannot intersect, so our particle is simply the position shifted by d_12 in a random direction. This is not a problem, because such a wrong particle will obviously receive a very small weight later (it is filtered out by the potential functions from the anchors). The other problem is bimodality, when the circles intersect at two points; but the wrong particle will also be filtered out in the same way.

The same procedure is done for M_{C_3}, since we have the distance between nodes 5 and 4, as well as between nodes 5 and 1, and the message m_23, which includes information about the positions of nodes 1 and 4.

As we already mentioned, the belief M_{C_2} is not necessary, since the other two cliques include the positions of all the unknown nodes. Anyway, we will show the procedure because it is slightly different. We have to draw particles from the product of two 4-dimensional messages, and as a result we expect 6-dimensional particles. So, if we want to avoid drawing the missing particles randomly (e.g. for the message m_12(x_1, x_3), we would have to draw single-node particles for x_4), we will directly draw particles from the product ψ_{C_2}(x_1, x_3, x_4) m_12(x_1, x_3) m_32(x_1, x_4) and then re-weight by the remainder of eq. (12). The following procedure shows it:

X'^i = [x_12^i(1), x_12^i(2), shift(x_12^i(2), d_34)];
X''^i = [x_32^i(1), shift(x_32^i(2), d_34), x_32^i(2)];
X_2^i = choose(X' ∪ X'', N);  W_2^i = 1/N;
W_2^i = W_2^i m_12(X_{2,1}^i, X_{2,3}^i) m_32(X_{2,1}^i, X_{2,4}^i) / (m_12(X_{2,1}^i, X_{2,3}^i) + m_32(X_{2,1}^i, X_{2,4}^i));
W_2^i = ψ_49(X_{2,4}^i, x_9*) W_2^i   (30)

where the function choose(X' ∪ X'', N) chooses N particles at random from the 2N available ones. This procedure is known as importance sampling [1]: the original distribution (m_12 m_32) is approximated with a proposal distribution (m_12 + m_32) from which it is easy to draw samples (X' and X''), followed by re-weighting (m_12 m_32 / (m_12 + m_32)) to compensate for the error. For simplicity, the updated and the old particles are denoted by the same symbols.
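A runnable version of the rejection loop in eq. (29) might look as follows; the tolerance, iteration cap, and test geometry are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def locate_third(x1, x3, d12, d23, eps=0.02, k_max=2000):
    """Draw x2 at distance d12 from x1 in a random direction until its
    distance to x3 falls within (d23 - eps, d23 + eps); after k_max
    failures, keep the last draw (it will later receive a negligible
    weight from the anchor potentials)."""
    x1 = np.asarray(x1, dtype=float)
    x3 = np.asarray(x3, dtype=float)
    for _ in range(k_max):
        theta = rng.uniform(0.0, 2.0 * np.pi)
        x2 = x1 + d12 * np.array([np.cos(theta), np.sin(theta)])
        if abs(np.linalg.norm(x2 - x3) - d23) <= eps:
            break
    return x2
```

For positions whose circles do intersect, the loop almost always terminates early; the bimodal case (two intersection points) is left to the anchor potentials, as discussed above.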

Figure 4. Comparison of the results for a 10-node network: (a) NBP, (b) NGBP-JT

The final estimated positions of the unknown nodes are given by the mean values of the particles from the corresponding clique m:

x_u^est = Σ_{i=1}^N W_m^i X_{m,u}^i / Σ_{i=1}^N W_m^i   (31)

4.4 Improved sampling procedure

There is one important modification to this algorithm that can significantly reduce the initial number of particles. As we already mentioned, if we draw N_1 particles for one node, this generally corresponds to N = N_1³ particles for a 3-node clique. However, since we included the distance information, the new number for the same clique is N = N_1 N_θ, where N_θ represents the number of possible θ angles. But this number is still very large, so we would like to include additional information. We assumed that there is no a priori information about the node positions. However, after the very first phase of the algorithm, we have computed the joint messages M_12 and M_32, which include the current information about the positions of cliques C_1 and C_3. At this point, the particles are concentrated in a smaller region (except for very few of them), so we can draw a new set of particles around the single-node particles of the joint messages. For C_1, this is done by the following procedure:

d^i = Unif(0, r);
X_{1,1}^i = shift(X_{1,1}^i, d^i);  X_{1,2}^i = shift(X_{1,2}^i, d^i);  X_{1,3}^i = shift(X_{1,3}^i, d^i);
W^i = 1/N;
compute the messages again   (32)

where r is the radius of the deployment area of the new particles. Computing the messages again is very important since we have drawn a new set of particles, which means that we have to run the algorithm from the beginning. Of course, for clique C_3, we use the analogous procedure. This improved procedure allows us to decrease the initial number of samples to N = N_1 N_θ / n, where n is a reducing factor that can be found experimentally. Theoretically, it is proportional to the ratio of the new to the old deployment area.

5 Simulation Results

We simulated the network from Figure 1 using the NBP, GBP-JT and NGBP-JT algorithms. We placed 10 nodes in a 2m × 2m area: 5 anchors and 5 unknowns. We set the transmission radius to R = 25% of the diagonal length of the deployment area, and the standard deviation of the measured distance to σ = 0.1 m = 14% R.
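The position estimator of eq. (31) above is a simple weighted mean; the weights and particles below are illustrative:

```python
import numpy as np

def estimate_position(W, X_u):
    """Eq. (31): weighted mean of node u's 2D particles from a clique."""
    W = np.asarray(W, dtype=float)
    X_u = np.asarray(X_u, dtype=float)
    return (W[:, None] * X_u).sum(axis=0) / W.sum()

W = np.array([0.2, 0.3, 0.5])                        # particle weights
X = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])   # particles of node u
est = estimate_position(W, X)                        # -> array([0.8, 0.5])
```

The spread of the particles around this mean is what provides the uncertainty contours shown in Figure 4.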
The number of iterations for NBP is set to the length of the longest path in the graph (N_iter = 5), while for GBP-JT/NGBP-JT there is, obviously, just one iteration in our example. The number of particles for NBP is set to N = 400, so the corresponding number of grid points for GBP-JT is N_G = 20 × 20. For NGBP-JT we used the improved sampling procedure from Section 4.4, and experimentally found a reducing factor which does not change the accuracy (n = 4); approximating the minimum number of possible angles by N_θ, we set the number of clique particles to N = N_1 N_θ / n. We ran the simulation for NBP and NGBP-JT, and obtained the results shown in Figure 4. Obviously, the location estimates of NGBP-JT are more accurate, since this algorithm is correct for networks with loops. The NBP algorithm does not converge well for a few nodes, although for some other parameter values, or with different positions of some nodes, it provides estimates with almost the same accuracy as NGBP-JT. However, comparing the uncertainties of NBP and NGBP-JT (the contours in Figures 4a and 4b), we can see that NBP provides us better guarantees of its estimate.

Figure 5. Comparison of accuracy: average error [%R] versus distance deviation [%R] for NBP (3.8 MFlops), NGBP-JT (7.89 MFlops) and GBP-JT

Finally, we checked the average accuracy with respect to the deviation of the measured distance for all three methods (Figure 5). The accuracy of GBP-JT is always higher than the accuracy of NBP and NGBP-JT. NGBP-JT provides better accuracy than NBP for the usual values of the distance deviation (e.g. for measurements using time of arrival, the error is 5-10 %R [11]), and, unexpectedly, worse accuracy for higher values of this deviation. Anyway, this accuracy could be increased, using a larger number of particles (e.g. by increasing N_θ), down to the bottom line defined by the accuracy of GBP-JT. Compared with NBP/NGBP-JT, the computational cost of GBP-JT is, of course, very large and absolutely unacceptable. The nonparametric approximation of this algorithm decreased it around 25 times, and the improved sampling by an additional 4 times. So the final computational cost of NGBP-JT in the simulated example is 7.89 MFlops, just double that of NBP (3.8 MFlops).

6 Conclusions

As presented in this article, the junction tree model is a generalization of belief propagation that is correct for arbitrary graphs. We proposed localization using generalized belief propagation based on the junction tree method, and a nonparametric approximation of this algorithm (NGBP-JT). Our main goal was to solve the problem with loops at an acceptable computational cost, and we achieved it using the NGBP-JT approach. We can conclude that this algorithm can provide higher accuracy with an acceptable computational cost. The main open direction for future work is generalizing this algorithm to an ad-hoc network. This would probably require the additional computation necessary for the construction of the junction tree cliques within the network. Moreover, the communication cost has to be considered, since it is obvious that we cannot exchange thousands of particles without any compression. This will be a part of our future research.
Acknowledgment

This work has been performed in the framework of the ICT project ICT-217033 WHERE, which is partly funded by the European Union and partly by the Spanish Education and Science Ministry under Grant TEC.../TCM. Furthermore, we acknowledge partial support by the program CONSOLIDER-INGENIO 2010 COMONSENS.

References

[1] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002.

[2] P. M. Djuric, J. H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. F. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38, September 2003.

[3] A. T. Ihler, J. W. Fisher III, R. L. Moses, and A. S. Willsky, "Nonparametric Belief Propagation for Self-Localization of Sensor Networks," IEEE Journal on Selected Areas in Communications, vol. 23, no. 4, pp. 809-819, April 2005.

[4] J. S. Yedidia, W. T. Freeman, and Y. Weiss, "Understanding belief propagation and its generalizations," in Exploring Artificial Intelligence in the New Millennium, ACM, pp. 239-269, 2003.

[5] D. Niculescu and B. Nath, "Ad hoc positioning system (APS)," in Proc. IEEE GLOBECOM, vol. 5, pp. 2926-2931, November 2001.

[6] Y. Shang, W. Ruml, Y. Zhang, and M. Fromherz, "Localization from Connectivity in Sensor Networks," IEEE Transactions on Parallel and Distributed Systems, vol. 15, no. 11, pp. 961-974, November 2004.

[7] A. Savvides, H. Park, and M. B. Srivastava, "The Bits and Flops of the N-hop Multilateration Primitive for Node Localization Problems," in International Workshop on Sensor Networks and Applications, pp. 112-121, September 2002.

[8] J. S. Yedidia, W. T. Freeman, and Y. Weiss, "Constructing Free-Energy Approximations and Generalized Belief Propagation Algorithms," IEEE Transactions on Information Theory, vol. 51, no. 7, pp. 2282-2312, July 2005.

[9] M. I. Jordan and Y. Weiss, "Graphical models: Probabilistic inference," in The Handbook of Brain Theory and Neural Networks, 2nd edition, Cambridge, MA: MIT Press, 2002.

[10] B. J. Frey, "A revolution: Belief propagation in graphs with cycles," Adv.
in Neural Information Processing Systems, vol. 10, MIT Press, 1998.

[11] N. Patwari, J. N. Ash, S. Kyperountas, A. O. Hero III, R. L. Moses, and N. S. Correal, "Locating the nodes," IEEE Signal Processing Magazine, vol. 22, no. 4, pp. 54-69, July 2005.


More information

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009 College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:

More information

Uncertainty as the Overlap of Alternate Conditional Distributions

Uncertainty as the Overlap of Alternate Conditional Distributions Uncertanty as the Overlap of Alternate Condtonal Dstrbutons Olena Babak and Clayton V. Deutsch Centre for Computatonal Geostatstcs Department of Cvl & Envronmental Engneerng Unversty of Alberta An mportant

More information

AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING

AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING Qn Wen, Peng Qcong 40 Lab, Insttuton of Communcaton and Informaton Engneerng,Unversty of Electronc Scence and Technology

More information

Grover s Algorithm + Quantum Zeno Effect + Vaidman

Grover s Algorithm + Quantum Zeno Effect + Vaidman Grover s Algorthm + Quantum Zeno Effect + Vadman CS 294-2 Bomb 10/12/04 Fall 2004 Lecture 11 Grover s algorthm Recall that Grover s algorthm for searchng over a space of sze wors as follows: consder the

More information

/ n ) are compared. The logic is: if the two

/ n ) are compared. The logic is: if the two STAT C141, Sprng 2005 Lecture 13 Two sample tests One sample tests: examples of goodness of ft tests, where we are testng whether our data supports predctons. Two sample tests: called as tests of ndependence

More information

VQ widely used in coding speech, image, and video

VQ widely used in coding speech, image, and video at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

Physical Fluctuomatics Applied Stochastic Process 9th Belief propagation

Physical Fluctuomatics Applied Stochastic Process 9th Belief propagation Physcal luctuomatcs ppled Stochastc Process 9th elef propagaton Kazuyuk Tanaka Graduate School of Informaton Scences Tohoku Unversty kazu@smapp.s.tohoku.ac.jp http://www.smapp.s.tohoku.ac.jp/~kazu/ Stochastc

More information

Homework Assignment 3 Due in class, Thursday October 15

Homework Assignment 3 Due in class, Thursday October 15 Homework Assgnment 3 Due n class, Thursday October 15 SDS 383C Statstcal Modelng I 1 Rdge regresson and Lasso 1. Get the Prostrate cancer data from http://statweb.stanford.edu/~tbs/elemstatlearn/ datasets/prostate.data.

More information

Chapter Newton s Method

Chapter Newton s Method Chapter 9. Newton s Method After readng ths chapter, you should be able to:. Understand how Newton s method s dfferent from the Golden Secton Search method. Understand how Newton s method works 3. Solve

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method Appled Mathematcal Scences, Vol. 7, 0, no. 47, 07-0 HIARI Ltd, www.m-hkar.com Comparson of the Populaton Varance Estmators of -Parameter Exponental Dstrbuton Based on Multple Crtera Decson Makng Method

More information

Gaussian Mixture Models

Gaussian Mixture Models Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

Quantifying Uncertainty

Quantifying Uncertainty Partcle Flters Quantfyng Uncertanty Sa Ravela M. I. T Last Updated: Sprng 2013 1 Quantfyng Uncertanty Partcle Flters Partcle Flters Appled to Sequental flterng problems Can also be appled to smoothng problems

More information

4DVAR, according to the name, is a four-dimensional variational method.

4DVAR, according to the name, is a four-dimensional variational method. 4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Speech and Language Processing

Speech and Language Processing Speech and Language rocessng Lecture 3 ayesan network and ayesan nference Informaton and ommuncatons Engneerng ourse Takahro Shnozak 08//5 Lecture lan (Shnozak s part) I gves the frst 6 lectures about

More information

Hidden Markov Models

Hidden Markov Models CM229S: Machne Learnng for Bonformatcs Lecture 12-05/05/2016 Hdden Markov Models Lecturer: Srram Sankararaman Scrbe: Akshay Dattatray Shnde Edted by: TBD 1 Introducton For a drected graph G we can wrte

More information

Week3, Chapter 4. Position and Displacement. Motion in Two Dimensions. Instantaneous Velocity. Average Velocity

Week3, Chapter 4. Position and Displacement. Motion in Two Dimensions. Instantaneous Velocity. Average Velocity Week3, Chapter 4 Moton n Two Dmensons Lecture Quz A partcle confned to moton along the x axs moves wth constant acceleraton from x =.0 m to x = 8.0 m durng a 1-s tme nterval. The velocty of the partcle

More information

Power law and dimension of the maximum value for belief distribution with the max Deng entropy

Power law and dimension of the maximum value for belief distribution with the max Deng entropy Power law and dmenson of the maxmum value for belef dstrbuton wth the max Deng entropy Bngy Kang a, a College of Informaton Engneerng, Northwest A&F Unversty, Yanglng, Shaanx, 712100, Chna. Abstract Deng

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

UNIVERSITY OF TORONTO Faculty of Arts and Science. December 2005 Examinations STA437H1F/STA1005HF. Duration - 3 hours

UNIVERSITY OF TORONTO Faculty of Arts and Science. December 2005 Examinations STA437H1F/STA1005HF. Duration - 3 hours UNIVERSITY OF TORONTO Faculty of Arts and Scence December 005 Examnatons STA47HF/STA005HF Duraton - hours AIDS ALLOWED: (to be suppled by the student) Non-programmable calculator One handwrtten 8.5'' x

More information

Statistical analysis using matlab. HY 439 Presented by: George Fortetsanakis

Statistical analysis using matlab. HY 439 Presented by: George Fortetsanakis Statstcal analyss usng matlab HY 439 Presented by: George Fortetsanaks Roadmap Probablty dstrbutons Statstcal estmaton Fttng data to probablty dstrbutons Contnuous dstrbutons Contnuous random varable X

More information

Ensemble Methods: Boosting

Ensemble Methods: Boosting Ensemble Methods: Boostng Ncholas Ruozz Unversty of Texas at Dallas Based on the sldes of Vbhav Gogate and Rob Schapre Last Tme Varance reducton va baggng Generate new tranng data sets by samplng wth replacement

More information

Tracking with Kalman Filter

Tracking with Kalman Filter Trackng wth Kalman Flter Scott T. Acton Vrgna Image and Vdeo Analyss (VIVA), Charles L. Brown Department of Electrcal and Computer Engneerng Department of Bomedcal Engneerng Unversty of Vrgna, Charlottesvlle,

More information

Copyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for P Charts. Dr. Wayne A. Taylor

Copyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for P Charts. Dr. Wayne A. Taylor Taylor Enterprses, Inc. Control Lmts for P Charts Copyrght 2017 by Taylor Enterprses, Inc., All Rghts Reserved. Control Lmts for P Charts Dr. Wayne A. Taylor Abstract: P charts are used for count data

More information

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud Resource Allocaton wth a Budget Constrant for Computng Independent Tasks n the Cloud Wemng Sh and Bo Hong School of Electrcal and Computer Engneerng Georga Insttute of Technology, USA 2nd IEEE Internatonal

More information

AP Physics 1 & 2 Summer Assignment

AP Physics 1 & 2 Summer Assignment AP Physcs 1 & 2 Summer Assgnment AP Physcs 1 requres an exceptonal profcency n algebra, trgonometry, and geometry. It was desgned by a select group of college professors and hgh school scence teachers

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

Open Systems: Chemical Potential and Partial Molar Quantities Chemical Potential

Open Systems: Chemical Potential and Partial Molar Quantities Chemical Potential Open Systems: Chemcal Potental and Partal Molar Quanttes Chemcal Potental For closed systems, we have derved the followng relatonshps: du = TdS pdv dh = TdS + Vdp da = SdT pdv dg = VdP SdT For open systems,

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

Physics 5153 Classical Mechanics. Principle of Virtual Work-1

Physics 5153 Classical Mechanics. Principle of Virtual Work-1 P. Guterrez 1 Introducton Physcs 5153 Classcal Mechancs Prncple of Vrtual Work The frst varatonal prncple we encounter n mechancs s the prncple of vrtual work. It establshes the equlbrum condton of a mechancal

More information

CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING INTRODUCTION

CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING INTRODUCTION CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING N. Phanthuna 1,2, F. Cheevasuvt 2 and S. Chtwong 2 1 Department of Electrcal Engneerng, Faculty of Engneerng Rajamangala

More information

Pulse Coded Modulation

Pulse Coded Modulation Pulse Coded Modulaton PCM (Pulse Coded Modulaton) s a voce codng technque defned by the ITU-T G.711 standard and t s used n dgtal telephony to encode the voce sgnal. The frst step n the analog to dgtal

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

Copyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for U Charts. Dr. Wayne A. Taylor

Copyright 2017 by Taylor Enterprises, Inc., All Rights Reserved. Adjusted Control Limits for U Charts. Dr. Wayne A. Taylor Taylor Enterprses, Inc. Adjusted Control Lmts for U Charts Copyrght 207 by Taylor Enterprses, Inc., All Rghts Reserved. Adjusted Control Lmts for U Charts Dr. Wayne A. Taylor Abstract: U charts are used

More information

Linear Feature Engineering 11

Linear Feature Engineering 11 Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19

More information

Lecture 12: Classification

Lecture 12: Classification Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna

More information

ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM

ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM An elastc wave s a deformaton of the body that travels throughout the body n all drectons. We can examne the deformaton over a perod of tme by fxng our look

More information

arxiv:cs.cv/ Jun 2000

arxiv:cs.cv/ Jun 2000 Correlaton over Decomposed Sgnals: A Non-Lnear Approach to Fast and Effectve Sequences Comparson Lucano da Fontoura Costa arxv:cs.cv/0006040 28 Jun 2000 Cybernetc Vson Research Group IFSC Unversty of São

More information

Measurement Uncertainties Reference

Measurement Uncertainties Reference Measurement Uncertantes Reference Introducton We all ntutvely now that no epermental measurement can be perfect. It s possble to mae ths dea quanttatve. It can be stated ths way: the result of an ndvdual

More information

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute

More information

The Minimum Universal Cost Flow in an Infeasible Flow Network

The Minimum Universal Cost Flow in an Infeasible Flow Network Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran

More information

2016 Wiley. Study Session 2: Ethical and Professional Standards Application

2016 Wiley. Study Session 2: Ethical and Professional Standards Application 6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton

More information

Research Article Green s Theorem for Sign Data

Research Article Green s Theorem for Sign Data Internatonal Scholarly Research Network ISRN Appled Mathematcs Volume 2012, Artcle ID 539359, 10 pages do:10.5402/2012/539359 Research Artcle Green s Theorem for Sgn Data Lous M. Houston The Unversty of

More information

This column is a continuation of our previous column

This column is a continuation of our previous column Comparson of Goodness of Ft Statstcs for Lnear Regresson, Part II The authors contnue ther dscusson of the correlaton coeffcent n developng a calbraton for quanttatve analyss. Jerome Workman Jr. and Howard

More information

The Study of Teaching-learning-based Optimization Algorithm

The Study of Teaching-learning-based Optimization Algorithm Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute

More information

Mean Field / Variational Approximations

Mean Field / Variational Approximations Mean Feld / Varatonal Appromatons resented by Jose Nuñez 0/24/05 Outlne Introducton Mean Feld Appromaton Structured Mean Feld Weghted Mean Feld Varatonal Methods Introducton roblem: We have dstrbuton but

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

Split alignment. Martin C. Frith April 13, 2012

Split alignment. Martin C. Frith April 13, 2012 Splt algnment Martn C. Frth Aprl 13, 2012 1 Introducton Ths document s about algnng a query sequence to a genome, allowng dfferent parts of the query to match dfferent parts of the genome. Here are some

More information

MIMA Group. Chapter 2 Bayesian Decision Theory. School of Computer Science and Technology, Shandong University. Xin-Shun SDU

MIMA Group. Chapter 2 Bayesian Decision Theory. School of Computer Science and Technology, Shandong University. Xin-Shun SDU Group M D L M Chapter Bayesan Decson heory Xn-Shun Xu @ SDU School of Computer Scence and echnology, Shandong Unversty Bayesan Decson heory Bayesan decson theory s a statstcal approach to data mnng/pattern

More information

Bayesian predictive Configural Frequency Analysis

Bayesian predictive Configural Frequency Analysis Psychologcal Test and Assessment Modelng, Volume 54, 2012 (3), 285-292 Bayesan predctve Confgural Frequency Analyss Eduardo Gutérrez-Peña 1 Abstract Confgural Frequency Analyss s a method for cell-wse

More information

A new Approach for Solving Linear Ordinary Differential Equations

A new Approach for Solving Linear Ordinary Differential Equations , ISSN 974-57X (Onlne), ISSN 974-5718 (Prnt), Vol. ; Issue No. 1; Year 14, Copyrght 13-14 by CESER PUBLICATIONS A new Approach for Solvng Lnear Ordnary Dfferental Equatons Fawz Abdelwahd Department of

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6 Department of Quanttatve Methods & Informaton Systems Tme Seres and Ther Components QMIS 30 Chapter 6 Fall 00 Dr. Mohammad Zanal These sldes were modfed from ther orgnal source for educatonal purpose only.

More information

(Online First)A Lattice Boltzmann Scheme for Diffusion Equation in Spherical Coordinate

(Online First)A Lattice Boltzmann Scheme for Diffusion Equation in Spherical Coordinate Internatonal Journal of Mathematcs and Systems Scence (018) Volume 1 do:10.494/jmss.v1.815 (Onlne Frst)A Lattce Boltzmann Scheme for Dffuson Equaton n Sphercal Coordnate Debabrata Datta 1 *, T K Pal 1

More information

ECE559VV Project Report

ECE559VV Project Report ECE559VV Project Report (Supplementary Notes Loc Xuan Bu I. MAX SUM-RATE SCHEDULING: THE UPLINK CASE We have seen (n the presentaton that, for downlnk (broadcast channels, the strategy maxmzng the sum-rate

More information

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.

More information

Natural Images, Gaussian Mixtures and Dead Leaves Supplementary Material

Natural Images, Gaussian Mixtures and Dead Leaves Supplementary Material Natural Images, Gaussan Mxtures and Dead Leaves Supplementary Materal Danel Zoran Interdscplnary Center for Neural Computaton Hebrew Unversty of Jerusalem Israel http://www.cs.huj.ac.l/ danez Yar Wess

More information

A New Evolutionary Computation Based Approach for Learning Bayesian Network

A New Evolutionary Computation Based Approach for Learning Bayesian Network Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 4026 4030 Advanced n Control Engneerng and Informaton Scence A New Evolutonary Computaton Based Approach for Learnng Bayesan Network Yungang

More information

1 The Mistake Bound Model

1 The Mistake Bound Model 5-850: Advanced Algorthms CMU, Sprng 07 Lecture #: Onlne Learnng and Multplcatve Weghts February 7, 07 Lecturer: Anupam Gupta Scrbe: Bryan Lee,Albert Gu, Eugene Cho he Mstake Bound Model Suppose there

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

An Improved multiple fractal algorithm

An Improved multiple fractal algorithm Advanced Scence and Technology Letters Vol.31 (MulGraB 213), pp.184-188 http://dx.do.org/1.1427/astl.213.31.41 An Improved multple fractal algorthm Yun Ln, Xaochu Xu, Jnfeng Pang College of Informaton

More information

One-sided finite-difference approximations suitable for use with Richardson extrapolation

One-sided finite-difference approximations suitable for use with Richardson extrapolation Journal of Computatonal Physcs 219 (2006) 13 20 Short note One-sded fnte-dfference approxmatons sutable for use wth Rchardson extrapolaton Kumar Rahul, S.N. Bhattacharyya * Department of Mechancal Engneerng,

More information

Department of Statistics University of Toronto STA305H1S / 1004 HS Design and Analysis of Experiments Term Test - Winter Solution

Department of Statistics University of Toronto STA305H1S / 1004 HS Design and Analysis of Experiments Term Test - Winter Solution Department of Statstcs Unversty of Toronto STA35HS / HS Desgn and Analyss of Experments Term Test - Wnter - Soluton February, Last Name: Frst Name: Student Number: Instructons: Tme: hours. Ads: a non-programmable

More information

Errors for Linear Systems

Errors for Linear Systems Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch

More information

Statistics II Final Exam 26/6/18

Statistics II Final Exam 26/6/18 Statstcs II Fnal Exam 26/6/18 Academc Year 2017/18 Solutons Exam duraton: 2 h 30 mn 1. (3 ponts) A town hall s conductng a study to determne the amount of leftover food produced by the restaurants n the

More information

OPTIMISATION. Introduction Single Variable Unconstrained Optimisation Multivariable Unconstrained Optimisation Linear Programming

OPTIMISATION. Introduction Single Variable Unconstrained Optimisation Multivariable Unconstrained Optimisation Linear Programming OPTIMIATION Introducton ngle Varable Unconstraned Optmsaton Multvarable Unconstraned Optmsaton Lnear Programmng Chapter Optmsaton /. Introducton In an engneerng analss, sometmes etremtes, ether mnmum or

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Module 9. Lecture 6. Duality in Assignment Problems

Module 9. Lecture 6. Duality in Assignment Problems Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept

More information

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function Advanced Scence and Technology Letters, pp.83-87 http://dx.do.org/10.14257/astl.2014.53.20 A Partcle Flter Algorthm based on Mxng of Pror probablty densty and UKF as Generate Importance Functon Lu Lu 1,1,

More information

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl RECURSIVE SPLINE INTERPOLATION METHOD FOR REAL TIME ENGINE CONTROL APPLICATIONS A. Stotsky Volvo Car Corporaton Engne Desgn and Development Dept. 97542, HA1N, SE- 405 31 Gothenburg Sweden. Emal: astotsky@volvocars.com

More information

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,

More information

The Feynman path integral

The Feynman path integral The Feynman path ntegral Aprl 3, 205 Hesenberg and Schrödnger pctures The Schrödnger wave functon places the tme dependence of a physcal system n the state, ψ, t, where the state s a vector n Hlbert space

More information

Department of Electrical & Electronic Engineeing Imperial College London. E4.20 Digital IC Design. Median Filter Project Specification

Department of Electrical & Electronic Engineeing Imperial College London. E4.20 Digital IC Design. Median Filter Project Specification Desgn Project Specfcaton Medan Flter Department of Electrcal & Electronc Engneeng Imperal College London E4.20 Dgtal IC Desgn Medan Flter Project Specfcaton A medan flter s used to remove nose from a sampled

More information

A Note on Bound for Jensen-Shannon Divergence by Jeffreys

A Note on Bound for Jensen-Shannon Divergence by Jeffreys OPEN ACCESS Conference Proceedngs Paper Entropy www.scforum.net/conference/ecea- A Note on Bound for Jensen-Shannon Dvergence by Jeffreys Takuya Yamano, * Department of Mathematcs and Physcs, Faculty of

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

Chapter 11: Simple Linear Regression and Correlation

Chapter 11: Simple Linear Regression and Correlation Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests

More information

Lecture 6: Introduction to Linear Regression

Lecture 6: Introduction to Linear Regression Lecture 6: Introducton to Lnear Regresson An Manchakul amancha@jhsph.edu 24 Aprl 27 Lnear regresson: man dea Lnear regresson can be used to study an outcome as a lnear functon of a predctor Example: 6

More information

High resolution entropy stable scheme for shallow water equations

High resolution entropy stable scheme for shallow water equations Internatonal Symposum on Computers & Informatcs (ISCI 05) Hgh resoluton entropy stable scheme for shallow water equatons Xaohan Cheng,a, Yufeng Ne,b, Department of Appled Mathematcs, Northwestern Polytechncal

More information

Learning Theory: Lecture Notes

Learning Theory: Lecture Notes Learnng Theory: Lecture Notes Lecturer: Kamalka Chaudhur Scrbe: Qush Wang October 27, 2012 1 The Agnostc PAC Model Recall that one of the constrants of the PAC model s that the data dstrbuton has to be

More information

Probability Theory. The nth coefficient of the Taylor series of f(k), expanded around k = 0, gives the nth moment of x as ( ik) n n!

Probability Theory. The nth coefficient of the Taylor series of f(k), expanded around k = 0, gives the nth moment of x as ( ik) n n! 8333: Statstcal Mechancs I Problem Set # 3 Solutons Fall 3 Characterstc Functons: Probablty Theory The characterstc functon s defned by fk ep k = ep kpd The nth coeffcent of the Taylor seres of fk epanded

More information

Composite Hypotheses testing

Composite Hypotheses testing Composte ypotheses testng In many hypothess testng problems there are many possble dstrbutons that can occur under each of the hypotheses. The output of the source s a set of parameters (ponts n a parameter

More information

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9 Chapter 9 Correlaton and Regresson 9. Correlaton Correlaton A correlaton s a relatonshp between two varables. The data can be represented b the ordered pars (, ) where s the ndependent (or eplanator) varable,

More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information