Weighted Gossip: Distributed Averaging Using Non-Doubly Stochastic Matrices
Florence Bénézit (ENS-INRIA, France), Vincent Blondel (UCL, Belgium), Patrick Thiran (EPFL, Switzerland), John Tsitsiklis (MIT, USA), Martin Vetterli (EPFL, Switzerland)

Abstract—This paper presents a general class of gossip-based averaging algorithms, which are inspired from Uniform Gossip [1]. While Uniform Gossip works synchronously on complete graphs, weighted gossip algorithms allow asynchronous rounds and converge on any connected, directed or undirected graph. Unlike most previous gossip algorithms [2]–[6], Weighted Gossip admits stochastic update matrices which need not be doubly stochastic. Double stochasticity being very restrictive in a distributed setting [7], this novel degree of freedom is essential, and it opens the perspective of designing a large number of new gossip-based algorithms. To give an example, we present one of these algorithms, which we call One-Way Averaging. It is based on random geographic routing, just like Path Averaging [5], except that routes are one-way instead of round-trip. Hence, in this example, getting rid of double stochasticity allows us to add robustness to Path Averaging.

I. INTRODUCTION

Gossip algorithms were recently developed to solve the distributed average consensus problem [1]–[6]. Every node i in a network holds a value x_i and wants to learn the average x_ave of all the values in the network in a distributed way. Most gossip algorithms were designed for wireless sensor networks, which are usually modeled as random geometric graphs and sometimes as lattices. Ideally, a distributed averaging algorithm should be efficient in terms of energy and delay, without requiring too much knowledge about the network topology at each node, nor sophisticated coordination between nodes. The simplest gossip algorithm is Pairwise Gossip, where random pairs of connected nodes iteratively and locally average their values until convergence to the global average [2]. Pairwise local averaging is an easy task, which requires neither global knowledge nor global coordination; thus Pairwise Gossip fulfills the requirements of our distributed problem.
However, the convergence speed of Pairwise Gossip suffers from the locality of the updates, and it was shown that averaging random geographic routes instead of local neighborhoods is an order-optimal communication scheme to run gossip. Let n be the number of nodes in the network. On random geometric graphs, Pairwise Gossip requires Θ(n²) messages, whereas Path Averaging requires only Θ(n log n) messages under some conditions [5]. The latter algorithm gained efficiency at the price of more complex coordination. At every round of Path Averaging, a random node wakes up and generates a random route. Values are aggregated along the route, and the destination node computes the average of the values collected along the route. Then the destination node sends the average back through the same route, so that all the nodes in the route can update their values to the average. Path Averaging is efficient in terms of energy consumption, but it demands some long-distance coordination to make sure that all the values in the route were updated correctly. Routing information back and forth might as well introduce delay issues, because a node that is engaged in a route needs to wait for the update to come back before it can proceed to another round. Furthermore, in a mobile network, or in a highly dynamic network, routing the information back on the same route might even not succeed. This work started with the goal of designing a unidirectional gossip algorithm fulfilling the following requirements:
- Keep a geographic routing communication scheme, because it is highly diffusive;
- Avoid routing back data: instead of long-distance agreements, only agreements between neighbors are allowed;
- Route crossing is possible at any time, without introducing errors in the algorithm.
As we were designing One-Way Averaging, we happened to prove the correctness of a broad set of gossip-based algorithms, which we present in this paper along with One-Way Averaging. These algorithms can be asynchronous, and they use stochastic diffusion matrices which are not necessarily doubly stochastic, as announced by the title of the paper. In Section II, we give some background on gossip algorithms, and we explain why Uniform Gossip is a key algorithm to get inspired from when building a unidirectional gossip algorithm.
In Section III, we present Weighted Gossip, an asynchronous generalization of Uniform Gossip, which was already suggested in [1] but had remained unnamed. We show in Section IV that weighted gossip algorithms converge to x_ave, which is a novel result to the best of our knowledge. In Section V, we describe One-Way Averaging in detail, and we show on simulations that the good diffusivity of geographic routes in Path Averaging persists in One-Way Averaging. Computing the speed of convergence of weighted gossip algorithms remains open and is part of future work.

II. BACKGROUND ON GOSSIP ALGORITHMS

The values to be averaged are gathered in a vector x(0), and at any iteration t, the current estimates of the average x_ave are gathered in x(t). Gossip algorithms update estimates linearly. At any iteration t, there is a matrix W(t) such that:

x(t)^T = x(t-1)^T W(t).
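As an illustrative sketch (not code from the paper; numpy is assumed, and the helper name `pairwise_W` is hypothetical), the linear update above can be instantiated with Pairwise Gossip: the matrix W(t) for a pair (i, j) averages the two entries, leaves the rest untouched, and is doubly stochastic.

```python
import numpy as np

def pairwise_W(n, i, j):
    """Pairwise Gossip update: nodes i and j replace their values
    by their mutual average; every other node keeps its value."""
    W = np.eye(n)
    W[i, i] = W[j, j] = W[i, j] = W[j, i] = 0.5
    return W

rng = np.random.default_rng(0)
n = 10
x = rng.normal(size=n)
x_ave = x.mean()

W = pairwise_W(n, 2, 7)
assert np.allclose(W @ np.ones(n), 1)   # W(t)1 = 1: global average conserved
assert np.allclose(np.ones(n) @ W, 1)   # 1^T W(t) = 1^T: consensus is stable

# iterating over random pairs drives x(t) towards x_ave * 1
for _ in range(2000):
    i, j = rng.choice(n, size=2, replace=False)
    x = x @ pairwise_W(n, i, j)
```

On a complete graph this converges geometrically; on sparser graphs only connected pairs may be drawn, which slows convergence but does not break it.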
In gossip algorithms that converge to average consensus, W(t) is doubly stochastic: W(t)1 = 1 ensures that the global average is conserved, and 1^T W(t) = 1^T guarantees stable consensus. To perform averaging on a one-way route, W(t) should be upper triangular (up to a node index permutation). But the only matrix that is both doubly stochastic and upper triangular is the identity matrix. Thus, unidirectional averaging requires dropping double stochasticity. Uniform Gossip solves this issue in the following way. Instead of updating one vector x(t) of variables, it updates a vector s(t) of sums and a vector ω(t) of weights. Uniform Gossip initializes s(0) = x(0) and ω(0) = 1. At any time, the vector of estimates is x(t) = s(t)/ω(t), where the division is performed elementwise. The updates are computed with stochastic diffusion matrices {D(t)}_{t>0}:

s(t)^T = s(t-1)^T D(t),    (1)
ω(t)^T = ω(t-1)^T D(t).    (2)

Kempe et al. [1] prove that the algorithm converges to a consensus on x_ave (lim_{t→∞} x(t) = x_ave 1) in the special case where, for any node i, D_ii(t) = 1/2 and D_ij(t) = 1/2 for one node j chosen i.i.d. uniformly at random. As a key remark, note that here D(t) is not doubly stochastic. The algorithm is synchronous, and it works on complete graphs without routing, and on other graphs with routing. We show in this paper that the idea works with many more sequences of matrices {D(t)}_{t>0} than just the one used in Uniform Gossip.

III. WEIGHTED GOSSIP

We call Weighted Gossip the class of gossip-based algorithms following the sum-and-weight structure of Uniform Gossip described above (Eq. (1) and (2)). A weighted gossip algorithm is entirely characterized by the distribution of its diffusion matrices {D(t)}_{t>0}. Let P(s,t) := D(s)D(s+1)⋯D(t) and let P(t) := P(1,t). Then

s(t)^T = x(0)^T P(t),    (3)
ω(t)^T = 1^T P(t).    (4)

If a weighted gossip algorithm is asynchronous, then D_ii(t) = 1 and D_ij(t) = D_ji(t) = 0 for the nodes i that do not contribute to iteration t. If D_ij(t) ≠ 0 for i ≠ j, then node i sends (D_ij(t)s_i(t-1), D_ij(t)ω_i(t-1)) to node j, which adds the received data to its own sum s_j(t-1) and weight ω_j(t-1). At any iteration t, the estimate at node i is x_i(t) = s_i(t)/ω_i(t).
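The sum-and-weight mechanism of Eq. (1)–(2), with the particular choice of D(t) used by Kempe et al. in Uniform Gossip, can be sketched as a short simulation (illustrative only, not the authors' code; numpy is assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x0 = rng.normal(size=n)           # initial values x(0)
s, w = x0.copy(), np.ones(n)      # s(0) = x(0), omega(0) = 1

for t in range(200):
    # Uniform Gossip: D_ii(t) = 1/2, and D_ij(t) = 1/2 for one node j
    # chosen uniformly at random by each node i (complete graph).
    targets = rng.integers(0, n, size=n)
    s_new, w_new = s / 2, w / 2            # every node keeps half...
    for i in range(n):
        s_new[targets[i]] += s[i] / 2      # ...and pushes the other half
        w_new[targets[i]] += w[i] / 2
    s, w = s_new, w_new

x_est = s / w                              # x(t) = s(t)/omega(t), elementwise
```

Note that each row of the implied D(t) sums to 1 (sums and weights are conserved globally), while column sums vary: a node that receives several pushes in the same round temporarily accumulates a large weight, and the ratio s/ω corrects for it.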
Because 1^T D(t) ≠ 1^T in general, sums and weights do not reach a consensus. However, because D(t)1 = 1, sums and weights are conserved: at any iteration t,

Σ_i s_i(t) = Σ_i x_i(0) = n x_ave,    (5)
Σ_i ω_i(t) = n.    (6)

This implies that Weighted Gossip is a class of unbiased estimators for the average (even though Σ_i x_i(t) is not conserved through time!):

Theorem 3.1 (Unbiased estimator): If the estimates x(t) = s(t)/ω(t) converge to a consensus, then the consensus value is the average x_ave.

Proof: Let c be the consensus value. For any ε > 0, there is an iteration t_0 after which, for any node i, |x_i(t) − c| < ε. Then, for any t > t_0, |s_i(t) − cω_i(t)| < εω_i(t) (weights are always positive). Hence, summing over i,

|Σ_i (s_i(t) − cω_i(t))| ≤ Σ_i |s_i(t) − cω_i(t)| < ε Σ_i ω_i(t).

Using Eq. (5) and (6), the previous inequality can be written as |n x_ave − nc| < nε, which is equivalent to |x_ave − c| < ε. Hence c = x_ave.

In the next section, we show that, although sums and weights do not reach a consensus, the estimates {x_i(t)}_{1≤i≤n} converge to a consensus under some conditions.

IV. CONVERGENCE

In this section we prove that Weighted Gossip succeeds in other cases than just Uniform Gossip.

Assumption 1: {D(t)}_{t>0} is a stationary and ergodic sequence of stochastic matrices with positive diagonals, and E[D] is irreducible. Irreducibility means that the graph formed by the edges (i,j) such that P[D_ij > 0] > 0 is connected, which requires the connectivity of the network. Note that i.i.d. sequences are stationary and ergodic. Stationarity implies that E[D] does not depend on t. Positive diagonals means that each node should always keep part of its sum and weight: for all i and t, D_ii(t) > 0.

Theorem 4.1 (Main Theorem): Under Assumption 1, Weighted Gossip using {D(t)}_{t>0} converges to a consensus with probability 1, i.e., lim_{t→∞} x(t) = x_ave 1.

To prove Th. 4.1, we will start by upper bounding the error ||x(t) − x_ave 1||_∞ with a non-increasing function f(t) (Lemma 4.1): let η_ij(t) := P_ij(t) − Σ_k P_kj(t)/n = P_ij(t) − ω_j(t)/n; then f is defined as f(t) := max_{1≤j≤n} f_j(t), where f_j(t) := Σ_{i=1}^n |η_ij(t)| / ω_j(t). Then, we will prove that f(t) vanishes to 0 by showing that η_ij(t) vanishes to 0 (weak ergodicity argument of Lemma 4.3) and that ω_j(t) is bounded away from 0 infinitely often (Lemma 4.4).
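Before the lemmas, the claim of Theorem 4.1 can be sanity-checked numerically (an illustrative sketch, not part of the paper's argument; numpy assumed). Dense i.i.d. row-stochastic matrices with strictly positive diagonals satisfy Assumption 1, and the estimates s(t)/ω(t) indeed approach x_ave even though no D(t) is doubly stochastic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
x0 = rng.normal(size=n)
s, w = x0.copy(), np.ones(n)

for t in range(400):
    # i.i.d. row-stochastic D(t) with a strictly positive diagonal;
    # generically NOT doubly stochastic (column sums differ from 1)
    D = rng.random((n, n)) + np.eye(n)
    D /= D.sum(axis=1, keepdims=True)
    s = s @ D                      # s(t)^T = s(t-1)^T D(t)
    w = w @ D                      # omega(t)^T = omega(t-1)^T D(t)

x_est = s / w
```

The per-node ratios converge although neither s(t) nor w(t) individually reaches a consensus; only their totals are conserved, as in Eq. (5)–(6).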
Lemma 4.1: If {D(t)}_{t>0} is a sequence of stochastic matrices, then the function f(t) is non-increasing. Furthermore,

||x(t) − x_ave 1||_∞ ≤ ||x(0)||_∞ f(t).    (7)

Proof: By Eq. (3), for any node j,

|x_j(t) − x_ave| = | Σ_{i=1}^n P_ij(t)x_i(0)/ω_j(t) − x_ave |
 = | Σ_{i=1}^n (ω_j(t)/n + η_ij(t)) x_i(0)/ω_j(t) − x_ave |
 = | Σ_{i=1}^n η_ij(t)x_i(0) | / ω_j(t)
 ≤ ||x(0)||_∞ Σ_{i=1}^n |η_ij(t)| / ω_j(t) = ||x(0)||_∞ f_j(t) ≤ ||x(0)||_∞ f(t),
which proves Eq. (7). Next, we need to prove that f(t) is a non-increasing function. For any node j, by Eq. (1) and (2), η_ij(t) = Σ_k η_ik(t−1)D_kj(t) and ω_j(t) = Σ_k ω_k(t−1)D_kj(t); hence

f_j(t) = Σ_{i=1}^n |η_ij(t)| / ω_j(t)
 = Σ_{i=1}^n | Σ_k η_ik(t−1)D_kj(t) | / Σ_k ω_k(t−1)D_kj(t)
 ≤ Σ_k ( Σ_{i=1}^n |η_ik(t−1)| ) D_kj(t) / Σ_k ω_k(t−1)D_kj(t)
 ≤ max_k Σ_{i=1}^n |η_ik(t−1)| / ω_k(t−1)    (8)
 = max_k f_k(t−1) = f(t−1),

which implies that f(t) ≤ f(t−1). Eq. (8) comes from the following inequality: for any {a_k}_{1≤k≤n} ≥ 0 and {b_k}_{1≤k≤n} > 0,

Σ_k a_k / Σ_k b_k = Σ_k (b_k / Σ_j b_j)(a_k / b_k) ≤ max_k a_k / b_k.

The following lemma is useful to prove Lemmas 4.3 and 4.4.

Lemma 4.2: Under Assumption 1, there is a deterministic time T and a constant c such that P[D(1)D(2)⋯D(T) > c] > 0, where A > c means that every entry of A is larger than c.

Proof: The proof of this lemma can be found in [8]. For the case where {D(t)}_{t>0} is i.i.d., a simpler proof can be found in [9]. Note that the theorems proven in [8] and [9] are slightly different from our lemma, because the authors multiply matrices on the left, whereas we multiply them on the right. However, the multiplication side does not change the proof. For completeness and simplicity, we give the proof in the i.i.d. case. E[D] being irreducible and having a positive diagonal, it is primitive as well: there is an m > 0 such that E[D]^m > 0 (elementwise). {D(t)}_{t≥1} is i.i.d., hence E[D(1)D(2)⋯D(m)] = E[D]^m > 0, and P[(D(1)D(2)⋯D(m))_{ij} > 0] > 0 for any entry (i,j). For any time t, the diagonal coefficients of D(t) are nonzero; thus, if the (i,j)-th entry of P(k, k+m−1) = D(k)D(k+1)⋯D(k+m−1) is positive, then the (i,j)-th entry of P(k,t) is positive for all t ≥ k+m−1. Now take T = n(n−1)m. The probability that P(T) > 0 is larger than or equal to the joint probability that P_{1,2}(1,m) > 0, P_{1,3}(m+1,2m) > 0, …, P_{n,n−1}(T−m+1,T) > 0. By independence of {D(t)}_{t≥1},

P[P(T) > 0] ≥ P[P_{1,2}(1,m) > 0] P[P_{1,3}(m+1,2m) > 0] ⋯ P[P_{n,n−1}(T−m+1,T) > 0] > 0.

Therefore, there is a c > 0 such that P[D(1)D(2)⋯D(T) > c] > 0.

Lemma 4.3 (Weak ergodicity): Under Assumption 1, {D(t)}_{t≥1} is weakly ergodic. Weak ergodicity means that when t grows, P(t) tends to have identical rows, which may vary with t. It is weaker than strong ergodicity, where P(t) tends to a matrix 1π^T, where π does not vary with t.
Interestingly, simple computations show that if P(t) has identical rows, then consensus is reached. All we need to know in this paper is that weak ergodicity implies lim_{t→∞} max_{i,j,k} |P_ik(t) − P_jk(t)| = 0, and we suggest [10] for further reading about weak ergodicity.

Proof: Let Q be a stochastic matrix. The Dobrushin coefficient δ(Q) of matrix Q is defined as:

δ(Q) := (1/2) max_{i,j} Σ_k |Q_ik − Q_jk|.

One can show [10] that 0 ≤ δ(Q) ≤ 1, and that for any stochastic matrices Q_1 and Q_2,

δ(Q_1 Q_2) ≤ δ(Q_1) δ(Q_2).    (9)

Another useful fact is that for any stochastic matrix Q,

1 − δ(Q) ≥ max_j min_i Q_ij ≥ min_{i,j} Q_ij.    (10)

A block criterion for weak ergodicity [10] is based on Eq. (9): {D(t)}_{t≥1} is weakly ergodic if and only if there is a strictly increasing sequence of integers {k_s}_{s≥1} such that

Σ_{s≥1} (1 − δ(P(k_s + 1, k_{s+1}))) = ∞.    (11)

We use this criterion with k_s = sT, where T was defined in Lemma 4.2. A joint consequence of Lemma 4.2 and of Birkhoff's ergodic theorem [11], [8] (in the i.i.d. case, one can use the strong law of large numbers instead) is that the event {D(k_s+1)D(k_s+2)⋯D(k_{s+1}) > c} happens infinitely often with probability 1. Hence, using Eq. (10), the event {1 − δ(P(k_s+1, k_{s+1})) > c} happens infinitely often with probability 1. We can thus conclude that the block criterion (11) holds with probability 1 and that {D(t)}_{t≥1} is weakly ergodic.

The next lemma shows that, although weights can become arbitrarily small, they are uniformly large enough infinitely often.

Lemma 4.4: Under Assumption 1, there is a constant α > 0 such that, for any time t, with probability 1, there is a time t_1 ≥ t at which min_i ω_i(t_1) ≥ α.

Proof: As mentioned in the proof of Lemma 4.3, the event {D(k_s+1)D(k_s+2)⋯D(k_{s+1}) > c}, where k_s = sT, happens infinitely often with probability 1. Let t_1 be the first time larger than t such that D(t_1−T+1)D(t_1−T+2)⋯D(t_1) > c. Then the weights at time t_1 satisfy

ω(t_1)^T = ω(t_1−T)^T D(t_1−T+1)⋯D(t_1) > c ω(t_1−T)^T 1 1^T,

because weights are always positive.
Now, because the sum of the weights is equal to n, ω(t_1−T)^T 1 = n. Hence ω(t_1)^T > cn 1^T. Taking α = cn concludes the proof.

To prove Theorem 4.1, it remains to show that f(t) converges to 0.

Proof (Theorem 4.1): For any ε > 0, according to Lemma 4.3, there is a time t_0 such that for any t ≥ t_0, max_{i,j,k} |P_ik(t) − P_jk(t)| < ε. As a consequence, |η_ij(t)| < ε as well. Indeed,

|η_ij(t)| = | P_ij(t) − Σ_k P_kj(t)/n | ≤ Σ_k |P_ij(t) − P_kj(t)| / n < (n/n) ε = ε.

Therefore, for any t ≥ t_0 and any 1 ≤ j ≤ n, f_j(t) < nε/ω_j(t), and therefore

f(t) < nε / min_j ω_j(t).

Using Lemma 4.4, there is a constant α such that, with probability 1, there is a time t_1 ≥ t_0 at which min_j ω_j(t_1) ≥ α. Then, for any ε′, it suffices to take ε = αε′/n to conclude that, with probability 1, there is a time t_1 such that f(t_1) < ε′. Since f is non-increasing (Lemma 4.1), for all times t ≥ t_1, f(t) < ε′; in other words, f(t) converges to 0. Using (7) concludes the proof.

Remark: A similar convergence result can be proved without Assumption 1 (stationarity and ergodicity of the matrices D(t)), in a setting where the matrices are chosen in a perhaps adversarial manner. One needs only some minimal connectivity assumptions, which then guarantee that there exists a finite number T such that, for all t, all entries of D(t+1)⋯D(t+T) are bounded below by a positive constant c (see, e.g., Lemma 5.2.1 in [12]).

V. ONE-WAY AVERAGING

In this section, we describe in detail a novel weighted gossip algorithm, which we call One-Way Averaging.

A. Assumptions and Notations

Assume that the network is a random geometric graph on a convex area A, with a connection radius r(n) large enough to enable geographic routing [3]. For every node i, let Z_i be a distribution of points outside of the area A, and let H_i be a distribution of integers larger than 2. Each node has an independent local exponential random clock of rate λ, and initiates an iteration when it rings. Equivalently, time is counted in terms of a global and virtual exponential clock of rate nλ. Each time the global clock rings, a node wakes up independently and uniformly at random. In the analysis, t indicates how many times the global clock has rung. A detailed analysis of this time model can be found in [2].
B. Description of One-Way Averaging

Each node i initializes its sum s_i(0) = x_i(0) and its weight ω_i(0) = 1. For any iteration t > 0, let i be the node whose clock rings. Node i draws a target Z according to distribution Z_i and a number H ≥ 2 of hops according to distribution H_i. Node i chooses uniformly at random a neighbor which is closer to the target Z than itself. If there is no such neighbor, then the iteration terminates. If such a neighbor j exists, then node i divides its sum s_i(t−1) and its weight ω_i(t−1) by H, and sends (s_i(t−1), ω_i(t−1)) (H−1)/H to node j. It also sends the remaining number H−1 of hops and the target Z. Node j adds the received sum and weight to its sum s_j(t−1) and its weight ω_j(t−1). Then it performs the same operation as node i towards a node that is closer to the target, except that it divides its new sum and weight by H−1 instead of H (formally, H ← H−1). Messages are greedily sent towards the target, H being decremented at each hop. The iteration ends when H = 1, or when a node does not have any neighbor to forward a message to. At any time, the estimate of any node is the ratio between its sum and its weight.

C. Diffusion Matrices

Suppose that at round t, a whole route of H nodes is generated. Then, after re-indexing the nodes, starting with the nodes in the route, the diffusion matrix D(t) can be written as a block-diagonal matrix with the upper-triangular block

[ 1/H   1/H      1/H     ⋯   1/H     ]
[  0    1/(H−1)  1/(H−1) ⋯   1/(H−1) ]
[  ⋮         ⋱                 ⋮     ]
[  0    ⋯        0      1/2   1/2    ]
[  0    ⋯               0      1     ]

acting on the route, completed with Id, where Id denotes the identity matrix acting on the nodes outside the route. If the route stops early and has for example only 3 nodes while H = 4, then, after re-indexing the nodes, the route block of D(t) can be written as:

[ 1/4  1/4  1/2 ]
[  0   1/3  2/3 ]
[  0    0    1  ]

Note that D(t) is indeed stochastic for all t. It is upper triangular as well: One-Way Averaging does not require routing information backwards along the path. Furthermore, {D(t)}_{t>0} verifies Assumption 1. First, {D(t)}_{t>0} is an i.i.d. sequence. Second, the matrices {D(t)}_{t>0} have positive diagonals. Third, if the network is connected and if the routes generated by the distributions {Z_i}_{1≤i≤n} and {H_i}_{1≤i≤n} connect the network, then E[D] is irreducible. Therefore, One-Way Averaging is a successful distributed averaging algorithm.
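The hop-by-hop description above can be turned into a small constructor for D(t) (an illustrative sketch, not the authors' code; numpy is assumed and `one_way_diffusion` is a hypothetical helper name):

```python
import numpy as np

def one_way_diffusion(route, H, n):
    """Diffusion matrix D(t) of One-Way Averaging for a given route.

    `route` lists the node indices visited in order, `H` is the initial
    hop budget drawn by the waking node, and `n` is the network size.
    D[p, q] is the fraction of node p's (sum, weight) that ends up at
    node q after the iteration; nodes off the route are untouched.
    """
    D = np.eye(n)
    for k in range(len(route) - 1):
        i, j, h = route[k], route[k + 1], H - k
        # node i keeps 1/h of what it currently holds and
        # forwards the remaining (h-1)/h to the next node j
        D[:, j] += D[:, i] * (h - 1) / h
        D[:, i] /= h
    return D

# full route of H = 4 hops, and the truncated 3-node route of the text
D_full = one_way_diffusion([0, 1, 2, 3], 4, 6)
D_cut = one_way_diffusion([0, 1, 2], 4, 6)
```

With the route nodes indexed first, `D_full` reproduces the 1/H, 1/(H−1), …, 1/2, 1 pattern above, and `D_cut` reproduces the truncated-route example; both are row-stochastic and upper triangular.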
Finally, routes can cross each other without corrupting the algorithm (the resulting diffusion matrices are still stochastic).

D. Simulation

One-Way Averaging and Path Averaging were run (in Matlab) on random geometric graphs on the unit square, using the same routes for a fair comparison.
At each iteration t, the number H(t) of hops was generated with H uniform in [1/(√2 r(n)), √2/r(n)], and the target Z(t) was drawn in the following way: let I be the coordinates of the woken node, and let U be a point drawn uniformly at random in the unit square; then Z(t) = I + 3 (U − I)/||U − I||_2. Let C(t_1,t_2) be the message cost of a given algorithm from iteration t_1 to iteration t_2. For One-Way Averaging, C(t_1,t_2) = Σ_{t=t_1}^{t_2} R(t), where R(t) ≤ H(t) is the effective route length at iteration t. Because Path Averaging routes information back and forth, the cost of one iteration is taken to be equal to twice the route length: C(t_1,t_2) = 2 Σ_{t=t_1}^{t_2} R(t). Let ε(t) = ||x(t) − x_ave 1||_1. The empirical consensus cost is defined as:

C_emp(t_1,t_2) = C(t_1,t_2) / (log ε(t_1) − log ε(t_2)),

so that

ε(t_2) = ε(t_1) exp(−C(t_1,t_2)/C_emp(t_1,t_2)).

In Fig. 1, we display the empirical consensus cost of both algorithms, with t_1 and t_2 growing linearly with n. We can see that One-Way Averaging performs better than Path Averaging on this example. Although One-Way Averaging converges more slowly in terms of iterations, spending half as many messages per iteration is sufficient here to outperform Path Averaging. The speed of convergence depends on the network, but also on {Z_i}_{1≤i≤n} and {H_i}_{1≤i≤n}, which we have not optimized. It would be interesting in further work to compute the speed of convergence of Weighted Gossip, and to derive optimal distributions {Z_i}_{1≤i≤n} and {H_i}_{1≤i≤n} for a given network using One-Way Averaging. As a conclusion, One-Way Averaging seems to have the same diffusive qualities as Path Averaging while being more robust at the same time.

ACKNOWLEDGEMENTS

The work presented in this paper was supported (in part) by the National Competence Center in Research on Mobile Information and Communication Systems (NCCR-MICS), a center supported by the Swiss National Science Foundation, and by the NSF under grant ECCS.

VI. CONCLUSION

We proved that weighted gossip algorithms converge to average consensus with probability 1 in a very general setting, i.e., in connected networks, with stationary and ergodic iterations, and with a simple stability condition (positive diagonals).
We believe that dropping double stochasticity opens great opportunities in designing new distributed averaging algorithms that are more robust and adapted to the specificities of each network. One-Way Averaging, for example, is more robust than Path Averaging, and it surprisingly consumes fewer messages in simulations. Also, double stochasticity is difficult to enforce in a distributed manner in directed graphs using unidirectional communications. With Weighted Gossip, one could easily build averaging algorithms for directed networks that are reliable enough not to require acknowledgements. The next step of this work is to compute analytically the speed of convergence of Weighted Gossip. In classical Gossip, double stochasticity would greatly simplify derivations, but this feature disappears in Weighted Gossip, which makes the problem more difficult.

[Fig. 1: Comparison of the consensus cost C_emp(t_1,t_2) for One-Way Averaging and Path Averaging in random geometric graphs of increasing sizes n. The connection radius scales as r(n) = √(6 log n / n). Display of C_emp(t_1,t_2) averaged over 15 graphs and 4 simulation runs per graph.]

REFERENCES

[1] D. Kempe, A. Dobra, and J. Gehrke, "Gossip-based computation of aggregate information," in FOCS, vol. 44. IEEE, 2003.
[2] S. Boyd, A. Ghosh, B. Prabhakar, and D. Shah, "Gossip algorithms: Design, analysis and applications," in IEEE INFOCOM, 2005.
[3] A. G. Dimakis, A. D. Sarwate, and M. J. Wainwright, "Geographic gossip: efficient aggregation for sensor networks," in ACM/IEEE Symposium on Information Processing in Sensor Networks, 2006.
[4] B. Nazer, A. Dimakis, and M. Gastpar, "Local interference can accelerate gossip algorithms," in Allerton Conference on Communication, Control, and Computing, 2008.
[5] F. Bénézit, A. Dimakis, P. Thiran, and M. Vetterli, "Order-optimal consensus through randomized path averaging," submitted for publication.
[6] A. Nedic, A. Olshevsky, A. Ozdaglar, and J. Tsitsiklis, "On distributed averaging algorithms and quantization effects," in IEEE Conference on Decision and Control, 2008.
[7] B. Gharesifard and J. Cortés, "When does a digraph admit a doubly stochastic adjacency matrix?" submitted to the American Control Conference, 2010.
[8] A. Tahbaz-Salehi and A. Jadbabaie, "Consensus over ergodic stationary graph processes," IEEE Transactions on Automatic Control, 2009.
[9] A. Tahbaz-Salehi and A. Jadbabaie, "Necessary and sufficient conditions for consensus over random independent and identically distributed switching graphs," in IEEE Conference on Decision and Control, 2007.
[10] P. Brémaud, Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues. Springer, 1999.
[11] R. Durrett, Probability: Theory and Examples. Duxbury Press, Belmont, CA, 1996.
[12] J. Tsitsiklis, "Problems in decentralized decision making and computation," Ph.D. dissertation, MIT, Dept. of Electrical Engineering and Computer Science, 1984.
Addtonal Codes usng Fnte Dfference Method Benamn Moll 1 HJB Equaton for Consumpton-Savng Problem Wthout Uncertanty Before consderng the case wth stochastc ncome n http://www.prnceton.edu/~moll/ HACTproect/HACT_Numercal_Appendx.pdf,
More informationn α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0
MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector
More informationBezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0
Bezer curves Mchael S. Floater August 25, 211 These notes provde an ntroducton to Bezer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of the
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More information2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification
E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton
More informationAPPENDIX A Some Linear Algebra
APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,
More informationCSCE 790S Background Results
CSCE 790S Background Results Stephen A. Fenner September 8, 011 Abstract These results are background to the course CSCE 790S/CSCE 790B, Quantum Computaton and Informaton (Sprng 007 and Fall 011). Each
More informationReport on Image warping
Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.
More informationProbabilistic Graphical Models
School of Computer Scence robablstc Graphcal Models Appromate Inference: Markov Chan Monte Carlo 05 07 Erc Xng Lecture 7 March 9 04 X X 075 05 05 03 X 3 Erc Xng @ CMU 005-04 Recap of Monte Carlo Monte
More informationOutline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique
Outlne and Readng Dynamc Programmng The General Technque ( 5.3.2) -1 Knapsac Problem ( 5.3.3) Matrx Chan-Product ( 5.3.1) Dynamc Programmng verson 1.4 1 Dynamc Programmng verson 1.4 2 Dynamc Programmng
More informationThe Order Relation and Trace Inequalities for. Hermitian Operators
Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence
More informationprinceton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg
prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there
More informationSupplement to Clustering with Statistical Error Control
Supplement to Clusterng wth Statstcal Error Control Mchael Vogt Unversty of Bonn Matthas Schmd Unversty of Bonn In ths supplement, we provde the proofs that are omtted n the paper. In partcular, we derve
More informationLecture 4: November 17, Part 1 Single Buffer Management
Lecturer: Ad Rosén Algorthms for the anagement of Networs Fall 2003-2004 Lecture 4: November 7, 2003 Scrbe: Guy Grebla Part Sngle Buffer anagement In the prevous lecture we taled about the Combned Input
More information5 The Rational Canonical Form
5 The Ratonal Canoncal Form Here p s a monc rreducble factor of the mnmum polynomal m T and s not necessarly of degree one Let F p denote the feld constructed earler n the course, consstng of all matrces
More informationRandom Walks on Digraphs
Random Walks on Dgraphs J. J. P. Veerman October 23, 27 Introducton Let V = {, n} be a vertex set and S a non-negatve row-stochastc matrx (.e. rows sum to ). V and S defne a dgraph G = G(V, S) and a drected
More information1 Matrix representations of canonical matrices
1 Matrx representatons of canoncal matrces 2-d rotaton around the orgn: ( ) cos θ sn θ R 0 = sn θ cos θ 3-d rotaton around the x-axs: R x = 1 0 0 0 cos θ sn θ 0 sn θ cos θ 3-d rotaton around the y-axs:
More informationA PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS
HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,
More informationGeneralized Linear Methods
Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set
More informationKernel Methods and SVMs Extension
Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general
More informationChapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems
Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons
More informationEigenvalues of Random Graphs
Spectral Graph Theory Lecture 2 Egenvalues of Random Graphs Danel A. Spelman November 4, 202 2. Introducton In ths lecture, we consder a random graph on n vertces n whch each edge s chosen to be n the
More informationSupplementary material: Margin based PU Learning. Matrix Concentration Inequalities
Supplementary materal: Margn based PU Learnng We gve the complete proofs of Theorem and n Secton We frst ntroduce the well-known concentraton nequalty, so the covarance estmator can be bounded Then we
More informationIntroduction to Algorithms
Introducton to Algorthms 6.046J/8.40J Lecture 7 Prof. Potr Indyk Data Structures Role of data structures: Encapsulate data Support certan operatons (e.g., INSERT, DELETE, SEARCH) Our focus: effcency of
More informationCHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE
CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng
More informationMarkov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement
Markov Chan Monte Carlo MCMC, Gbbs Samplng, Metropols Algorthms, and Smulated Annealng 2001 Bonformatcs Course Supplement SNU Bontellgence Lab http://bsnuackr/ Outlne! Markov Chan Monte Carlo MCMC! Metropols-Hastngs
More informationThe Minimum Universal Cost Flow in an Infeasible Flow Network
Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran
More informationParametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010
Parametrc fractonal mputaton for mssng data analyss Jae Kwang Km Survey Workng Group Semnar March 29, 2010 1 Outlne Introducton Proposed method Fractonal mputaton Approxmaton Varance estmaton Multple mputaton
More informationPrimer on High-Order Moment Estimators
Prmer on Hgh-Order Moment Estmators Ton M. Whted July 2007 The Errors-n-Varables Model We wll start wth the classcal EIV for one msmeasured regressor. The general case s n Erckson and Whted Econometrc
More informationEquilibrium with Complete Markets. Instructor: Dmytro Hryshko
Equlbrum wth Complete Markets Instructor: Dmytro Hryshko 1 / 33 Readngs Ljungqvst and Sargent. Recursve Macroeconomc Theory. MIT Press. Chapter 8. 2 / 33 Equlbrum n pure exchange, nfnte horzon economes,
More informationThe optimal delay of the second test is therefore approximately 210 hours earlier than =2.
THE IEC 61508 FORMULAS 223 The optmal delay of the second test s therefore approxmately 210 hours earler than =2. 8.4 The IEC 61508 Formulas IEC 61508-6 provdes approxmaton formulas for the PF for smple
More informationStanford University CS254: Computational Complexity Notes 7 Luca Trevisan January 29, Notes for Lecture 7
Stanford Unversty CS54: Computatonal Complexty Notes 7 Luca Trevsan January 9, 014 Notes for Lecture 7 1 Approxmate Countng wt an N oracle We complete te proof of te followng result: Teorem 1 For every
More informationCHAPTER 14 GENERAL PERTURBATION THEORY
CHAPTER 4 GENERAL PERTURBATION THEORY 4 Introducton A partcle n orbt around a pont mass or a sphercally symmetrc mass dstrbuton s movng n a gravtatonal potental of the form GM / r In ths potental t moves
More informationFinding Dense Subgraphs in G(n, 1/2)
Fndng Dense Subgraphs n Gn, 1/ Atsh Das Sarma 1, Amt Deshpande, and Rav Kannan 1 Georga Insttute of Technology,atsh@cc.gatech.edu Mcrosoft Research-Bangalore,amtdesh,annan@mcrosoft.com Abstract. Fndng
More informationMore metrics on cartesian products
More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of
More informationLecture 4: Universal Hash Functions/Streaming Cont d
CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected
More informationYong Joon Ryang. 1. Introduction Consider the multicommodity transportation problem with convex quadratic cost function. 1 2 (x x0 ) T Q(x x 0 )
Kangweon-Kyungk Math. Jour. 4 1996), No. 1, pp. 7 16 AN ITERATIVE ROW-ACTION METHOD FOR MULTICOMMODITY TRANSPORTATION PROBLEMS Yong Joon Ryang Abstract. The optmzaton problems wth quadratc constrants often
More informationNUMERICAL DIFFERENTIATION
NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the
More informationCME 302: NUMERICAL LINEAR ALGEBRA FALL 2005/06 LECTURE 13
CME 30: NUMERICAL LINEAR ALGEBRA FALL 005/06 LECTURE 13 GENE H GOLUB 1 Iteratve Methods Very large problems (naturally sparse, from applcatons): teratve methods Structured matrces (even sometmes dense,
More informationU.C. Berkeley CS278: Computational Complexity Professor Luca Trevisan 2/21/2008. Notes for Lecture 8
U.C. Berkeley CS278: Computatonal Complexty Handout N8 Professor Luca Trevsan 2/21/2008 Notes for Lecture 8 1 Undrected Connectvty In the undrected s t connectvty problem (abbrevated ST-UCONN) we are gven
More informationGaussian Mixture Models
Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous
More informationThe lower and upper bounds on Perron root of nonnegative irreducible matrices
Journal of Computatonal Appled Mathematcs 217 (2008) 259 267 wwwelsevercom/locate/cam The lower upper bounds on Perron root of nonnegatve rreducble matrces Guang-Xn Huang a,, Feng Yn b,keguo a a College
More informationSociété de Calcul Mathématique SA
Socété de Calcul Mathématque SA Outls d'ade à la décson Tools for decson help Probablstc Studes: Normalzng the Hstograms Bernard Beauzamy December, 202 I. General constructon of the hstogram Any probablstc
More informationC/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1
C/CS/Phy9 Problem Set 3 Solutons Out: Oct, 8 Suppose you have two qubts n some arbtrary entangled state ψ You apply the teleportaton protocol to each of the qubts separately What s the resultng state obtaned
More informationCalculation of time complexity (3%)
Problem 1. (30%) Calculaton of tme complexty (3%) Gven n ctes, usng exhaust search to see every result takes O(n!). Calculaton of tme needed to solve the problem (2%) 40 ctes:40! dfferent tours 40 add
More informationLecture 3: Shannon s Theorem
CSE 533: Error-Correctng Codes (Autumn 006 Lecture 3: Shannon s Theorem October 9, 006 Lecturer: Venkatesan Guruswam Scrbe: Wdad Machmouch 1 Communcaton Model The communcaton model we are usng conssts
More informationAn (almost) unbiased estimator for the S-Gini index
An (almost unbased estmator for the S-Gn ndex Thomas Demuynck February 25, 2009 Abstract Ths note provdes an unbased estmator for the absolute S-Gn and an almost unbased estmator for the relatve S-Gn for
More informationE Tail Inequalities. E.1 Markov s Inequality. Non-Lecture E: Tail Inequalities
Algorthms Non-Lecture E: Tal Inequaltes If you hold a cat by the tal you learn thngs you cannot learn any other way. Mar Twan E Tal Inequaltes The smple recursve structure of sp lsts made t relatvely easy
More informationNeural networks. Nuno Vasconcelos ECE Department, UCSD
Neural networs Nuno Vasconcelos ECE Department, UCSD Classfcaton a classfcaton problem has two types of varables e.g. X - vector of observatons (features) n the world Y - state (class) of the world x X
More informationISSN: ISO 9001:2008 Certified International Journal of Engineering and Innovative Technology (IJEIT) Volume 3, Issue 1, July 2013
ISSN: 2277-375 Constructon of Trend Free Run Orders for Orthogonal rrays Usng Codes bstract: Sometmes when the expermental runs are carred out n a tme order sequence, the response can depend on the run
More informationWeek 5: Neural Networks
Week 5: Neural Networks Instructor: Sergey Levne Neural Networks Summary In the prevous lecture, we saw how we can construct neural networks by extendng logstc regresson. Neural networks consst of multple
More informationProf. Dr. I. Nasser Phys 630, T Aug-15 One_dimensional_Ising_Model
EXACT OE-DIMESIOAL ISIG MODEL The one-dmensonal Isng model conssts of a chan of spns, each spn nteractng only wth ts two nearest neghbors. The smple Isng problem n one dmenson can be solved drectly n several
More informationErrors for Linear Systems
Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch
More informationCS : Algorithms and Uncertainty Lecture 17 Date: October 26, 2016
CS 29-128: Algorthms and Uncertanty Lecture 17 Date: October 26, 2016 Instructor: Nkhl Bansal Scrbe: Mchael Denns 1 Introducton In ths lecture we wll be lookng nto the secretary problem, and an nterestng
More information2.3 Nilpotent endomorphisms
s a block dagonal matrx, wth A Mat dm U (C) In fact, we can assume that B = B 1 B k, wth B an ordered bass of U, and that A = [f U ] B, where f U : U U s the restrcton of f to U 40 23 Nlpotent endomorphsms
More informationCommunication-efficient Distributed Solutions to a System of Linear Equations with Laplacian Sparse Structure
Communcaton-effcent Dstrbuted Solutons to a System of Lnear Equatons wth Laplacan Sparse Structure Peng Wang, Yuanq Gao, Nanpeng Yu, We Ren, Janmng Lan, and D Wu Abstract Two communcaton-effcent dstrbuted
More informationAppendix B. The Finite Difference Scheme
140 APPENDIXES Appendx B. The Fnte Dfference Scheme In ths appendx we present numercal technques whch are used to approxmate solutons of system 3.1 3.3. A comprehensve treatment of theoretcal and mplementaton
More informationFinding Primitive Roots Pseudo-Deterministically
Electronc Colloquum on Computatonal Complexty, Report No 207 (205) Fndng Prmtve Roots Pseudo-Determnstcally Ofer Grossman December 22, 205 Abstract Pseudo-determnstc algorthms are randomzed search algorthms
More information