Consensus-Based Distributed Linear Filtering


49th IEEE Conference on Decision and Control, December 15-17, 2010, Hilton Atlanta Hotel, Atlanta, GA, USA

Consensus-Based Distributed Linear Filtering

Ion Matei and John S. Baras

Abstract— We address the consensus-based distributed linear filtering problem, where a discrete-time, linear stochastic process is observed by a network of sensors. We assume that the consensus weights are known and we first provide sufficient conditions under which the stochastic process is detectable, i.e. for a specific choice of consensus weights there exists a set of filtering gains such that the dynamics of the estimation errors (without noise) is asymptotically stable. Next, we provide a distributed, sub-optimal filtering scheme based on minimizing an upper bound on a quadratic filtering cost. In the stationary case, we provide sufficient conditions under which this scheme converges; conditions expressed in terms of the convergence properties of a set of coupled Riccati equations. We continue with presenting a connection between the consensus-based distributed linear filter and the optimal linear filter of a Markovian jump linear system, appropriately defined. More specifically, we show that if the Markovian jump linear system is (mean square) detectable, then the stochastic process is detectable under the consensus-based distributed linear filtering scheme. We also show that the optimal gains of a linear filter for estimating the state of a Markovian jump linear system, appropriately defined, can be used to approximate the optimal gains of the consensus-based linear filter.

I. Introduction

Sensor networks have broad applications in surveillance and monitoring of an environment, collaborative processing of information, and gathering scientific data from spatially distributed sources for environmental modeling and protection. A fundamental problem in sensor networks is developing distributed algorithms for the state estimation of a process of interest. Generically, a process is observed by a group of (mobile) sensors organized in a network. The goal of each sensor is to compute accurate state estimates. The distributed filtering (estimation) problem has received a lot of attention during the past thirty years. An important contribution was brought by Borkar and Varaiya [1], who address the distributed estimation problem of a random variable by a group of sensors. The particularity of their formulation is that both estimates and measurements are shared among neighboring sensors. The authors show that if the sensors form a communication ring, through which information is exchanged infinitely often, then the estimates converge asymptotically to the same value, i.e. they asymptotically agree. An extension of the results in reference [1] is given in [11]. The recent technological advances in mobile sensor networks have re-ignited the interest in the distributed estimation problem. Most papers focusing on distributed estimation propose different mechanisms for combining the Kalman filter with a consensus filter in order to ensure that the estimates asymptotically converge to the same value; these schemes will henceforth be called consensus-based distributed filtering (estimation) algorithms. Relevant results related to this approach can be found in [7], [8], [9], [12], [10], [2].

Ion Matei and John S. Baras are with the Institute for Systems Research and the Department of Electrical and Computer Engineering, University of Maryland, College Park; imatei, baras@umd.edu. This material is based upon work supported by the US Air Force Office of Scientific Research MURI award FA, by the Defense Advanced Research Projects Agency (DARPA) award number to the University of California - Berkeley and by BAE Systems award number W9F.
In this paper we address the consensus-based distributed linear filtering problem as well. We assume that each agent updates its (local) estimate in two steps. In the first step, an update is produced using a Luenberger observer type of filter. In the second step, called the consensus step, every sensor computes a convex combination between its local update and the updates received from the neighboring sensors. Our focus is not on designing the consensus weights, but on designing the filter gains. For given consensus weights, we will first give sufficient conditions for the existence of filter gains such that the dynamics of the estimation errors (without noise) is asymptotically stable. These sufficient conditions are also expressible in terms of the feasibility of a set of linear matrix inequalities. Next, we present a distributed (in the sense that each sensor uses only information available within its neighborhood), sub-optimal filtering algorithm, valid for time-varying topologies as well, resulting from minimizing an upper bound on a quadratic cost expressed in terms of the covariance matrices of the estimation errors. In the case where the matrices defining the stochastic process and the consensus weights are time invariant, we present sufficient conditions such that the aforementioned distributed algorithm produces filter gains which converge and ensure the stability of the dynamics of the covariance matrices of the estimation errors. We will also present a connection between the consensus-based linear filter and the linear filtering of a Markovian jump linear system appropriately defined. More precisely, we show that if the aforementioned Markovian jump linear system is (mean square) detectable, then the stochastic process is detectable as well under the consensus-based distributed linear filtering scheme. Finally, we show that the optimal gains of a linear filter for the state estimation of the Markovian jump linear system can be used to approximate the optimal gains of the consensus-based distributed linear filtering strategy. We would like to mention that, due to space limitations, not all the results are proved. The reader is invited to consult the extended version of this paper, reference [6], which contains all the missing proofs.

Paper structure: In Section II we describe the problems addressed in this note. Section III introduces the sufficient conditions for detectability under the consensus-based linear filtering scheme, together with a test expressed in terms of the feasibility of a set of linear matrix inequalities. In Section IV we present a sub-optimal distributed consensus-based linear filtering scheme with quantifiable performance. Section V makes a connection between the consensus-based distributed linear filtering algorithm and the linear filtering scheme for a Markovian jump linear system.

Notations and Abbreviations: We represent the property of positive (semi-positive) definiteness of a symmetric matrix $A$ by $A \succ 0$ ($A \succeq 0$). By convention, we say that a symmetric matrix $A$ is negative definite (semi-definite) if $-A \succ 0$ ($-A \succeq 0$), and we denote this by $A \prec 0$ ($A \preceq 0$). Given a set of square matrices $\{A_i\}_{i=1}^{N}$, by $\mathrm{diag}(A_i,\, i = 1\ldots N)$ we understand the block diagonal matrix which contains the matrices $A_i$ on the main diagonal. By $A \succ B$ we understand that $A - B$ is positive definite. We use the abbreviations CBDLF, MJLS and LMI for Consensus-Based Distributed Linear Filter(ing), Markovian Jump Linear System and Linear Matrix Inequality, respectively.

Remark 1.1: Given a positive integer $N$, a set of vectors $\{x_i\}_{i=1}^{N}$, a set of non-negative scalars $\{p_i\}_{i=1}^{N}$ summing up to one and a positive definite matrix $Q$, the following inequality holds:

$$\Big(\sum_{i=1}^{N} p_i x_i\Big)' Q \Big(\sum_{i=1}^{N} p_i x_i\Big) \le \sum_{i=1}^{N} p_i x_i' Q x_i. \qquad (1)$$

II. Problem formulation

We consider a stochastic process modeled by a discrete-time linear dynamic equation

$$x(k+1) = A(k)x(k) + w(k), \quad x(0) = x_0, \qquad (2)$$

where $x(k) \in \mathbb{R}^n$ is the state vector and $w(k) \in \mathbb{R}^n$ is a driving noise, assumed Gaussian with zero mean and (possibly time varying) covariance matrix $\Sigma_w(k)$. The initial condition $x_0$ is assumed to be Gaussian with mean $\mu_0$ and covariance matrix $\Sigma_0$. The state of the process is observed by a network of $N$ sensors indexed by $i$, whose sensing models are given by

$$y_i(k) = C_i(k)x(k) + v_i(k), \quad i = 1,\ldots,N, \qquad (3)$$

where $y_i(k) \in \mathbb{R}^{r_i}$ is the observation made by sensor $i$ and $v_i(k) \in \mathbb{R}^{r_i}$ is the measurement noise, assumed Gaussian with zero mean and (possibly time varying) covariance matrix $\Sigma_{v_i}(k)$. We assume that the matrices $\{\Sigma_{v_i}(k)\}_{i=1}^{N}$ and $\Sigma_w(k)$ are positive definite for $k \ge 0$ and that the initial state $x_0$ and the noises $v_i(k)$ and $w(k)$ are independent for all $k \ge 0$. For later reference we also define $\Sigma_{v_i}^{1/2}(k)$ and $\Sigma_w^{1/2}(k)$, where $\Sigma_{v_i}(k) \triangleq \Sigma_{v_i}^{1/2}(k)\Sigma_{v_i}^{1/2}(k)'$ and $\Sigma_w(k) \triangleq \Sigma_w^{1/2}(k)\Sigma_w^{1/2}(k)'$.

The sensors form a communication network whose topology is modeled by a directed graph that describes the information exchanged among agents. The goal of the agents is to (locally) compute estimates of the state of the process (2). Let $\hat{x}_i(k)$ denote the state estimate computed by sensor $i$ and let $\epsilon_i(k)$ denote the estimation error, i.e. $\epsilon_i(k) \triangleq x(k) - \hat{x}_i(k)$. The covariance matrix of the estimation error of sensor $i$ is denoted by $\Sigma_i(k) \triangleq E[\epsilon_i(k)\epsilon_i(k)']$, with $\Sigma_i(0) = \Sigma_0$. The sensors update their estimates in two steps. In the first step, an intermediate estimate, denoted by $\varphi_i(k)$, is produced using a Luenberger observer filter

$$\varphi_i(k) = A(k)\hat{x}_i(k) + L_i(k)\big(y_i(k) - C_i(k)\hat{x}_i(k)\big), \quad i = 1,\ldots,N, \qquad (4)$$

where $L_i(k)$ is the filter gain. In the second step, the new state estimate of sensor $i$ is generated by a convex combination between $\varphi_i(k)$ and all other intermediate estimates within its communication neighborhood, i.e.

$$\hat{x}_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\varphi_j(k), \quad i = 1,\ldots,N, \qquad (5)$$

where the $p_{ij}(k)$ are non-negative scalars summing up to one ($\sum_{j=1}^{N} p_{ij}(k) = 1$), and $p_{ij}(k) = 0$ if no link from $j$ to $i$ exists at time $k$. Having $p_{ij}(k)$ dependent on time accounts for a possibly time varying communication topology. Combining (4) and (5) we obtain the dynamic equations for the consensus-based distributed filter:

$$\hat{x}_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\Big[A(k)\hat{x}_j(k) + L_j(k)\big(y_j(k) - C_j(k)\hat{x}_j(k)\big)\Big], \qquad (6)$$

for $i = 1,\ldots,N$.
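To make the two-step update concrete, the following is a minimal NumPy sketch of one CBDLF iteration, equations (4)-(5); the function name and the data layout (lists of per-sensor matrices, a row-stochastic weight matrix `P`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cbdlf_step(A, C, L, P, x_hat, y):
    """One CBDLF time step: Luenberger update (4) followed by the consensus step (5).

    A: (n, n) process matrix; C[i]: (r_i, n) observation matrices;
    L[i]: (n, r_i) filter gains; P: (N, N) row-stochastic consensus weights;
    x_hat[i]: (n,) current local estimates; y[i]: (r_i,) measurements.
    """
    N = len(x_hat)
    # Step 1: intermediate estimates phi_i(k) from the Luenberger observer (4)
    phi = [A @ x_hat[i] + L[i] @ (y[i] - C[i] @ x_hat[i]) for i in range(N)]
    # Step 2: convex combination over the communication neighborhood (5)
    x_next = [sum(P[i, j] * phi[j] for j in range(N)) for i in range(N)]
    return x_next
```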
From (6), the estimation errors evolve according to

$$\epsilon_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\Big[\big(A(k) - L_j(k)C_j(k)\big)\epsilon_j(k) + w(k) - L_j(k)v_j(k)\Big], \quad i = 1,\ldots,N. \qquad (7)$$

Definition 2.1: (distributed detectability) Assuming that $A(k)$, $C(k) \triangleq \{C_i(k)\}_{i=1}^{N}$ and $p(k) \triangleq \{p_{ij}(k)\}_{i,j=1}^{N}$ are time invariant, we say that the linear system (2) is detectable using the CBDLF scheme (6) if there exists a set of matrices $L \triangleq \{L_i\}_{i=1}^{N}$ such that the system (7), without the noise, is asymptotically stable.

We introduce the following finite horizon quadratic filtering cost function

$$J_K(L(\cdot)) = \sum_{k=0}^{K}\sum_{i=1}^{N} E\big[\|\epsilon_i(k)\|^2\big], \qquad (8)$$

where by $L(\cdot)$ we understand the set of matrices $L(\cdot) \triangleq \{L_i(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$. The optimal filtering gains represent the solution of the following optimization problem

$$L^*(\cdot) = \arg\min_{L(\cdot)} J_K(L(\cdot)). \qquad (9)$$

Assuming that $A(k)$, $C(k) \triangleq \{C_i(k)\}_{i=1}^{N}$, $\Sigma_w(k)$, $\Sigma_v(k) \triangleq \{\Sigma_{v_i}(k)\}_{i=1}^{N}$ and $p(k) \triangleq \{p_{ij}(k)\}_{i,j=1}^{N}$ are time invariant, we can also define the infinite horizon filtering cost function

$$J_\infty(L) = \lim_{K\to\infty} \frac{1}{K} J_K(L) = \lim_{k\to\infty} \sum_{i=1}^{N} E\big[\|\epsilon_i(k)\|^2\big], \qquad (10)$$

where $L \triangleq \{L_i\}_{i=1}^{N}$ is the set of steady-state filtering gains. By solving the optimization problem

$$L^* = \arg\min_{L} J_\infty(L), \qquad (11)$$

we obtain the optimal steady-state filter gains.
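Definition 2.1 asks for asymptotic stability of the noise-free error dynamics (7). For time-invariant data this can be tested numerically by stacking (7) into a single linear map, as in the following assumed helper (the function name and data layout are ours, not the paper's):

```python
import numpy as np

def cbdlf_error_dynamics_stable(A, C, L, P):
    """Definition 2.1 check for time-invariant data: stack the noise-free error
    dynamics (7) into one linear map and test its spectral radius."""
    N = len(C)
    # block matrix whose (i, j) block is p_ij (A - L_j C_j)
    blocks = [[P[i, j] * (A - L[j] @ C[j]) for j in range(N)] for i in range(N)]
    T = np.block(blocks)
    return np.max(np.abs(np.linalg.eigvals(T))) < 1.0
```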

In the next sections we will address the following problems:

Problem 2.1: (Detectability conditions) Under the above setup, we want to find conditions under which the system (2) is detectable in the sense of Definition 2.1.

Problem 2.2: (Sub-optimal scheme for consensus-based distributed filtering) Ideally, we would like to obtain the optimal filter gains by solving the optimization problems (9) and (11), respectively. Due to the complexity of these problems, we will not provide the optimal filtering gains but rather focus on providing a sub-optimal scheme with quantifiable performance.

Problem 2.3: (Connection with the linear filtering of a Markovian jump linear system) We make a parallel between the consensus-based distributed linear filtering scheme and the linear filtering of a particular Markovian jump linear system.

III. Distributed detectability

Let us assume that no single pair $(A, C_i)$ is detectable in the classical sense, but the pair $(A, C)$ is detectable, where $C' = (C_1',\ldots,C_N')$. In this case, we can design a stable (centralized) Luenberger observer filter. The question is, can we obtain a stable consensus-based distributed filter? As Example 3.1 of [6] shows, this is not true in general. That is why it is important to find conditions under which the CBDLF can produce stable estimates.

Proposition 3.1: Consider the linear dynamics (2)-(3). Assume that in the CBDLF scheme (6) we have $p_{ij} = \frac{1}{N}$ and that $\hat{x}_i(0) = x_0$, for all $i, j = 1,\ldots,N$. If the pair $(A, C)$ is detectable, then the system (2) is detectable as well, in the sense of Definition 2.1.

Proof: Rewrite the matrix $C$ as $C = \sum_{i=1}^{N}\bar{C}_i$, where $\bar{C}_i' = \big(O'_{n\times r_1},\ldots,O'_{n\times r_{i-1}},\, C_i',\, O'_{n\times r_{i+1}},\ldots,O'_{n\times r_N}\big)$. Ignoring the noise, we define the measurements $\bar{y}_i(k) = N\bar{C}_i x(k)$, which are equivalent to the ones in (3). Under the assumption that $p_{ij} = \frac{1}{N}$ and $\hat{x}_i(0) = x_0$ for all $i, j = 1,\ldots,N$, it follows that the estimation errors obey the dynamics

$$\epsilon(k+1) = \Big(A - \sum_{i=1}^{N} L_i\bar{C}_i\Big)\epsilon(k). \qquad (12)$$

Setting $L_i = L$ for $i = 1,\ldots,N$, it follows that $\epsilon(k+1) = (A - LC)\epsilon(k)$. Since the pair $(A, C)$ is detectable, there exists a matrix $L$ such that $A - LC$ has all eigenvalues inside the unit circle, and therefore the dynamics (12) is asymptotically stable, which implies that (2) is detectable in the sense of Definition 2.1.

The previous proposition tells us that if we achieve (average) consensus between the state estimates at each time instant, and if the pair $(A, C)$ is detectable (in the classical sense), then the system (2) is detectable in the sense of Definition 2.1. However, achieving consensus at each time instant can be time consuming and numerically costly, and that is why it is important to find (testable) conditions under which the CBDLF produces stable estimates.

Lemma 3.1: (sufficient conditions for distributed detectability) If there exist a set of symmetric, positive definite matrices $\{Q_i\}_{i=1}^{N}$ and a set of matrices $\{L_i\}_{i=1}^{N}$ such that

$$Q_i = \sum_{j=1}^{N} p_{ij}\big(A - L_jC_j\big)'Q_j\big(A - L_jC_j\big) + S_i, \quad i = 1,\ldots,N, \qquad (13)$$

for some positive definite matrices $\{S_i\}_{i=1}^{N}$, then the system (2) is detectable in the sense of Definition 2.1.

Proof: The dynamics of the estimation error without noise is given by

$$\epsilon_i(k+1) = \sum_{j=1}^{N} p_{ij}\big(A - L_jC_j\big)\epsilon_j(k), \quad i = 1,\ldots,N. \qquad (14)$$

In order to prove the stated result we have to show that (14) is asymptotically stable. We define the Lyapunov function $V(k) = \sum_{i=1}^{N}\epsilon_i(k)'Q_i\epsilon_i(k)$, and our goal is to show that $V(k+1) - V(k) < 0$ for all $k \ge 0$. The Lyapunov difference can be upper bounded by

$$V(k+1) - V(k) \le \sum_{i=1}^{N}\sum_{j=1}^{N} p_{ij}\,\epsilon_j(k)'\big(A - L_jC_j\big)'Q_i\big(A - L_jC_j\big)\epsilon_j(k) - \sum_{i=1}^{N}\epsilon_i(k)'Q_i\epsilon_i(k), \qquad (15)$$

where the inequality followed from Remark 1.1. By changing the summation order we can further write

$$V(k+1) - V(k) \le \sum_{i=1}^{N}\epsilon_i(k)'\Big[\sum_{j=1}^{N} p_{ij}\big(A - L_jC_j\big)'Q_j\big(A - L_jC_j\big) - Q_i\Big]\epsilon_i(k).$$

Using (13) yields $V(k+1) - V(k) \le -\sum_{i=1}^{N}\epsilon_i(k)'S_i\epsilon_i(k) < 0$, since the $\{S_i\}_{i=1}^{N}$ are positive definite matrices, and therefore asymptotic stability follows.

The following result relates the existence of the sets of matrices $\{Q_i\}_{i=1}^{N}$ and $\{L_i\}_{i=1}^{N}$ such that (13) is satisfied with the feasibility of a set of linear matrix inequalities (LMIs).
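Before turning to the LMI test, note that condition (13), as reconstructed above, can be checked directly for candidate matrices $\{Q_i\}$ and gains $\{L_i\}$; the helper below is an illustrative sketch, not part of the paper:

```python
import numpy as np

def check_detectability_condition(A, C, L, P, Q, tol=1e-9):
    """Numerical check of the sufficient condition of Lemma 3.1: for every i,
    Q_i - sum_j p_ij (A - L_j C_j)' Q_j (A - L_j C_j) must be positive definite."""
    N = len(Q)
    for i in range(N):
        S_i = Q[i].copy()
        for j in range(N):
            F_j = A - L[j] @ C[j]
            S_i -= P[i, j] * F_j.T @ Q[j] @ F_j
        # positive definiteness via the smallest eigenvalue of the symmetric part
        if np.min(np.linalg.eigvalsh((S_i + S_i.T) / 2)) <= tol:
            return False
    return True
```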
Proposition 3.2: (distributed detectability test) The linear system (2) is detectable in the sense of Definition 2.1 if the following linear matrix inequalities, in the variables $\{X_i\}_{i=1}^{N}$ and $\{Y_i\}_{i=1}^{N}$, are feasible:

$$\begin{pmatrix} X_i & M_i \\ M_i' & \mathrm{diag}(X_j,\, j = 1\ldots N) \end{pmatrix} \succ 0, \qquad (16)$$

for $i = 1,\ldots,N$, where $M_i = \big(\sqrt{p_{i1}}\,(A'X_1 - C_1'Y_1'),\ldots,\sqrt{p_{iN}}\,(A'X_N - C_N'Y_N')\big)$ and where the $\{X_i\}_{i=1}^{N}$ are symmetric. Moreover, a stable CBDLF is obtained by choosing the filter gains as $L_i = X_i^{-1}Y_i$ for $i = 1,\ldots,N$.

Proof: Given in [6].
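The feasibility problem (16) can be handed to a semidefinite programming solver. The CVXPY sketch below encodes the block inequalities as reconstructed above (the $\sqrt{p_{ij}}$ scaling and block ordering are inferred from the Schur complement of (13), so treat them as assumptions) and recovers $L_i = X_i^{-1}Y_i$ when the LMIs are feasible; it requires an SDP-capable solver such as SCS.

```python
import numpy as np
import cvxpy as cp

def cbdlf_gains_via_lmi(A, C, P, eps=1e-6):
    """Feasibility test in the spirit of Proposition 3.2 (block structure as
    reconstructed in the text above); returns gains L_i = X_i^{-1} Y_i if feasible."""
    n, N = A.shape[0], len(C)
    X = [cp.Variable((n, n), symmetric=True) for _ in range(N)]
    Y = [cp.Variable((n, C[j].shape[0])) for j in range(N)]
    constraints = []
    for i in range(N):
        # M_i stacks sqrt(p_ij) (A' X_j - C_j' Y_j') horizontally, j = 1..N
        M_i = cp.hstack([np.sqrt(P[i, j]) * (A.T @ X[j] - C[j].T @ Y[j].T)
                         for j in range(N)])
        # block-diagonal matrix diag(X_1, ..., X_N)
        D = cp.bmat([[X[j] if j == l else np.zeros((n, n)) for l in range(N)]
                     for j in range(N)])
        block = cp.bmat([[X[i], M_i], [M_i.T, D]])
        # symmetrize explicitly so CVXPY accepts the positive definiteness constraint
        constraints.append(0.5 * (block + block.T) >> eps * np.eye((N + 1) * n))
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()  # needs an SDP-capable solver, e.g. SCS
    if prob.status not in ("optimal", "optimal_inaccurate"):
        return None  # LMIs infeasible (or the solver failed)
    return [np.linalg.solve(X[i].value, Y[i].value) for i in range(N)]
```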

IV. Sub-Optimal Consensus-Based Distributed Linear Filtering

Obtaining the closed form solution of the optimization problem (9) is a challenging problem, which is in the same spirit as the decentralized optimal control problem. In this section we provide a sub-optimal algorithm for computing the filter gains of the CBDLF, with quantifiable performance in the sense that we compute a set of filtering gains which guarantee a certain level of performance with respect to the quadratic cost (8).

A. Finite Horizon Sub-Optimal Consensus-Based Distributed Linear Filtering

The sub-optimal scheme for computing the CBDLF gains results from minimizing an upper bound of the quadratic filtering cost (8). The following proposition gives upper bounds for the covariance matrices of the estimation errors.

Proposition 4.1: Consider the following coupled difference equations

$$Q_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\Big[\big(A(k) - L_j(k)C_j(k)\big)Q_j(k)\big(A(k) - L_j(k)C_j(k)\big)' + L_j(k)\Sigma_{v_j}(k)L_j(k)'\Big] + \Sigma_w(k), \qquad (17)$$

with $Q_i(0) = \Sigma_i(0)$, for $i = 1,\ldots,N$. Then the following inequality holds

$$\Sigma_i(k) \preceq Q_i(k), \qquad (18)$$

for $i = 1,\ldots,N$ and for all $k \ge 0$.

Proof: Given in [6].

Defining the finite horizon quadratic cost function

$$\bar{J}_K(L(\cdot)) = \sum_{k=1}^{K}\sum_{i=1}^{N}\mathrm{tr}\big(Q_i(k)\big), \qquad (19)$$

the next corollary follows immediately.

Corollary 4.1: The following inequalities hold:

$$J_K(L(\cdot)) \le \bar{J}_K(L(\cdot)), \qquad \limsup_{K\to\infty}\frac{1}{K}J_K(L) \le \limsup_{K\to\infty}\frac{1}{K}\bar{J}_K(L). \qquad (20)$$

Proof: Follows immediately from Proposition 4.1.

In the previous corollary we obtained an upper bound on the filtering cost function. Our sub-optimal consensus-based distributed filtering scheme will result from minimizing this upper bound in terms of the filtering gains $\{L_i(k)\}$, i.e.

$$\min_{L(\cdot)} \bar{J}_K(L(\cdot)). \qquad (21)$$

Proposition 4.2: The optimal solution of the optimization problem (21) is

$$L_i^*(k) = A(k)Q_i(k)C_i(k)'\big[\Sigma_{v_i}(k) + C_i(k)Q_i(k)C_i(k)'\big]^{-1}, \qquad (22)$$

and the optimal value is given by

$$\bar{J}_K(L^*(\cdot)) = \sum_{k=1}^{K}\sum_{i=1}^{N}\mathrm{tr}\big(Q_i(k)\big),$$

where $Q_i(k)$ is computed using

$$Q_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\Big[A(k)Q_j(k)A(k)' + \Sigma_w(k) - A(k)Q_j(k)C_j(k)'\big(\Sigma_{v_j}(k) + C_j(k)Q_j(k)C_j(k)'\big)^{-1}C_j(k)Q_j(k)A(k)'\Big], \qquad (23)$$

with $Q_i(0) = \Sigma_i(0)$ and for $i = 1,\ldots,N$.

Proof: Let $\bar{J}_K(L(\cdot))$ be the cost function when an arbitrary set of filtering gains $L(\cdot) \triangleq \{L_i(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$ is used in (17). We will show that $\bar{J}_K(L^*(\cdot)) \le \bar{J}_K(L(\cdot))$, which in turn will show that $L^*(\cdot) \triangleq \{L_i^*(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$ is the optimal solution of the optimization problem (21). Let $\{Q_i^*(k)\}$ and $\{Q_i(k)\}$ be the matrices obtained when $L^*(\cdot)$ and $L(\cdot)$, respectively, are substituted in (17). In what follows we will show by induction that $Q_i^*(k) \preceq Q_i(k)$ for $k \ge 0$ and $i = 1,\ldots,N$, which basically proves that $\bar{J}_K(L^*(\cdot)) \le \bar{J}_K(L(\cdot))$ for any $L(\cdot)$. To simplify the proof, we will omit in what follows the time index for some matrices and for the consensus weights. Substituting $\{L_i^*(k),\, k \ge 0\}$ in (17), after some matrix manipulations we get

$$Q_i^*(k+1) = \sum_{j=1}^{N} p_{ij}\Big[AQ_j^*(k)A' + \Sigma_w - AQ_j^*(k)C_j'\big(\Sigma_{v_j} + C_jQ_j^*(k)C_j'\big)^{-1}C_jQ_j^*(k)A'\Big], \quad Q_i^*(0) = \Sigma_i(0), \quad i = 1,\ldots,N.$$

We can derive the following matrix identity (for simplicity we give up the time index):

$$\begin{aligned}(A - L_iC_i)Q_i(A - L_iC_i)' + L_i\Sigma_{v_i}L_i' &= (A - L_i^*C_i)Q_i(A - L_i^*C_i)' + L_i^*\Sigma_{v_i}L_i^{*\prime}\\ &\quad + (L_i - L_i^*)\big(\Sigma_{v_i} + C_iQ_iC_i'\big)(L_i - L_i^*)'.\end{aligned} \qquad (24)$$

Assume that $Q_i^*(k) \preceq Q_i(k)$ for $i = 1,\ldots,N$. Using identity (24), the dynamics of $Q_i(k)$ becomes

$$Q_i(k+1) = \sum_{j=1}^{N} p_{ij}\Big[(A - L_j^*C_j)Q_j(k)(A - L_j^*C_j)' + L_j^*\Sigma_{v_j}L_j^{*\prime} + (L_j - L_j^*)\big(\Sigma_{v_j} + C_jQ_j(k)C_j'\big)(L_j - L_j^*)' + \Sigma_w\Big].$$

The difference $Q_i(k+1) - Q_i^*(k+1)$ can be written as

$$Q_i(k+1) - Q_i^*(k+1) = \sum_{j=1}^{N} p_{ij}\Big[(A - L_j^*C_j)\big(Q_j(k) - Q_j^*(k)\big)(A - L_j^*C_j)' + (L_j - L_j^*)\big(\Sigma_{v_j} + C_jQ_j(k)C_j'\big)(L_j - L_j^*)'\Big].$$

Since $\Sigma_{v_i} + C_iQ_i(k)C_i'$ is positive definite for all $k \ge 0$ and $i = 1,\ldots,N$, and since we assumed that $Q_i^*(k) \preceq Q_i(k)$, it follows that $Q_i^*(k+1) \preceq Q_i(k+1)$. Hence we obtain that $\bar{J}_K(L^*(\cdot)) \le \bar{J}_K(L(\cdot))$ for any set of filtering gains $L(\cdot) = \{L_i(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$, which concludes the proof.

We summarize in the following algorithm the sub-optimal CBDLF scheme resulting from Proposition 4.2.
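Before stating the algorithm, here is a quick NumPy check (random data, assumed dimensions) of the completion-of-squares identity (24) that drives the induction step above, with $L^*$ computed as in (22):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 2
A = rng.standard_normal((n, n))
C = rng.standard_normal((r, n))
Q = np.eye(n) + 0.1 * np.ones((n, n))          # any positive definite Q_i
Sv = np.eye(r)                                  # any positive definite Sigma_{v_i}
L_star = A @ Q @ C.T @ np.linalg.inv(Sv + C @ Q @ C.T)   # optimal gain, eq. (22)
L = rng.standard_normal((n, r))                 # arbitrary gain

def f(L):
    # left-hand side of identity (24) as a function of the gain
    F = A - L @ C
    return F @ Q @ F.T + L @ Sv @ L.T

rhs = f(L_star) + (L - L_star) @ (Sv + C @ Q @ C.T) @ (L - L_star).T
assert np.allclose(f(L), rhs)   # identity (24): f(L) = f(L*) + quadratic penalty
```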
Algorithm 1: Consensus-Based Distributed Linear Filtering Algorithm
1: Input: $\mu_0$, $\Sigma_0$; Initialization: $\hat{x}_i(0) = \mu_0$, $Y_i(0) = \Sigma_0$, $i = 1,\ldots,N$
2: while new data exists
3:   Compute the filter gains: $L_i \leftarrow A Y_i C_i'\big(\Sigma_{v_i} + C_i Y_i C_i'\big)^{-1}$
4:   Update the state estimates: $\varphi_i \leftarrow A\hat{x}_i + L_i(y_i - C_i\hat{x}_i)$, $\quad \hat{x}_i \leftarrow \sum_{j=1}^{N} p_{ij}\varphi_j$
5:   Update the matrices $Y_i$: $Y_i \leftarrow \sum_{j=1}^{N} p_{ij}\big((A - L_jC_j)Y_j(A - L_jC_j)' + L_j\Sigma_{v_j}L_j'\big) + \Sigma_w$
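A self-contained NumPy sketch of Algorithm 1 for the time-invariant case; the variable names and data layout (`ys[k][i]` holding sensor $i$'s measurement at time $k$, a row-stochastic weight matrix `P`) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def cbdlf_filter(A, C, Sw, Sv, P, mu0, Sigma0, ys):
    """Sub-optimal CBDLF (Algorithm 1): each sensor i keeps an estimate x_hat[i]
    and a matrix Y[i]; gains follow eq. (22) and Y follows the recursion (17)."""
    N, n = len(C), A.shape[0]
    x_hat = [mu0.copy() for _ in range(N)]
    Y = [Sigma0.copy() for _ in range(N)]
    estimates = []
    for y in ys:
        # filter gains, eq. (22)
        L = [A @ Y[i] @ C[i].T @ np.linalg.inv(Sv[i] + C[i] @ Y[i] @ C[i].T)
             for i in range(N)]
        # Luenberger update followed by the consensus step, eqs. (4)-(5)
        phi = [A @ x_hat[i] + L[i] @ (y[i] - C[i] @ x_hat[i]) for i in range(N)]
        x_hat = [sum(P[i, j] * phi[j] for j in range(N)) for i in range(N)]
        # propagate the covariance upper bounds, eq. (17)
        Y_next = []
        for i in range(N):
            acc = Sw.copy()
            for j in range(N):
                F = A - L[j] @ C[j]
                acc += P[i, j] * (F @ Y[j] @ F.T + L[j] @ Sv[j] @ L[j].T)
            Y_next.append(acc)
        Y = Y_next
        estimates.append([x.copy() for x in x_hat])
    return estimates
```

In the time-invariant setting of Section IV-B below, running the same gain and $Y_i$ updates until they settle yields the steady-state gains, provided the convergence conditions of the Appendix hold.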

B. Infinite Horizon Consensus-Based Distributed Filtering

We now assume that the matrices $A(k)$, $\{C_i(k)\}_{i=1}^{N}$, $\{\Sigma_{v_i}(k)\}_{i=1}^{N}$ and $\Sigma_w(k)$ and the weights $\{p_{ij}(k),\, i,j = 1,\ldots,N\}$ are time invariant. We are interested in finding out under what conditions Algorithm 1 converges and if the filtering gains produce stable estimates. From the previous section we note that the optimal infinite horizon cost can be written as

$$\bar{J}^* = \lim_{k\to\infty}\sum_{i=1}^{N}\mathrm{tr}\big(Q_i(k)\big),$$

where the dynamics of $Q_i(k)$ is given by

$$Q_i(k+1) = \sum_{j=1}^{N} p_{ij}\Big[AQ_j(k)A' + \Sigma_w - AQ_j(k)C_j'\big(\Sigma_{v_j} + C_jQ_j(k)C_j'\big)^{-1}C_jQ_j(k)A'\Big], \qquad (25)$$

and the optimal filtering gains are given by

$$L_i(k) = AQ_i(k)C_i'\big(\Sigma_{v_i} + C_iQ_i(k)C_i'\big)^{-1},$$

for $i = 1,\ldots,N$. Assuming that (25) converges, the optimal value of the cost is given by

$$\bar{J}^* = \sum_{i=1}^{N}\mathrm{tr}\big(\bar{Q}_i\big),$$

where the $\{\bar{Q}_i\}_{i=1}^{N}$ satisfy

$$\bar{Q}_i = \sum_{j=1}^{N} p_{ij}\Big[A\bar{Q}_jA' + \Sigma_w - A\bar{Q}_jC_j'\big(\Sigma_{v_j} + C_j\bar{Q}_jC_j'\big)^{-1}C_j\bar{Q}_jA'\Big]. \qquad (26)$$

Sufficient conditions under which (26) has a unique solution and (25) converges to this unique solution are provided by Proposition 1.1 in the Appendix.

V. Connection with the Markovian Jump Linear System State Estimation

In this section we present a connection between the detectability of (2) in the sense of Definition 2.1 and the detectability property of a MJLS, which is going to be defined in what follows. We also show that the optimal gains of a linear filter for the state estimation of the aforementioned MJLS can be used to approximate the solution of the optimization problem (9), which gives the optimal CBDLF. We assume that the matrix $P(k)$ describing the communication topology of the sensors is doubly stochastic, and we assume, without loss of generality, that the matrices $\{C_i(k),\, k \ge 0\}$ in the sensing model (3) have the same dimension. We define the following Markovian jump linear system

$$\begin{aligned}\xi(k+1) &= \tilde{A}_{\theta(k)}(k)\xi(k) + \tilde{B}_{\theta(k)}(k)\tilde{w}(k),\\ z(k) &= \tilde{C}_{\theta(k)}(k)\xi(k) + \tilde{D}_{\theta(k)}(k)\tilde{v}(k), \quad \xi(0) = \xi_0,\end{aligned} \qquad (27)$$

where $\xi(k)$ is the state, $z(k)$ is the output, $\theta(k) \in \{1,\ldots,N\}$ is a Markov chain with probability transition matrix $P(k)$, and $\tilde{w}(k)$ and $\tilde{v}(k)$ are independent Gaussian noises with zero mean and identity covariance matrices. Also, $\xi_0$ is Gaussian with mean $\mu_0$ and covariance matrix $\Sigma_0$. We denote by $\pi_i(k)$ the probability distribution of $\theta(k)$ ($\Pr(\theta(k) = i) = \pi_i(k)$) and we assume that $\pi_i(0) > 0$. We have that $\tilde{A}_{\theta(k)}(k) \in \{\tilde{A}_i(k)\}_{i=1}^{N}$, $\tilde{B}_{\theta(k)}(k) \in \{\tilde{B}_i(k)\}_{i=1}^{N}$, $\tilde{C}_{\theta(k)}(k) \in \{\tilde{C}_i(k)\}_{i=1}^{N}$ and $\tilde{D}_{\theta(k)}(k) \in \{\tilde{D}_i(k)\}_{i=1}^{N}$, where the index $i$ refers to the state of $\theta(k)$. We set

$$\tilde{A}_i(k) = A(k), \quad \tilde{B}_i(k) = \sqrt{\tfrac{\pi_i(0)}{\pi_i(k)}}\,\Sigma_w^{1/2}(k), \quad \tilde{C}_i(k) = \tfrac{1}{\sqrt{\pi_i(0)}}\,C_i(k), \quad \tilde{D}_i(k) = \tfrac{1}{\sqrt{\pi_i(k)}}\,\Sigma_{v_i}^{1/2}(k), \qquad (28)$$

for all $i$ and $k \ge 0$ (note that since $P(k)$ is assumed doubly stochastic and $\pi_i(0) > 0$, we have that $\pi_i(k) > 0$ for all $i$ and $k \ge 0$). In addition, $\xi_0$, $\theta(k)$, $\tilde{w}(k)$ and $\tilde{v}(k)$ are assumed independent for all $k \ge 0$. The random process $\theta(k)$ is also called the mode. Assuming that the mode is directly observed, a linear filter for the state estimation is given by

$$\hat{\xi}(k+1) = \tilde{A}_{\theta(k)}(k)\hat{\xi}(k) + M_{\theta(k)}(k)\big(z(k) - \tilde{C}_{\theta(k)}(k)\hat{\xi}(k)\big), \qquad (29)$$

where we assume that the filter gain $M_{\theta(k)}$ depends only on the current mode. The dynamics of the estimation error $e(k) \triangleq \xi(k) - \hat{\xi}(k)$ is given by

$$e(k+1) = \big(\tilde{A}_{\theta(k)}(k) - M_{\theta(k)}(k)\tilde{C}_{\theta(k)}(k)\big)e(k) + \tilde{B}_{\theta(k)}(k)\tilde{w}(k) - M_{\theta(k)}(k)\tilde{D}_{\theta(k)}(k)\tilde{v}(k). \qquad (30)$$

Let $\mu(k)$ and $Y(k)$ denote the mean and the covariance matrix of $e(k)$, i.e. $\mu(k) \triangleq E[e(k)]$ and $Y(k) \triangleq E[e(k)e(k)']$, respectively. We also define the mean and the covariance matrix of $e(k)$ when the system is in mode $i$, i.e. $\mu_i(k) \triangleq E[e(k)\chi_{\{\theta(k)=i\}}]$ and $Y_i(k) \triangleq E[e(k)e(k)'\chi_{\{\theta(k)=i\}}]$, where $\chi_{\{\theta(k)=i\}}$ is the indicator function. It follows immediately that $\mu(k) = \sum_{i=1}^{N}\mu_i(k)$ and $Y(k) = \sum_{i=1}^{N}Y_i(k)$.

Definition 5.1: The optimal linear filter (29) is obtained by minimizing the following quadratic finite horizon cost function

$$J_K(M(\cdot)) = \sum_{k=1}^{K}\mathrm{tr}\big(Y(k)\big) = \sum_{k=1}^{K}\sum_{i=1}^{N}\mathrm{tr}\big(Y_i(k)\big), \qquad (31)$$

where $M(\cdot) \triangleq \{M_i(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$ are the filter gains and where $M_i(k)$ corresponds to $M_{\theta(k)}(k)$ when $\theta(k)$ is in mode $i$.
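A small NumPy sketch of the mode-dependent filter (29), with the mode sequence assumed to be observed; the function and argument names are illustrative assumptions:

```python
import numpy as np

def mjls_mode_dependent_filter(A_modes, C_modes, M_modes, theta, z, xi_hat0):
    """Mode-dependent linear filter (29) for the MJLS (27): the mode theta[k] is
    assumed observed and selects the matrices and gain applied at time k."""
    xi_hat = np.array(xi_hat0, dtype=float)
    trajectory = [xi_hat.copy()]
    for k, i in enumerate(theta):
        innovation = z[k] - C_modes[i] @ xi_hat          # z(k) - C~_i * xi_hat(k)
        xi_hat = A_modes[i] @ xi_hat + M_modes[i] @ innovation
        trajectory.append(xi_hat.copy())
    return trajectory
```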
We can give a similar definition for an optimal steady-state filter using the infinite horizon quadratic cost function.

Definition 5.2: Assume that the matrices $\tilde{A}_i(k)$, $\tilde{C}_i(k)$ and $P(k)$ are constant for all $k \ge 0$. We say that the Markovian jump linear system (27) is mean square detectable if there exist $\{M_i\}_{i=1}^{N}$ such that $\lim_{k\to\infty} E[\|e(k)\|^2] = 0$ when the noises $\tilde{w}(k)$ and $\tilde{v}(k)$ are set to zero.

The next result makes the connection between the detectability of the MJLS defined above and the distributed detectability of the process (2).

Proposition 5.1: If the Markovian jump linear system (27) is mean square detectable, then the linear stochastic system (2) is detectable in the sense of Definition 2.1.

Proof: Given in [6].

The next result establishes that the optimal gains of the filter (29) can be used to approximate the solution of the optimization problem (9).

Proposition 5.2: Let $M^*(\cdot) \triangleq \{M_i^*(k),\, k = 0,\ldots,K-1,\, i = 1,\ldots,N\}$ be the optimal gains of the linear filter (29). If we set $L_i(k) = \frac{1}{\sqrt{\pi_i(0)}}M_i^*(k)$ as filtering gains in the CBDLF scheme, then the filter cost function (8) is guaranteed to be upper bounded by

$$J_K(L(\cdot)) \le \sum_{k=0}^{K}\sum_{i=1}^{N}\frac{1}{\pi_i(0)}\mathrm{tr}\big(Y_i(k)\big), \qquad (32)$$

where the $Y_i(k)$ are the covariance matrices resulting from minimizing (31).
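Under the reconstruction of Proposition 5.2 used here, the conversion from the optimal MJLS gains to CBDLF gains is a simple rescaling; `M_modes` and `pi0` below are hypothetical placeholders for the gains $M_i^*(k)$ at a given time step and the initial mode distribution $\pi(0)$.

```python
import numpy as np

def cbdlf_gains_from_mjls_gains(M_modes, pi0):
    """Proposition 5.2 (as reconstructed here): CBDLF gains from the optimal
    mode-dependent MJLS gains via L_i(k) = M_i(k) / sqrt(pi_i(0))."""
    return [M_i / np.sqrt(p_i) for M_i, p_i in zip(M_modes, pi0)]
```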

Proof: By Theorem 5.5 of [5], the filtering gains that minimize (31) are given by

$$M_i(k) = \tilde{A}_i(k)Y_i(k)\tilde{C}_i(k)'\Big[\pi_i(k)\tilde{D}_i(k)\tilde{D}_i(k)' + \tilde{C}_i(k)Y_i(k)\tilde{C}_i(k)'\Big]^{-1}, \qquad (33)$$

for $i = 1,\ldots,N$, where $Y_i(k)$ satisfies

$$\begin{aligned}Y_i(k+1) = \sum_{j=1}^{N} p_{ij}(k)\Big[&\tilde{A}_j(k)Y_j(k)\tilde{A}_j(k)' + \pi_j(k)\tilde{B}_j(k)\tilde{B}_j(k)'\\ &- \tilde{A}_j(k)Y_j(k)\tilde{C}_j(k)'\big(\pi_j(k)\tilde{D}_j(k)\tilde{D}_j(k)' + \tilde{C}_j(k)Y_j(k)\tilde{C}_j(k)'\big)^{-1}\tilde{C}_j(k)Y_j(k)\tilde{A}_j(k)'\Big].\end{aligned} \qquad (34)$$

In what follows we will show by induction that $Y_i(k) = \pi_i(0)Q_i(k)$ for all $i$ and $k \ge 0$, where $Q_i(k)$ satisfies (23). For $k = 0$ we have $Y_i(0) = \pi_i(0)Y(0) = \pi_i(0)\Sigma_0 = \pi_i(0)Q_i(0)$. Let us assume that $Y_i(k) = \pi_i(0)Q_i(k)$. Then, from (28) we have

$$\begin{aligned}\pi_j(k)\tilde{B}_j(k)\tilde{B}_j(k)' &= \pi_j(0)\Sigma_w(k),\\ \pi_j(k)\tilde{D}_j(k)\tilde{D}_j(k)' &= \Sigma_{v_j}(k),\\ \pi_j(k)\tilde{D}_j(k)\tilde{D}_j(k)' + \tilde{C}_j(k)Y_j(k)\tilde{C}_j(k)' &= \Sigma_{v_j}(k) + C_j(k)Q_j(k)C_j(k)'.\end{aligned} \qquad (35)$$

Also,

$$M_i(k) = \sqrt{\pi_i(0)}\,A(k)Q_i(k)C_i(k)'\Big[\Sigma_{v_i}(k) + C_i(k)Q_i(k)C_i(k)'\Big]^{-1}, \qquad (36)$$

and from (22) we get that $M_i(k) = \sqrt{\pi_i(0)}\,L_i(k)$. From (34) and (35) it can be easily argued that $Y_i(k+1) = \pi_i(0)Q_i(k+1)$. By Corollary 4.1 we have that $J_K(L(\cdot)) \le \bar{J}_K(L(\cdot))$ for any set of filtering gains $L(\cdot)$, and in particular for $L_i(k) = \frac{1}{\sqrt{\pi_i(0)}}M_i^*(k) = L_i^*(k)$, for all $i$ and $k$. But since

$$\bar{J}_K(L^*(\cdot)) = \sum_{k=0}^{K}\sum_{i=1}^{N}\frac{1}{\pi_i(0)}\mathrm{tr}\big(Y_i(k)\big),$$

the result follows.

VI. Conclusions

In this paper we addressed three problems. First, we provided (testable) sufficient conditions under which stable consensus-based distributed linear filters can be obtained. Second, we gave a sub-optimal, linear filtering scheme which can be implemented in a distributed manner, is valid for time-varying communication topologies as well, and guarantees a certain level of performance. Third, under the assumption that the stochastic matrix used in the consensus step is doubly stochastic, we showed that if an appropriately defined Markovian jump linear system is detectable, then the stochastic process of our interest is detectable as well. We also showed that the optimal gains of the consensus-based distributed linear filter scheme can be approximated by using the optimal linear filter for the state estimation of a particular Markovian jump linear system.
Appendix

Given a positive integer $N$, a sequence of positive numbers $p = \{p_{ij}\}_{i,j=1}^{N}$, and a set of matrices $F = \{F_i\}_{i=1}^{N}$, we consider the following matrix difference equations

$$W_i(k+1) = \sum_{j=1}^{N} p_{ij}F_jW_j(k)F_j', \quad W_i(0) = W_{i0}, \quad i = 1,\ldots,N. \qquad (37)$$

Definition 1.1 ([4]): Given a set of matrices $C = \{C_i\}_{i=1}^{N}$, we say that $(p, C, A)$ is detectable if there exists a set of matrices $L = \{L_i\}_{i=1}^{N}$ such that the dynamics (37) is asymptotically stable, where $F_i = A - L_iC_i$, for $i = 1,\ldots,N$.

Definition 1.2 ([4]): Given a set of matrices $C = \{C_i\}_{i=1}^{N}$, we say that $(A, C, p)$ is stabilizable if there exists a set of matrices $L = \{L_i\}_{i=1}^{N}$ such that the dynamics (37) is asymptotically stable, where $F_i = A - C_iL_i$, for $i = 1,\ldots,N$.

Proposition 1.1: Let $\Sigma_v^{1/2} = \{\Sigma_{v_i}^{1/2}\}_{i=1}^{N}$, where $\Sigma_{v_i} = \Sigma_{v_i}^{1/2}\Sigma_{v_i}^{1/2\,\prime}$. Suppose that $(p, C, A)$ is detectable and that $(A, \Sigma_v^{1/2}, p)$ is stabilizable in the sense of Definitions 1.1 and 1.2, respectively. Then there exists a unique set of symmetric positive definite matrices $\bar{Q} = \{\bar{Q}_i\}_{i=1}^{N}$ satisfying (26). Moreover, for any initial conditions $Q_i(0) \succeq 0$, we have that $\lim_{k\to\infty} Q_i(k) = \bar{Q}_i$, where the dynamics of $Q_i(k)$ is given by (25).

Proof: See [6].

References

[1] V. Borkar and P. Varaiya, "Asymptotic agreement in distributed estimation," IEEE Transactions on Automatic Control, vol. AC-27, no. 3, 1982.
[2] R. Carli, A. Chiuso, L. Schenato and S. Zampieri, "Distributed Kalman Filtering Based on Consensus Strategies," IEEE Journal on Selected Areas in Communications, vol. 26, no. 4, May 2008.
[3] O.L.V. Costa and M.D. Fragoso, "Stability results for discrete-time linear systems with Markovian jumping parameters," Journal of Mathematical Analysis and Applications, no. 179, 1993.
[4] O.L.V. Costa, "Discrete-Time Coupled Riccati Equations for Systems with Markov Switching Parameters," Journal of Mathematical Analysis and Applications, no. 194, 1995.
[5] O.L.V. Costa, M.D. Fragoso and R.P. Marques, Discrete-Time Markov Jump Linear Systems, Springer, 2005.
[6] I. Matei and J.S. Baras, "Consensus-Based Linear Filtering," ISR Technical Report TR-2010-5.
[7] R. Olfati-Saber, "Distributed Kalman Filtering for Sensor Networks," Proceedings of the 46th IEEE Conference on Decision and Control, 2007.
[8] R. Olfati-Saber, "Distributed Kalman Filter with Embedded Consensus Filters," Proceedings of the 44th IEEE Conference on Decision and Control, Dec. 2005.
[9] R. Olfati-Saber, "Distributed Tracking for Mobile Sensor Networks with Information-Driven Mobility," Proceedings of the 2007 American Control Conference, July 11-13, 2007.
[10] A. Speranzon, C. Fischione, K.H. Johansson and A. Sangiovanni-Vincentelli, "A Distributed Minimum Variance Estimator for Sensor Networks," IEEE Journal on Selected Areas in Communications, vol. 26, no. 4, May 2008.
[11] D. Teneketzis and P. Varaiya, "Consensus in Distributed Estimation," Advances in Statistical Signal Processing, JAI Press, H.V. Poor, Ed., January 1988.
[12] Y. Zhu, Z. You, J. Zhao, K. Zhang and X.R. Li, "The optimality for the distributed Kalman filtering fusion with feedback," Automatica, no. 37, 2001.


More information

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal Inner Product Defnton 1 () A Eucldean space s a fnte-dmensonal vector space over the reals R, wth an nner product,. Defnton 2 (Inner Product) An nner product, on a real vector space X s a symmetrc, blnear,

More information

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens THE CHINESE REMAINDER THEOREM KEITH CONRAD We should thank the Chnese for ther wonderful remander theorem. Glenn Stevens 1. Introducton The Chnese remander theorem says we can unquely solve any par of

More information

The internal structure of natural numbers and one method for the definition of large prime numbers

The internal structure of natural numbers and one method for the definition of large prime numbers The nternal structure of natural numbers and one method for the defnton of large prme numbers Emmanul Manousos APM Insttute for the Advancement of Physcs and Mathematcs 3 Poulou str. 53 Athens Greece Abstract

More information

Stability and Stabilization for Discrete Systems with Time-varying Delays Based on the Average Dwell-time Method

Stability and Stabilization for Discrete Systems with Time-varying Delays Based on the Average Dwell-time Method Proceedngs of the 29 IEEE Internatonal Conference on Systems, an, and Cybernetcs San Antono, TX, USA - October 29 Stablty and Stablzaton for Dscrete Systems wth Tme-varyng Delays Based on the Average Dwell-tme

More information

Bernoulli Numbers and Polynomials

Bernoulli Numbers and Polynomials Bernoull Numbers and Polynomals T. Muthukumar tmk@tk.ac.n 17 Jun 2014 The sum of frst n natural numbers 1, 2, 3,..., n s n n(n + 1 S 1 (n := m = = n2 2 2 + n 2. Ths formula can be derved by notng that

More information

Convexity preserving interpolation by splines of arbitrary degree

Convexity preserving interpolation by splines of arbitrary degree Computer Scence Journal of Moldova, vol.18, no.1(52), 2010 Convexty preservng nterpolaton by splnes of arbtrary degree Igor Verlan Abstract In the present paper an algorthm of C 2 nterpolaton of dscrete

More information

Changing Topology and Communication Delays

Changing Topology and Communication Delays Prepared by F.L. Lews Updated: Saturday, February 3, 00 Changng Topology and Communcaton Delays Changng Topology The graph connectvty or topology may change over tme. Let G { G, G,, G M } wth M fnte be

More information

Application of B-Spline to Numerical Solution of a System of Singularly Perturbed Problems

Application of B-Spline to Numerical Solution of a System of Singularly Perturbed Problems Mathematca Aeterna, Vol. 1, 011, no. 06, 405 415 Applcaton of B-Splne to Numercal Soluton of a System of Sngularly Perturbed Problems Yogesh Gupta Department of Mathematcs Unted College of Engneerng &

More information

Assortment Optimization under MNL

Assortment Optimization under MNL Assortment Optmzaton under MNL Haotan Song Aprl 30, 2017 1 Introducton The assortment optmzaton problem ams to fnd the revenue-maxmzng assortment of products to offer when the prces of products are fxed.

More information

8.4 COMPLEX VECTOR SPACES AND INNER PRODUCTS

8.4 COMPLEX VECTOR SPACES AND INNER PRODUCTS SECTION 8.4 COMPLEX VECTOR SPACES AND INNER PRODUCTS 493 8.4 COMPLEX VECTOR SPACES AND INNER PRODUCTS All the vector spaces you have studed thus far n the text are real vector spaces because the scalars

More information