An Augmented Lagrangian Coordination-Decomposition Algorithm for Solving Distributed Non-Convex Programs

An Augmented Lagrangian Coordination-Decomposition Algorithm for Solving Distributed Non-Convex Programs

Jean-Hubert Hours and Colin N. Jones

(Jean-Hubert Hours and Colin N. Jones are with the Laboratoire d'Automatique, École Polytechnique Fédérale de Lausanne, Switzerland. {jean-hubert.hours, colin.jones}@epfl.ch)

Abstract — A novel augmented Lagrangian method for solving non-convex programs with nonlinear cost and constraint couplings in a distributed framework is presented. The proposed decomposition algorithm is made of two layers: the outer level is a standard multiplier method with penalty on the nonlinear equality constraints, while the inner level consists of a block-coordinate descent (BCD) scheme. Based on standard results on multiplier methods and recent results on proximal regularised BCD techniques, it is proven that the method converges to a KKT point of the non-convex nonlinear program under a semi-algebraicity assumption. Efficacy of the algorithm is demonstrated on a numerical example.

I. INTRODUCTION

When dealing with a large-scale system, such as a power grid for instance, sub-systems are coupled in a complex manner through their dynamics, implying that local control actions may have a major impact throughout the whole network. Implementing NMPC controllers for such systems may result in large-scale non-separable non-convex nonlinear programs (NLP). Solving such programs online in a centralised fashion is a challenging task from a computational point of view. Moreover, autonomy of the agents is likely to be hampered by such a centralised procedure. Therefore, distributed MPC is currently raising much interest, as an advanced control strategy for decentralising computations, thus reducing the computational burden and enabling autonomy of agents, which then only need to solve a local NLP [8]. Potential applications of distributed NMPC abound, from networked systems control (interconnected power plants) to collaborative control (formation flying).

Decomposing a convex NLP is generally performed by applying Lagrangian decomposition methods [4]. A critical requirement for this strategy is strong duality, which is guaranteed by Slater's condition in the convex case. However, such an assumption rarely holds in a non-convex setting. Nevertheless, in an augmented Lagrangian setting, it has been shown that the duality gap can be locally removed by using appropriate penalty functions [10]. Moreover, augmented Lagrangian methods turn out to be much more efficient than standard penalty approaches from a computational point of view, as the risk of ill-conditioning associated with large penalty parameters is reduced by the fast convergence of the dual iteration. However, the resulting quadratic penalty term is not separable even if the coupling constraints are. Several approaches have been explored to remedy this issue [7]. For instance, in [2], [12], a local convexification procedure by means of proximal regularisation is analysed. A more recent approach to the decomposition of non-convex programs is the sequential programming scheme presented in [9], which consists in iterative convex approximations and decompositions. Our decomposition strategy is not specifically targeted at separable NLPs, as objective and constraint couplings can be addressed by the proposed BCD scheme. It also differs from [9] in that only local convexification is required to guarantee convergence.

Our approach is an augmented Lagrangian method with partial constraint penalisation on the nonlinear equality constraints. It is a two-level optimisation scheme, whose outer layer is a loop on the Lagrange multiplier estimates associated with the nonlinear equality constraints. The inner level consists in approximately solving the primal problem to a critical point. This is performed via an inexact proximal regularised BCD scheme.
By means of recent results on non-convex proximal regularised BCD methods [1], we prove convergence of the inner loop to a critical point of the non-convex inner problem under the assumption that the NLP is semi-algebraic. This is the main novelty of our approach, since, until recently, BCD type methods were thought to only have convergence guarantees under the very restrictive assumption that the NLP is convex. Some studies have been conducted in the non-convex case, yet the convergence results are quite weak compared to [1]. The inner loop is the key step of the algorithm, which enables distributed computations. In general, as the augmented Lagrangian is not separable, inner iterations cannot be fully parallelised, but some degree of parallelism can be achieved if the interconnection graph is sparse [4]. Finally, convergence of our two-level optimisation technique to a KKT point of the original NLP is proven under standard assumptions. Eventually, the whole algorithm consists in iteratively solving decoupled Quadratic Programs (QP) and updating multiplier estimates in a coordination step. In Section III, the general framework of our algorithm is presented. In Sections IV and V, the algorithm is described and its convergence analysed. Finally, a numerical example is presented in Section VI.

II. BACKGROUND

Definition 1 (Normal cone to a convex set): Let $C$ be a convex set in $\mathbb{R}^n$ and $x \in C$. The normal cone to $C$ at $x$ is the set
$$N_C(x) := \left\{ v \in \mathbb{R}^n \mid \forall \bar{x} \in C,\ v^\top (\bar{x} - x) \le 0 \right\}. \quad (1)$$

Lemma 1 (Descent Lemma): Let $f \in C^2(\Omega, \mathbb{R})$ be twice continuously differentiable on $\Omega$, where $\Omega$ is a convex compact set in $\mathbb{R}^n$, and let $x, y \in \Omega$. Then
$$f(y) \le f(x) + \nabla f(x)^\top (y - x) + \frac{\|\nabla^2 f\|_\infty}{2} \|y - x\|^2, \quad (2)$$
where
$$\|\nabla^2 f\|_\infty = \max\left\{ \|\nabla^2 f(x)\| \mid x \in \Omega \right\}. \quad (3)$$
Given $M \in \mathbb{R}^{n \times n}$, $\|M\|$ denotes the induced matrix 2-norm.

Definition 2 (Regular point): Let $h \in C^1(\mathbb{R}^n, \mathbb{R}^m)$. A point $x \in \mathbb{R}^n$ such that $h(x) = 0$ is called regular if the gradients $\nabla h_1(x), \ldots, \nabla h_m(x)$ are linearly independent.

Definition 3 (Critical point): Let $f$ be a proper lower semi-continuous function. A necessary condition for $x$ to be a minimiser of $f$ is that
$$0 \in \partial f(x), \quad (4)$$
where $\partial f(x)$ is the sub-differential of $f$ at $x$ [11]. Points satisfying (4) are called critical points.

The indicator function of a closed subset $S$ of $\mathbb{R}^n$ is denoted $\iota_S$ and is defined as
$$\iota_S(x) = \begin{cases} 0 & \text{if } x \in S \\ +\infty & \text{if } x \notin S. \end{cases} \quad (5)$$

Lemma 2 (Sub-differential of the indicator function [11]): Given a convex set $C$, for all $x \in C$,
$$\partial \iota_C(x) = N_C(x). \quad (6)$$

III. PROBLEM FORMULATION

We consider NLPs of the following form:
$$\begin{array}{rl}
\underset{z_1,\ldots,z_N}{\text{minimise}} & \displaystyle\sum_{i=1}^N J_i(z_i) + Q(z_1,\ldots,z_N) \\
\text{s.t.} & F_i(z_i) = 0, \quad i \in \{1,\ldots,N\}, \\
& G(z_1,\ldots,z_N) = 0, \\
& z_i \in Z_i, \quad i \in \{1,\ldots,N\},
\end{array} \quad (7)$$
where $z_i \in \mathbb{R}^{n_i}$, $F_i(z_i) \in \mathbb{R}^{m_i}$ for $i \in \{1,\ldots,N\}$ and the $Z_i$ are polytopes in $\mathbb{R}^{n_i}$. The term $Q(z_1,\ldots,z_N)$ is a cost coupling term and $G(z_1,\ldots,z_N) \in \mathbb{R}^p$ a constraint coupling term. They model the different kinds of systems interactions which may appear in a distributed NMPC context. For the remainder of the paper, we also define the vector $z := (z_1^\top,\ldots,z_N^\top)^\top \in \mathbb{R}^n$, with dimension $n := \sum_{i=1}^N n_i$. For $i \in \{1,\ldots,N\}$, as $Z_i$ is polytopic, there exist $A_i \in \mathbb{R}^{q_i \times n_i}$ and $b_i \in \mathbb{R}^{q_i}$ such that $Z_i := \{x \in \mathbb{R}^{n_i} \mid A_i x \le b_i\}$. One can thus introduce the polytope $Z \subset \mathbb{R}^n$ by $Z := \{x \in \mathbb{R}^n \mid Ax \le b\}$, where $A \in \mathbb{R}^{q \times n}$, $b \in \mathbb{R}^q$ and $q := \sum_{i=1}^N q_i$. By posing $m := \sum_{i=1}^N m_i$, we also define
$$H(z) := \left(F_1(z_1)^\top, \ldots, F_N(z_N)^\top, G(z_1,\ldots,z_N)^\top\right)^\top \in \mathbb{R}^r, \quad (8)$$
where $r := m + p$. The distributed objective is
$$J(z) := \sum_{i=1}^N J_i(z_i) + Q(z_1,\ldots,z_N), \quad (9)$$
so that NLP (7) can be rewritten
$$\underset{z}{\text{minimise}}\ J(z) \quad \text{s.t.} \quad H(z) = 0, \quad z \in Z. \quad (10)$$

Assumption 1 (Smoothness and semi-algebraicity): The functions $\{J_i\}_{i=1}^N$, $Q$, $\{F_i\}_{i=1}^N$ and $G$ are twice continuously differentiable and semi-algebraic.

Remark 1: The set of semi-algebraic functions is closed with respect to sum, product and composition. The indicator function of a semi-algebraic set is a semi-algebraic function. Semi-algebraicity is needed in order to apply the results of [1], which are valid for functions satisfying the Kurdyka-Lojasiewicz (KL) property, which is the case for all real semi-algebraic functions [5]. From a control perspective, this means that our results are valid for polynomial systems subject to polynomial constraints and objectives.

Assumption 2: The NLP (10) admits an isolated KKT point $((z^*)^\top, (\mu^*)^\top, (\lambda^*)^\top)^\top \in \mathbb{R}^{n+r+q}$, which satisfies: $z^*$ is regular, and $((z^*)^\top, (\mu^*)^\top, (\lambda^*)^\top)^\top$ satisfies the second order optimality condition, that is
$$p^\top \nabla^2 L(z^*, \mu^*, \lambda^*)\, p > 0 \quad \text{for all } p \in \mathbb{R}^n \text{ such that } \nabla H(z^*)^\top p = 0,\ A_{I^*} p = 0, \quad (11)$$
where $I^* \subset \{1,\ldots,q\}$ is the set of indices of active inequality constraints at $z^*$. For all $i \in I^*$, $\lambda_i^* > 0$.

IV. OUTER LOOP: PARTIALLY AUGMENTED LAGRANGIAN

Given a penalty parameter $\rho > 0$, the augmented Lagrangian associated with problem (10) is defined as
$$L_\rho(z, \mu) := J(z) + \mu^\top H(z) + \frac{\rho}{2}\|H(z)\|^2, \quad (12)$$
where $\mu \in \mathbb{R}^r$. Only the equality constraints $H(z) = 0$ are penalised. The polytopic inequalities are kept as constraints on the augmented Lagrangian.

A. Algorithm description

The outer loop is similar to a dual ascent on the multiplier estimates $\mu^k$, along with iterative updates of the penalty parameter $\rho^k$. At each outer iteration $k$, the primal problem
$$\underset{z \in Z}{\text{minimise}}\ L_{\rho^k}(z, \mu^k) \quad (13)$$
is solved to a given level of accuracy $\epsilon^k > 0$. The outer loop is presented in Algorithm 1 below. The notation $d(x, S)$ stands for the distance between a point $x$ and a set $S$ in $\mathbb{R}^n$ and is defined by
$$d(x, S) := \inf_{z \in S} \|x - z\|. \quad (14)$$
At every outer iteration, a point $z$ is found which satisfies inexact optimality conditions. Then, the dual estimate $\mu$ is updated, the penalty parameter is increased and the tolerance on the first order optimality conditions is reduced. The main tuning variables are the initial value of the penalty parameter $\rho^0$ and the growth coefficient $\beta$.

Algorithm 1 Method of multipliers with partial constraint penalisation
Input: Objective $J$, equality constraint function $H$, polytope $Z$, growth coefficient $\beta > 1$ and final tolerance on the nonlinear equality constraints $\eta > 0$. Initial guess for the optimiser $z^0 \in Z$, initial guess for the multiplier estimates $\mu^0 \in \mathbb{R}^r$, initial penalty parameter $\rho^0 > 0$, initial tolerance on the optimality conditions $\epsilon^0 > 0$.
Initialisation: $z \leftarrow z^0$, $\mu \leftarrow \mu^0$, $\rho \leftarrow \rho^0$, $\epsilon \leftarrow \epsilon^0$.
repeat
    Find $z^+ \in Z$ such that $d(0, \nabla L_\rho(z^+, \mu) + N_Z(z^+)) \le \epsilon$, using $z$ as a warm-start
    $z \leftarrow z^+$
    $\mu \leftarrow \mu + \rho H(z)$
    $\rho \leftarrow \beta \rho$, decrease $\epsilon$
until $\|H(z)\| \le \eta$
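For illustration, the following is a minimal Python sketch of the outer iteration of Algorithm 1. The helper names (inner_solve, H), the specific tolerance schedule $\epsilon \leftarrow \epsilon/\beta$ and the default parameter values are assumptions made here for concreteness and are not prescribed by the paper; inner_solve stands for the inner BCD loop described in Section V.

import numpy as np

def method_of_multipliers(H, inner_solve, z0, mu0,
                          rho0=0.1, beta=10.0, eps0=1e-1, eta=1e-6, max_outer=50):
    """Sketch of Algorithm 1: multiplier method with partial constraint penalisation.

    H(z)                         -> residual of the nonlinear equality constraints (in R^r)
    inner_solve(z, mu, rho, eps) -> point of Z with stationarity residual <= eps,
                                    warm-started at z (e.g. Algorithm 2 below)
    """
    z, mu, rho, eps = z0, mu0, rho0, eps0
    for _ in range(max_outer):
        z = inner_solve(z, mu, rho, eps)      # approximate critical point of L_rho(., mu) on Z
        if np.linalg.norm(H(z)) <= eta:       # stop once the coupling constraints are satisfied
            break
        mu = mu + rho * H(z)                  # first-order dual update
        rho = beta * rho                      # increase the penalty parameter
        eps = eps / beta                      # tighten the inner tolerance (illustrative schedule)
    return z, mu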

B. Convergence analysis

We prove that the augmented Lagrangian iterations are locally convergent to the isolated KKT point $((z^*)^\top, (\mu^*)^\top, (\lambda^*)^\top)^\top$ satisfying Assumption 2. Our analysis is based on the results of [3]. We start by showing that the inner problem is well-defined, which means that for any optimality tolerance $\epsilon > 0$, there exists a point $z$, as in Algorithm 1, satisfying $d(0, \nabla L_\rho(z, \mu) + N_Z(z)) \le \epsilon$, under appropriate conditions on $\rho$ and $\mu$.

Lemma 3 (Existence of an inner critical point): There exist $\delta > 0$, $\bar\rho > 0$ and $\kappa > 0$ such that for all
$$(\mu, \rho) \in S := \left\{ (\mu, \rho) \in \mathbb{R}^{m+1} \mid \|\mu - \mu^*\| \le \delta\rho,\ \rho \ge \bar\rho \right\}, \quad (15)$$
the problem
$$\underset{z}{\text{minimise}}\ L_\rho(z, \mu) \quad \text{s.t.} \quad z \in Z,\ \|z - z^*\| \le \kappa \quad (16)$$
has a unique minimiser. Moreover, for all $\epsilon > 0$ and for all $(\mu, \rho) \in S$, there exists $z_\epsilon \in \mathbb{R}^n$ such that
$$d(0, \nabla L_\rho(z_\epsilon, \mu) + N_Z(z_\epsilon)) \le \epsilon. \quad (17)$$
Proof: The first part of the statement is a direct consequence of Proposition 2.11 in [3], as the second order optimality condition is satisfied (Assumption 2). From Definition 3, taking $(\mu, \rho) \in S$, this implies that we can find $z \in Z \cap \bar B(z^*, \kappa)$ such that $d(0, \nabla L_\rho(z, \mu) + N_{Z \cap \bar B(z^*, \kappa)}(z)) \le \epsilon$ for all $\epsilon > 0$, where $B(z^*, \kappa)$ is the open ball of radius $\kappa$ centered at $z^*$. As $\operatorname{ri} B(z^*, \kappa) \cap \operatorname{ri} Z \ne \emptyset$, where $\operatorname{ri}$ stands for the relative interior, $N_{Z \cap \bar B(z^*, \kappa)}(z) = N_Z(z) + N_{\bar B(z^*, \kappa)}(z)$, so that
$$d(0, \nabla L_\rho(z, \mu) + N_Z(z)) \le d(0, \nabla L_\rho(z, \mu) + N_{Z \cap \bar B(z^*, \kappa)}(z)) \le \epsilon. \quad (18)$$
The end of the proof follows by taking $z_\epsilon := z$.

The next Lemma is a reformulation of the inexact optimality condition $d(0, \nabla L_\rho(z, \mu) + N_Z(z)) \le \epsilon$.

Lemma 4 (Inexact optimality conditions): Let $z \in \mathbb{R}^n$, $f \in C^1(\mathbb{R}^n, \mathbb{R})$ and let $Z$ be a convex set in $\mathbb{R}^n$. The following equivalence holds:
$$d(0, \nabla f(z) + N_Z(z)) \le \epsilon \iff \exists v \in \mathbb{R}^n \text{ such that } \|v\| \le \epsilon \text{ and } -\nabla f(z) \in N_Z(z) + v. \quad (19)$$
Proof: This directly follows from the definition of the distance as an infimum.
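In practice, the termination test $d(0, \nabla L_\rho(z, \mu) + N_Z(z)) \le \epsilon$ used in Algorithm 1 and characterised by Lemma 4 has to be evaluated numerically. The sketch below shows one way to do this in the special case where $Z$ is a box (simple bounds), which is a particular instance of the polytopic sets considered here; the helper name and the activity tolerance are illustrative assumptions, not part of the paper.

import numpy as np

def stationarity_residual(grad, z, lb, ub, act_tol=1e-9):
    """d(0, grad + N_Z(z)) for a box Z = {x : lb <= x <= ub} (illustrative helper).

    On inactive coordinates the normal cone component is zero; on an active upper
    (resp. lower) bound it is a nonnegative (resp. nonpositive) multiple of the unit
    vector, so the distance can be computed coordinate-wise.
    """
    at_ub = z >= ub - act_tol
    at_lb = z <= lb + act_tol
    r = grad.astype(float)
    r[at_ub] = np.maximum(r[at_ub], 0.0)   # a normal component v_i >= 0 cancels negative parts
    r[at_lb] = np.minimum(r[at_lb], 0.0)   # a normal component v_i <= 0 cancels positive parts
    return np.linalg.norm(r)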
As $((z^*)^\top, (\mu^*)^\top, (\lambda^*)^\top)^\top$ is an isolated KKT point, there exists $\delta' > 0$ such that $z^*$ is the only critical point of (10) in $B(z^*, \delta')$. From the statement of Algorithm 1, the penalty parameter is guaranteed to increase to infinity and the optimality tolerance to converge to zero. The following convergence theorem is valid if the iterates $z^k$ stay in the ball $B(z^*, \min\{\delta', \kappa\})$ for all $k$ large enough. As noticed in [3], this is generally the case in practice, as warm-starting the inner problem (16) on the previous solution leads the iterates to stay around the same local minimum $z^*$. Thus one can reasonably assume that $z^k \in B(z^*, \min\{\delta', \kappa\})$.

Theorem 1 (Local convergence to a KKT point): Let $\{z^k\}$ and $\{\mu^k\}$ be sequences such that
$$z^k \in Z \cap \operatorname{cl} B(z^*, \min\{\kappa, \delta'\}), \qquad -\nabla L_{\rho^k}(z^k, \mu^k) \in N_Z(z^k) + d^k, \quad (20)$$
where $\{\rho^k\}$ is increasing with $\rho^k \to +\infty$, $\{\mu^k\}$ and the multipliers $\{\lambda^k\}$ associated with $N_Z(z^k)$ are bounded, and $\|d^k\| \to 0$. Assume that all limit points of $\{z^k\}$ are regular. We then have that $z^k \to z^*$ and $\bar\mu^k \to \mu^*$, where $\bar\mu^k := \mu^k + \rho^k H(z^k)$, and $z^*$ and $\mu^*$ are defined in Assumption 2.

Proof: As $Z \cap \operatorname{cl} B(z^*, \min\{\kappa, \delta'\})$ is compact, by the Weierstrass theorem, there exist an increasing mapping $\sigma : \mathbb{N} \to \mathbb{N}$ and a point $z' \in Z \cap \operatorname{cl} B(z^*, \min\{\kappa, \delta'\})$ such that $z^{\sigma(k)}$ converges to $z'$, which is regular by assumption. We also know by assumption that
$$\exists v^{\sigma(k)} \in N_Z(z^{\sigma(k)}), \quad 0 = \nabla L_{\rho^{\sigma(k)}}(z^{\sigma(k)}, \mu^{\sigma(k)}) + v^{\sigma(k)} + d^{\sigma(k)}. \quad (21)$$
By definition of the normal cone to $Z$ at $z^{\sigma(k)}$, this means that
$$\exists \lambda^{\sigma(k)} \ge 0, \quad \nabla L_{\rho^{\sigma(k)}}(z^{\sigma(k)}, \mu^{\sigma(k)}) + A^\top \lambda^{\sigma(k)} + d^{\sigma(k)} = 0, \quad (\lambda^{\sigma(k)})^\top (b - A z^{\sigma(k)}) = 0. \quad (22)$$
Since $\nabla L_\rho(z, \mu) = \nabla J(z) + \nabla H(z)^\top (\mu + \rho H(z))$, we thus have
$$\nabla J(z^{\sigma(k)}) + \nabla H(z^{\sigma(k)})^\top \bar\mu^{\sigma(k)} + A^\top \lambda^{\sigma(k)} = -d^{\sigma(k)}. \quad (23)$$
As $z'$ is regular, there exists $K \in \mathbb{N}$ such that $\nabla H(z^{\sigma(k)})$ is full-rank for all $k \ge K$. Subsequently, for all $k \ge K$,
$$\bar\mu^{\sigma(k)} = -\left(\nabla H(z^{\sigma(k)}) \nabla H(z^{\sigma(k)})^\top\right)^{-1} \nabla H(z^{\sigma(k)}) \left(d^{\sigma(k)} + \nabla J(z^{\sigma(k)}) + A^\top \lambda^{\sigma(k)}\right). \quad (24)$$

As $z^{\sigma(k)} \to z'$ and the multipliers $\lambda^{\sigma(k)}$ are bounded, by continuity of the complementarity condition $(\lambda^{\sigma(k)})^\top (b - A z^{\sigma(k)}) = 0$, there exists $\lambda' \ge 0$ such that
$$\lambda^{\sigma(k)} \to \lambda', \quad (\lambda')^\top (b - A z') = 0. \quad (25)$$
As $d^{\sigma(k)} \to 0$, by continuity of $\nabla H$, this implies that $\bar\mu^{\sigma(k)} \to \mu'$, where
$$\mu' := -\left(\nabla H(z') \nabla H(z')^\top\right)^{-1} \nabla H(z') \left(\nabla J(z') + A^\top \lambda'\right). \quad (26)$$
By taking the limit in (23), we obtain
$$\nabla J(z') + \nabla H(z')^\top \mu' + A^\top \lambda' = 0, \quad \lambda' \ge 0, \quad (\lambda')^\top (b - A z') = 0. \quad (27)$$
Feasibility of $z'$ immediately follows from the convergence of $\bar\mu^{\sigma(k)}$ and the fact that $\rho^{\sigma(k)} \to +\infty$. Thus we have $H(z') = 0$. This implies that $((z')^\top, (\mu')^\top, (\lambda')^\top)^\top$ satisfies the KKT conditions. As $z' \in Z \cap \operatorname{cl} B(z^*, \min\{\kappa, \delta'\})$, in which $z^*$ is the unique KKT point by assumption, one can claim that $z' = z^*$. Taking the KKT conditions at $z^*$ and $z'$, it follows that
$$\nabla H(z^*)^\top \mu^* + A_{I^*}^\top \lambda^*_{I^*} = \nabla H(z')^\top \mu' + A_{I'}^\top \lambda'_{I'}, \quad (28)$$
where $I^*$ and $I'$ are the sets of indices of active constraints at $z^*$ and $z'$ respectively. Obviously, as $z^* = z'$, $I^* = I'$. Thus
$$\nabla H(z^*)^\top (\mu^* - \mu') + A_{I^*}^\top (\lambda^*_{I^*} - \lambda'_{I^*}) = 0. \quad (29)$$
As $z^*$ is regular, we finally obtain that $\mu' = \mu^*$. As all limit points of $\{z^k\}$ are equal to $z^*$, one can conclude that $z^k \to z^*$ and $\bar\mu^k \to \mu^*$.

Remark 2: The convergence result of Theorem 1 is local. Yet convergence can be globalised (meaning that the assumption according to which the iterates lie in a ball centered at $z^*$ can be removed) by applying the dual update scheme proposed in [6].

V. INNER LOOP: INEXACT PROXIMAL REGULARISED BCD

The non-convex inner problem (13) is solved in a distributed manner. An inexact BCD scheme is proposed, which is based on the abstract result of [1]. The main idea is to perform local convex approximations of each agent's cost and compute descent updates from them. Contrary to the approach of [9], where splitting is, in some sense, applied after convexification, we show that convexification can be performed after splitting, under the assumption that the KL property is satisfied.

A. Algorithm description

The proposed algorithm is a proximal regularised inexact BCD, which is based on Theorem 6.2 in [1], providing general properties ensuring global convergence to a critical point of the nonsmooth objective, assuming that it satisfies the KL property. The inner minimisation method is presented in Algorithm 2 below. The positive real numbers $\alpha_i^l$ are the regularisation parameters. We assume that
$$\forall i \in \{1,\ldots,N\},\ \exists\ \underline\alpha_i, \bar\alpha_i > 0 \text{ such that } \underline\alpha_i < \alpha_i^l < \bar\alpha_i, \quad \forall l \in \mathbb{N}. \quad (30)$$
The matrices $B_i^l$ are positive definite and need to be chosen carefully in order to guarantee convergence of Algorithm 2, as explained in paragraph V-B.

Algorithm 2 Inexact proximal regularised BCD
Input: Augmented Lagrangian $L_\rho(\cdot, \mu)$, polytopes $\{Z_i\}_{i=1}^N$ and termination tolerance $\epsilon > 0$.
Initialisation: $z_1^0 \in Z_1, \ldots, z_N^0 \in Z_N$, $l \leftarrow 0$.
while $\|z^{l+1} - z^l\| > \epsilon$ do
    for $i \in \{1,\ldots,N\}$ do
        $z_i^{l+1} \leftarrow \underset{z_i \in Z_i}{\operatorname{argmin}}\ \nabla_{z_i} L_\rho(z_1^{l+1},\ldots,z_{i-1}^{l+1}, z_i^l,\ldots,z_N^l, \mu)^\top (z_i - z_i^l) + \frac{1}{2}(z_i - z_i^l)^\top (B_i^l + \alpha_i^l I_{n_i})(z_i - z_i^l)$
    end for
    $l \leftarrow l + 1$
end while

Thus, Algorithm 2 consists in solving convex QPs sequentially. Under appropriate assumptions [4], computations can be partly parallelised. Moreover, computational efficacy can be improved by warm-starting.
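The next snippet is a minimal Python sketch of one possible implementation of Algorithm 2, under two simplifying assumptions that the paper does not make: each $Z_i$ is a box, and $B_i^l + \alpha_i^l I_{n_i}$ is a scalar multiple $c_i$ of the identity (as in the numerical example of Section VI, where $B_i^l = 30 I_d$). Under these assumptions each block QP has the closed-form solution "scaled gradient step, then clip onto the box".

import numpy as np

def bcd_inner_loop(grad_block, z0_blocks, lb, ub, c, eps=1e-6, max_sweeps=500):
    """Sketch of Algorithm 2 for box sets Z_i and scalar scaling matrices c_i * I.

    grad_block(z, i) -> gradient of L_rho(., mu) with respect to block i at the
                        current collection of blocks z (a list of 1-D arrays).
    """
    z = [zi.copy() for zi in z0_blocks]
    for _ in range(max_sweeps):
        z_prev = [zi.copy() for zi in z]
        for i in range(len(z)):                                  # sequential Gauss-Seidel sweep
            g = grad_block(z, i)
            z[i] = np.clip(z[i] - g / c[i], lb[i], ub[i])        # exact solution of the block QP
        step = np.sqrt(sum(np.sum((a - b) ** 2) for a, b in zip(z, z_prev)))
        if step <= eps:                                          # stop when the blocks stop moving
            break
    return z

For a chain-structured coupling such as the one used in Section VI, the odd-indexed and even-indexed blocks could, after appropriate re-ordering [4], be updated in parallel within each sweep.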

B. Convergence analysis

When considering the minimisation of functions $f : \mathbb{R}^{n_1} \times \ldots \times \mathbb{R}^{n_N} \to \mathbb{R} \cup \{+\infty\}$ satisfying the KL property [5] and having the following structure
$$f(z) = \sum_{i=1}^N f_i(z_i) + P(z_1,\ldots,z_N), \quad (31)$$
where the $f_i$ are proper lower semi-continuous and $P$ is twice continuously differentiable, two key assumptions need to be satisfied by the iterates $z^l$ of the inexact BCD scheme in order to ensure global convergence [1]: a sufficient decrease property and a relative error condition. The sufficient decrease property asserts that for all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$,
$$f_i(z_i^{l+1}) + P(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l) + \frac{\gamma_i^l}{2}\|z_i^{l+1} - z_i^l\|^2 \le f_i(z_i^l) + P(z_1^{l+1},\ldots,z_{i-1}^{l+1}, z_i^l,\ldots,z_N^l). \quad (32)$$
The relative error condition states that for all $i \in \{1,\ldots,N\}$, there exists $b_i > 0$ such that for all $l \in \mathbb{N}$, one can find $w_i^{l+1} \in \partial f_i(z_i^{l+1})$ with
$$\left\|w_i^{l+1} + \nabla_{z_i} P(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l)\right\| \le b_i \|z_i^{l+1} - z_i^l\|. \quad (33)$$
This means that at every iteration one can find a vector in the sub-differential of $f_i$ computed at the current iterate, which is bounded by the momentum of the sequence $z^l$. It is actually a key step in order to apply the KL inequality, as done in [1]. For clarity, we state Theorem 6.2 of [1].

Theorem 2 (Convergence of inexact BCD): Let $f$ be a proper lower semi-continuous function with structure (31), satisfying the KL property and bounded from below. Let $\gamma > 0$ be such that $\gamma \le \gamma_i^l$ for all $i \in \{1,\ldots,N\}$ and $l \in \mathbb{N}$. Let $\{z^l\}$ be a sequence satisfying (32) and (33). If $\{z^l\}$ is bounded, then $\{z^l\}$ converges to a critical point of $f$.

Remark 3: Note that $\iota_Z + L_\rho(\cdot, \mu)$ has the structure (31). Our convergence analysis then mainly consists in verifying that the iterates generated by Algorithm 2 satisfy (32) and (33).

The following definitions are necessary for the remainder of the proof: for all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$,
$$\begin{aligned}
S_i^l(z_i) &:= \mu_G^\top G(z_1^{l+1},\ldots,z_{i-1}^{l+1}, z_i, z_{i+1}^l,\ldots,z_N^l) + \frac{\rho}{2}\left\|G(z_1^{l+1},\ldots,z_{i-1}^{l+1}, z_i, z_{i+1}^l,\ldots,z_N^l)\right\|^2, \\
R_i^l(z_i) &:= Q(z_1^{l+1},\ldots,z_{i-1}^{l+1}, z_i, z_{i+1}^l,\ldots,z_N^l) + S_i^l(z_i), \\
\Phi_i(z_i) &:= \mu_i^\top F_i(z_i) + \frac{\rho}{2}\|F_i(z_i)\|^2, \\
L_i^l(z_i) &:= J_i(z_i) + \Phi_i(z_i) + R_i^l(z_i), \\
f_i(z_i) &:= J_i(z_i) + \Phi_i(z_i) + \iota_{Z_i}(z_i), \\
\tilde L_i^l(z_i) &:= L_i^l(z_i) + \iota_{Z_i}(z_i),
\end{aligned} \quad (34)$$
where $\mu_G$ is the subvector of multipliers associated with the equality constraint $G(z_1,\ldots,z_N) = 0$ and $\mu_i$ the subvector associated with $F_i(z_i) = 0$.

Lemma 5 (Bound on Hessian norm): For all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$,
$$\|\nabla^2 L_i^l\|_{\infty, Z_i} \le C_i, \quad (35)$$
where
$$C_i := \|\nabla^2 J_i\|_{\infty, Z_i} + \|\nabla^2 \Phi_i\|_{\infty, Z_i} + \max_{j \in \{1,\ldots,N\}} \|\nabla^2 (S_j^l + Q)\|_{\infty, Z}. \quad (36)$$
Proof: The proof directly follows from the definitions (34) and the fact that the functions involved in (34) are twice continuously differentiable over the compact sets $Z_i$.

Assumption 3: The matrices $B_i^l$ can be chosen so that $B_i^l - C_i I_{n_i} \succeq 0$ and $2 C_i I_{n_i} - B_i^l \succeq 0$, for all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$.

Remark 4: Such an assumption requires knowledge of the Lipschitz constant of the gradient of the augmented Lagrangian. However, a simple backtracking procedure may be applied in practice.
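Remark 4 only mentions the possibility of backtracking; the paper does not spell out a procedure. The following is one plausible sketch, under the same box and scalar-scaling assumptions as the previous snippet and with hypothetical helper names: the curvature estimate c is increased until the descent-lemma bound with constant c holds for the computed block step, which is the inequality used in the proof of Lemma 6 below.

import numpy as np

def backtracked_block_step(L_block, grad_block, z_i, lb_i, ub_i,
                           alpha_i, c0=1.0, growth=2.0, max_tries=30):
    """One block update with backtracking on the curvature constant c (illustrative).

    L_block / grad_block evaluate L_rho(., mu) and its gradient with respect to
    block i, the other blocks being held fixed.
    """
    c, g, f0 = c0, grad_block(z_i), L_block(z_i)
    for _ in range(max_tries):
        z_new = np.clip(z_i - g / (c + alpha_i), lb_i, ub_i)   # block QP with B_i = c * I
        d = z_new - z_i
        # Accept once the quadratic upper bound of the descent lemma holds with constant c.
        if L_block(z_new) <= f0 + g @ d + 0.5 * c * (d @ d) + 1e-12:
            return z_new, c
        c *= growth
    return z_new, c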

Lemma 6 (Sufficient decrease): For all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$,
$$\tilde L_i^l(z_i^{l+1}) + \frac{\alpha_i^l}{2}\|z_i^{l+1} - z_i^l\|^2 \le \tilde L_i^l(z_i^l). \quad (37)$$
Proof: By definition of $z_i^{l+1}$ in Algorithm 2, for all $z_i \in Z_i$,
$$\nabla L_i^l(z_i^l)^\top (z_i^{l+1} - z_i^l) + \frac{1}{2}(z_i^{l+1} - z_i^l)^\top (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l) \le \nabla L_i^l(z_i^l)^\top (z_i - z_i^l) + \frac{1}{2}(z_i - z_i^l)^\top (B_i^l + \alpha_i^l I_{n_i})(z_i - z_i^l), \quad (38)$$
which, by substituting $z_i$ with $z_i^l \in Z_i$, implies that
$$\nabla L_i^l(z_i^l)^\top (z_i^{l+1} - z_i^l) + \frac{1}{2}(z_i^{l+1} - z_i^l)^\top (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l) \le 0. \quad (39)$$
However, by applying the descent Lemma, one gets
$$L_i^l(z_i^{l+1}) \le L_i^l(z_i^l) + \nabla L_i^l(z_i^l)^\top (z_i^{l+1} - z_i^l) + \frac{C_i}{2}\|z_i^{l+1} - z_i^l\|^2. \quad (40)$$
Combining this last inequality with inequality (39), one obtains
$$L_i^l(z_i^{l+1}) \le L_i^l(z_i^l) + \frac{1}{2}(z_i^{l+1} - z_i^l)^\top \left((C_i - \alpha_i^l) I_{n_i} - B_i^l\right)(z_i^{l+1} - z_i^l), \quad (41)$$
which, by Assumption 3, implies that
$$L_i^l(z_i^{l+1}) + \frac{\alpha_i^l}{2}\|z_i^{l+1} - z_i^l\|^2 \le L_i^l(z_i^l). \quad (42)$$
However, $\iota_{Z_i}(z_i^{l+1}) = \iota_{Z_i}(z_i^l) = 0$. Thus,
$$\tilde L_i^l(z_i^{l+1}) + \frac{\alpha_i^l}{2}\|z_i^{l+1} - z_i^l\|^2 \le \tilde L_i^l(z_i^l). \quad (43)$$

Lemma 7 (Relative error condition): For all $i \in \{1,\ldots,N\}$ and all $l \in \mathbb{N}$, there exists $v_i^{l+1} \in \partial f_i(z_i^{l+1})$ such that
$$\left\| v_i^{l+1} + \nabla_{z_i}(Q + S_i^l)(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l) \right\| \le b_i \|z_i^{l+1} - z_i^l\|, \quad (44)$$
where $b_i := 3C_i + \bar\alpha_i$ and $\bar\alpha_i$ is defined in (30).

Proof: Writing the necessary optimality conditions for the $i$-th QP of Algorithm 2, one obtains
$$0 \in \nabla L_i^l(z_i^l) + (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l) + N_{Z_i}(z_i^{l+1}). \quad (45)$$
Thus there exists $w_i^{l+1} \in N_{Z_i}(z_i^{l+1})$ such that
$$0 = w_i^{l+1} + \nabla L_i^l(z_i^l) + (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l) = w_i^{l+1} + \nabla L_i^l(z_i^{l+1}) + \nabla L_i^l(z_i^l) - \nabla L_i^l(z_i^{l+1}) + (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l). \quad (46)$$
The last equality yields
$$w_i^{l+1} + \nabla J_i(z_i^{l+1}) + \nabla F_i(z_i^{l+1})^\top \mu_i + \rho \nabla F_i(z_i^{l+1})^\top F_i(z_i^{l+1}) + \nabla_{z_i}(Q + S_i^l)(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l) = \nabla L_i^l(z_i^{l+1}) - \nabla L_i^l(z_i^l) - (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l), \quad (47)$$
from which it immediately follows, with $v_i^{l+1} := w_i^{l+1} + \nabla J_i(z_i^{l+1}) + \nabla F_i(z_i^{l+1})^\top (\mu_i + \rho F_i(z_i^{l+1})) \in \partial f_i(z_i^{l+1})$, that
$$v_i^{l+1} + \nabla_{z_i}(Q + S_i^l)(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l) = \nabla L_i^l(z_i^{l+1}) - \nabla L_i^l(z_i^l) - (B_i^l + \alpha_i^l I_{n_i})(z_i^{l+1} - z_i^l). \quad (48)$$
However, as $L_i^l \in C^2(Z_i, \mathbb{R})$, one gets
$$\left\| v_i^{l+1} + \nabla_{z_i}(Q + S_i^l)(z_1^{l+1},\ldots,z_i^{l+1}, z_{i+1}^l,\ldots,z_N^l) \right\| \le \left(C_i + \|B_i^l + \alpha_i^l I_{n_i}\|\right)\|z_i^{l+1} - z_i^l\| \le \left(3C_i + \bar\alpha_i\right)\|z_i^{l+1} - z_i^l\|. \quad (49)$$

Now that the two main ingredients are proven, one can state the theorem guaranteeing convergence of the iterates of Algorithm 2 to a critical point of $\iota_Z + L_\rho(\cdot, \mu)$.

Theorem 3 (Convergence of Algorithm 2): The sequence $\{z^l\}$ generated by Algorithm 2 converges to a point $\bar z$ satisfying
$$0 \in \nabla L_\rho(\bar z, \mu) + N_Z(\bar z), \quad (50)$$
or equivalently $d(0, \nabla L_\rho(\bar z, \mu) + N_Z(\bar z)) = 0$.

Proof: The sequence $\{z^l\}$ is obviously bounded, as the iterates are constrained to stay in the polytope $Z$. The function $\iota_Z + L_\rho(\cdot, \mu)$ is bounded from below. Moreover, by Assumption 1, $\iota_Z + L_\rho(\cdot, \mu)$ satisfies the KL property for all $\mu \in \mathbb{R}^m$ and $\rho > 0$. The sequence $\{z^l\}$ satisfies the conditions (32) and (33). The parameter $\gamma$ of Theorem 2 can be taken as $\min_{i \in \{1,\ldots,N\}} \underline\alpha_i$. Convergence to a critical point $\bar z$ of $\iota_Z + L_\rho(\cdot, \mu)$ then follows by applying Theorem 2.

We can then guarantee that, given $\epsilon > 0$, by taking a sufficiently large number of inner iterations, Algorithm 2 converges to a point $z \in Z$ satisfying $d(0, \nabla L_\rho(z, \mu) + N_Z(z)) \le \epsilon$, as needed by Algorithm 1.

VI. NUMERICAL EXAMPLE

Our algorithm is tested on the following class of non-convex NLPs:
$$\begin{array}{rl}
\underset{x_1,\ldots,x_N}{\text{minimise}} & \displaystyle\frac{1}{2}\sum_{i=1}^{N} x_i^\top H_i x_i + \sum_{i=1}^{N-1} x_i^\top H_{i,i+1} x_{i+1} \\
\text{s.t.} & \|x_i\|^2 = a, \quad i \in \{1,\ldots,N\}, \\
& -b \le x_{i,j} \le b, \quad i \in \{1,\ldots,N\},\ j \in \{1,\ldots,d\},
\end{array} \quad (51)$$
where $N$ is the number of agents and $d$ their dimension. The matrices $H_i$ and $H_{i,i+1}$ are indefinite. When applying Algorithm 2 to minimise the augmented Lagrangian associated with (51), the update of agent $i$ only depends on agents $i-1$ and $i+1$, which implies that, after appropriate re-ordering [4], every inner iteration actually consists of two sequential update steps in which $N/2$ QPs are solved in parallel. We fix $d = 3$, $N = 20$, $a = \sqrt{2}\,R$ and $b = 0.6R$ with $R = 2$. The matrices $H_i$ and $H_{i,i+1}$ are randomly generated. For a fixed number of outer iterations, we gradually increase the number of inner iterations and count the number of random problems on which the algorithm ends up within a fixed feasibility tolerance with respect to the nonlinear equality constraints $\|x_i\|^2 = a$. Statistics are reported for 500 random NLPs of the type (51). The initial penalty parameter is set to $\rho^0 = 0.1$ and the growth coefficient to $\beta = 100$. The Hessian matrices $B_i^l$ of Algorithm 2 are set to $30\, I_d$. The algorithm is initialised on random primal and dual initial guesses, feasible with respect to the inequality constraints. Results for feasibility tolerances of $10^{-3}$, $10^{-4}$ and $10^{-6}$ are presented in Figure 1. It clearly appears that obtaining an accurate feasibility margin requires a large number of iterations, yet a reasonable feasibility ($10^{-3}$) can be obtained with approximately a hundred iterations.

[Fig. 1: Percentage of positive problems versus total number of iterations, for a fixed feasibility tolerance on the nonlinear equality constraints. Feasibility margins: $10^{-3}$ in blue, $10^{-4}$ in black and $10^{-6}$ in red.]
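As an illustration of how such test problems can be generated, the following Python sketch builds one random instance of (51) with the dimensions reported above. The sampling of the indefinite matrices (standard normal entries, symmetrised for $H_i$) and the helper names are assumptions made here for concreteness; the paper does not specify the exact generation procedure.

import numpy as np

def random_instance(N=20, d=3, R=2.0, seed=0):
    """Build one random instance of problem (51): objective, constraints and bounds."""
    rng = np.random.default_rng(seed)
    H = [0.5 * (M + M.T) for M in (rng.standard_normal((d, d)) for _ in range(N))]
    H_cpl = [rng.standard_normal((d, d)) for _ in range(N - 1)]   # chain couplings H_{i,i+1}
    a, b = np.sqrt(2.0) * R, 0.6 * R

    def J(x):  # x is an (N, d) array of agent variables
        quad = 0.5 * sum(x[i] @ H[i] @ x[i] for i in range(N))
        cpl = sum(x[i] @ H_cpl[i] @ x[i + 1] for i in range(N - 1))
        return quad + cpl

    def F(x):  # nonlinear equality constraints ||x_i||^2 = a, stacked agent-wise
        return np.array([x[i] @ x[i] - a for i in range(N)])

    return J, F, a, b

J, F, a, b = random_instance()
x0 = np.random.default_rng(1).uniform(-b, b, size=(20, 3))  # feasible w.r.t. the bound constraints
print(J(x0), np.linalg.norm(F(x0)))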
REFERENCES

[1] H. Attouch, J. Bolte, and B.F. Svaiter. Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting and regularised Gauss-Seidel methods. Mathematical Programming, 137:91-129, 2013.
[2] D.P. Bertsekas. Convexification procedures and decomposition methods for nonconvex optimisation problems. Journal of Optimization Theory and Applications, 29:169-197, 1979.
[3] D.P. Bertsekas. Constrained Optimisation and Lagrange Multiplier Methods. Athena Scientific, 1982.
[4] D.P. Bertsekas and J.N. Tsitsiklis. Parallel and Distributed Computation: Numerical Methods. Athena Scientific, Belmont, MA, 1997.
[5] J. Bolte, A. Daniilidis, and A. Lewis. The Lojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems. SIAM Journal on Optimization, 17:1205-1223, 2006.
[6] A.R. Conn, N.I.M. Gould, and P.L. Toint. A globally convergent augmented Lagrangian algorithm for optimisation with general constraints and simple bounds. SIAM Journal on Numerical Analysis, 28(2):545-572, 1991.
[7] A. Hamdi and S.K. Mishra. Decomposition methods based on augmented Lagrangians: a survey. In Topics in Nonconvex Optimization (S.K. Mishra, ed.), 2011.
[8] I. Necoara, V. Nedelcu, and I. Dumitrache. Parallel and distributed optimization methods for estimation and control in networks. Journal of Process Control, 21:756-766, 2011.
[9] I. Necoara, C. Savorgnan, Q. Tran Dinh, J. Suykens, and M. Diehl. Distributed nonlinear optimal control using sequential convex programming and smoothing techniques. In Proceedings of the 48th Conference on Decision and Control, 2009.
[10] R.T. Rockafellar. Augmented Lagrange multiplier functions and duality in nonconvex programming. SIAM Journal on Control and Optimization, 12(2):268-285, 1974.
[11] R.T. Rockafellar and R. Wets. Variational Analysis. Springer, 1998.
[12] A. Tanikawa and H. Mukai. A new technique for nonconvex primal-dual decomposition of a large-scale separable optimisation problem. IEEE Transactions on Automatic Control, 1985.
