Proseminar Optimierung II. Victor A. Kovtunenko SS 2012/2013: LV


Proseminar Optimierung II

Victor A. Kovtunenko

Institute for Mathematics and Scientific Computing, Karl-Franzens University of Graz, Heinrichstr. 36, 8010 Graz, Austria; Lavrent'ev Institute of Hydrodynamics, Siberian Division of the Russian Academy of Sciences, Novosibirsk, Russia

SS 2012/2013: LV

References

[1] D.P. Bertsekas, A. Nedić, A.E. Ozdaglar, Convex Analysis and Optimization. Athena Scientific, Belmont, 2003, 534 pp.
[2] C. Geiger, C. Kanzow, Numerische Verfahren zur Lösung unrestringierter Optimierungsaufgaben. Springer-Verlag, Berlin, 1999, 487 S.
[3] C.T. Kelley, Iterative Methods for Optimization. SIAM, Philadelphia, PA, 1999, 180 pp.
[4] A.M. Khludnev, V.A. Kovtunenko, Analysis of Cracks in Solids. WIT-Press, Southampton, Boston, 2000, 408 pp.
[5] D.G. Luenberger, Y. Ye, Linear and Nonlinear Programming. Springer, New York, 2008, 546 pp.
[6] J. Nocedal, S.J. Wright, Numerical Optimization. Springer-Verlag, New York, 1999, 636 pp.
[7] В.М. Алексеев, Э.М. Галеев, В.М. Тихомиров, Сборник задач по оптимизации. Москва: Наука, 1984, 288 с.

1. Linear programs

Linear programs (LP) are stated in the canonical (standard) form:

(1.1) min c^T x over x ∈ R^n subject to Ax = b, x ≥ 0,

where c ∈ R^n, b ∈ R^m, A ∈ R^(m×n), and m < n. The feasible set is defined as

(1.2) F = {x ∈ R^n : Ax = b, x ≥ 0},

and its boundary corresponds to the active set

(1.3) A = {x ∈ R^n : Ax = b, x_j = 0 for some j}.

The fundamental theorem of LP states that optimal solutions of (1.1) occur along A at extreme (corner) points of F.

Exercise 1.1. Rewrite the following LP in the standard form:

min{x_1 + x_2 + 3x_3} over x ∈ R³ subject to 4 ≤ x_1 + x_2 ≤ 5, 6 ≤ x_1 + x_3 ≤ 7, x_1 ≥ 0, x_2 ≥ 0, x_3 ≥ 0,

and determine A, b, c in (1.1). Hint: introduce slack variables x_4 ≥ 0, x_5 ≥ 0, x_6 ≥ 0, x_7 ≥ 0 to satisfy the non-standard inequality constraints.

Exercise 1.2. Rewrite the following LP in the standard form:

min{x + y + z} over (x, y, z) ∈ R³ subject to x + y ≤ 1, x + z = 3,

and determine A, b, c in (1.1). Hint: introduce non-negative variables (x_1, ..., x_6) by x = x_1 − x_2, y = x_3 − x_4, z = x_5 − x_6.

Exercise 1.3. Convert the ℓ1-minimization problem to LP:

min ‖x‖_1 over x ∈ R^n subject to Ax = b.

Hint: use the inequality −y ≤ x ≤ y with an auxiliary vector y ∈ R^n.

Exercise 1.4. Given the constraints:

x_1 + x_2 ≤ 16, x_1 + 2x_2 ≤ 12, x_1 + x_2 ≥ 2, x_1 ≥ 0, x_2 ≥ 0,

a) plot the feasible set F and determine the extreme (corner) points, b) plot the active set A.

Exercise 1.5. Consider the feasible set F ⊂ R² satisfying the following inequalities:

x_1 ≥ 0, x_2 ≥ 0, x_2 − x_1 ≤ 2, x_1 + x_2 ≤ 7.

a) Plot F and list its extreme points, b) compute min{x_1 − x_2} over x ∈ R² subject to x ∈ F, c) compute max{x_1 − x_2} over x ∈ R² subject to x ∈ F. Hint: apply the fundamental theorem of LP.
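The fundamental theorem suggests a brute-force method for small LPs: enumerate the corner points of F and compare objective values there. A minimal Python sketch of that idea; the feasible set below is a made-up illustration, not one of the exercises:

```python
import itertools
import numpy as np

# Feasible set in inequality form A x <= b (a made-up example):
# x1 >= 0, x2 >= 0, x1 + x2 <= 4, x1 <= 3.
A = np.array([[-1.0, 0.0],   # -x1 <= 0
              [0.0, -1.0],   # -x2 <= 0
              [1.0, 1.0],    #  x1 + x2 <= 4
              [1.0, 0.0]])   #  x1 <= 3
b = np.array([0.0, 0.0, 4.0, 3.0])
c = np.array([1.0, -1.0])    # minimize c . x = x1 - x2

# Candidate extreme points: intersections of pairs of constraint boundaries.
vertices = []
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:      # parallel boundaries: no vertex
        continue
    x = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ x <= b + 1e-9):          # keep only feasible intersections
        vertices.append(x)

# Fundamental theorem of LP: the minimum is attained at a corner point of F.
best = min(vertices, key=lambda x: c @ x)
```

The same enumeration applies to Exercises 1.4 and 1.5 once each inequality is written as a row of A and b.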

2. Nonlinear programs

We consider nonlinear programs (NLP) of the general form:

(2.1) min J(x) over x ∈ R^n subject to e(x) = 0, g(x) ≤ 0,

where J : R^n → R, e : R^n → R^m, e = (e_1, ..., e_m), m < n, and g : R^n → R^p, g = (g_1, ..., g_p). The feasible set is defined as

(2.2) F = {x ∈ R^n : e(x) = 0, g(x) ≤ 0},

and its boundary corresponds to the active set

(2.3) A = {x ∈ R^n : e(x) = 0, g_j(x) = 0 for some j}.

A vector x* ∈ F is called a global solution of (2.1) if J(x*) ≤ J(x) for all x ∈ F, respectively, a local solution if there is a neighborhood U(x*) ⊂ F such that J(x*) ≤ J(x) for all x ∈ U(x*).

Exercise 2.1. In R² consider the constraints: x_1 ≥ 0, x_2 ≥ 0, x_2 − (x_1 − 1)² ≤ 0, x_1 ≤ 2. Plot the feasible set F in (2.2) and show that x* = (1, 0) is feasible but not regular. Hint: a point x* at the hyperplane E = {x ∈ R^n : e_1(x) = 0, ..., e_k(x) = 0} is regular if the vectors ∇e_1(x*), ..., ∇e_k(x*) are linearly independent (called the linear independence constraint qualification (LICQ)).

Exercise 2.2. Let F be a convex set in R^n. This implies that for all x, y ∈ F and t ∈ [0, 1] the points tx + (1 − t)y ∈ F. Prove that, for nonnegative weights c_1, ..., c_k ∈ R such that Σ_{i=1}^k c_i = 1, if x_1, ..., x_k ∈ F then Σ_{i=1}^k c_i x_i ∈ F. Hint: use induction over k.

Exercise 2.3. Determine whether the following functions defined on the positive orthant R²₊:

a) J(x) = x_1 x_2, b) J(x) = 1/(x_1 x_2), c) J(x) = x_1(ln x_1 − 1) + x_2(ln x_2 − 1) − x_1 x_2

are convex, concave, or neither. Hint: a twice differentiable function J is convex on a convex set if its Hessian matrix D(∇J) is positive semidefinite.

Exercise 2.4. If the objective function J is convex on the convex feasible set F, prove that local solutions of (2.1) are also global solutions.

Exercise 2.5. Consider the problem: min{x_1² + x_2²} subject to x_1 + x_2 = 1.

a) Solve the problem by eliminating the variable x_2.
b) Plot F and find the tangent plane at the feasible point x* = (1/2, 1/2). Hint: at a regular point x* of the hyperplane E the tangent plane (tangent cone) is

(2.4) T(x*) = {v ∈ R^n : De(x*) v = 0} with De := (∇e_1, ..., ∇e_k)^T.

c) Find (the Lagrange multiplier) λ ∈ R such that ∇J(x*) = −De(x*)^T λ.
d) Illustrate this problem geometrically.
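The Hessian criterion from the hint to Exercise 2.3 lends itself to a quick numerical spot-check (not a proof). A sketch for the entropy-type function J(x) = x_1(ln x_1 − 1) + x_2(ln x_2 − 1), whose Hessian happens to be diagonal:

```python
import numpy as np

# Spot-check of the Hessian convexity criterion for
# J(x) = x1*(ln x1 - 1) + x2*(ln x2 - 1) on the positive orthant.
def hessian(x):
    # For this J the Hessian is diagonal: d2J/dxi2 = 1/xi, mixed terms vanish.
    return np.diag(1.0 / x)

rng = np.random.default_rng(0)
all_psd = True
for _ in range(100):
    x = rng.uniform(0.1, 10.0, size=2)          # random points in R^2_+
    eigvals = np.linalg.eigvalsh(hessian(x))
    all_psd = all_psd and bool(np.all(eigvals >= 0))
```

Every sampled Hessian is positive semidefinite, consistent with convexity of this J; the exercise still asks for the analytic argument.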

3. Equality constrained optimization

For the equality constrained minimization problem

(3.1) min J(x) over x ∈ R^n subject to {e_1(x) = 0, ..., e_m(x) = 0},

the Lagrange function (Lagrangian) is defined by

(3.2) L : R^(n+m) → R, L(x, λ) = J(x) + e(x)^T λ

with the Lagrange multipliers (dual variables) λ ∈ R^m. The first order necessary condition of optimality implies a stationary point (x*, λ*) solving

(3.3) ∇_(x,λ) L(x*, λ*) = 0 ⟺ {∇J(x*) + De(x*)^T λ* = 0, e(x*) = 0},

and the second order sufficient condition along the tangent plane T(x*) reads

(3.4) v^T D(∇L)(x*, λ*) v > 0 for all v ∈ T(x*),

where D(∇L) : R^(n+m) → R^(n×n) is the Hessian matrix (Hessian).

Exercise 3.1. Solve using Lagrange multipliers: min{−x³ − y³} subject to x + y = 1.

Exercise 3.2. Consider the following minimization problem:

min{(1/2)x_1² − (1/2)x_2² + (1/2)x_3² + x_1 x_2} subject to x_1 + x_2 = 4, x_3 − 9x_2 = 0.

a) Construct the Lagrangian and find the stationary point (x*, λ*) of this problem. b) Check a second order sufficient condition along the tangent plane T(x*).

Exercise 3.3. If a, b, c are positive real numbers, prove that the inequality a³ + b³ + c³ ≥ 3abc holds. Hint: it is enough to prove that, if 3abc = K (with arbitrarily fixed K ≥ 0), then min{a³ + b³ + c³} = K.

Exercise 3.4. A box with sides x, y and z is to be manufactured such that its top (xy), bottom (xy), and front (xz) faces must be doubled. Find the dimensions of such a box that maximize the volume xyz for a given face area, equal to 18 dm². Hint: use first order and verify second order conditions.

Exercise 3.5. Find an oriented triangle of maximal area such that one vertex is (x_3, y_3) = (1, 0) and the other two vertices (x_1, y_1), (x_2, y_2) lie on the unit circle in R². Hint: use the area formula

S = (1/2) det [ x_2 − x_1, x_3 − x_1 ; y_2 − y_1, y_3 − y_1 ].
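For quadratic objectives with linear equality constraints, the stationarity system (3.3) is linear and can be solved directly. A hedged sketch for the model problem min{x_1² + x_2²} subject to x_1 + x_2 = 1 (chosen here only for illustration):

```python
import numpy as np

# Stationarity of L(x, lam) = x1^2 + x2^2 + lam*(x1 + x2 - 1):
# grad_x L = 0 and e(x) = 0 form one linear system in (x1, x2, lam):
#   [ 2 0 1 ] [x1 ]   [0]
#   [ 0 2 1 ] [x2 ] = [0]
#   [ 1 1 0 ] [lam]   [1]
K = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
rhs = np.array([0.0, 0.0, 1.0])
x1, x2, lam = np.linalg.solve(K, rhs)
```

The solution x* = (1/2, 1/2) with λ* = −1 satisfies 2x_i* + λ* = 0, the first order condition.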

4. Inequality constrained optimization

For the general, equality and inequality constrained minimization problem

(4.1) min J(x) over x ∈ R^n subject to e(x) = 0, g(x) ≤ 0,

the Lagrange function (Lagrangian) is defined by

(4.2) L : R^(n+m+p) → R, L(x, λ, µ) = J(x) + e(x)^T λ + g(x)^T µ

with the Lagrange multipliers (λ, µ) ∈ R^(m+p). The first order necessary optimality condition implies the Karush–Kuhn–Tucker (KKT) conditions:

(4.3a) ∇_(x,λ) L(x*, λ*, µ*) = 0 ⟺ {∇J(x*) + De(x*)^T λ* + Dg(x*)^T µ* = 0, e(x*) = 0},
(4.3b) µ* ≥ 0, g(x*) ≤ 0, g_i(x*) µ_i* = 0 for i = 1, ..., p.

The second order sufficient condition along the tangent plane T(x*) reads

(4.4) v^T D(∇L)(x*, λ*, µ*) v > 0 for all v ∈ T(x*).

Exercise 4.1. Solve using the KKT conditions: min{3x − x³} over x ∈ R subject to x ≤ 2, and plot the objective function on the feasible set.

Exercise 4.2. Solve using the KKT conditions: min{e^(−x_1)} over x ∈ R² subject to e^(x_1) + x_2 = 7, x_1 ≥ 0, x_2 ≥ 0.

Exercise 4.3. Verify that the complementarity conditions (4.3b) follow from the variational inequality

(4.5) µ* ≥ 0, ∇_µ L(x*, λ*, µ*)^T (µ − µ*) ≤ 0 for all {µ ∈ R^p : µ ≥ 0}.

Hint: plug in µ = (µ_1*, ..., tµ_i*, ..., µ_p*) with arbitrary t > 0 and i ∈ {1, ..., p}.

Exercise 4.4. Let (x*, λ*, µ* ≥ 0) be a solution of the minimax problem:

(4.6) min_(x,λ) max_µ L(x, λ, µ) over (x, λ, µ) ∈ R^(n+m+p) subject to µ ≥ 0.

(a) Verify that this solution satisfies the KKT conditions (4.3). (b) From (4.6) derive that x* is feasible and that it solves (4.1).

Exercise 4.5. Consider the following minimization problem: min{(x − 2)² + (y − 1)²} subject to x + 4y ≤ 3 and x ≥ y. Solve this problem using the KKT conditions.
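The KKT conditions (4.3) can be verified numerically at a candidate point. A sketch for the made-up problem min{x_1² + x_2²} subject to g(x) = 1 − x_1 − x_2 ≤ 0, whose solution is x* = (1/2, 1/2) with multiplier µ* = 1:

```python
import numpy as np

# Candidate primal-dual pair for min x1^2 + x2^2 s.t. 1 - x1 - x2 <= 0.
x_star = np.array([0.5, 0.5])
mu_star = 1.0

grad_J = 2.0 * x_star                  # gradient of the objective at x*
grad_g = np.array([-1.0, -1.0])        # gradient of g(x) = 1 - x1 - x2
g = 1.0 - x_star.sum()                 # constraint value at x*

stationarity = grad_J + mu_star * grad_g          # (4.3a): should vanish
primal_feasible = g <= 1e-12                      # (4.3b): g(x*) <= 0
dual_feasible = mu_star >= 0                      # (4.3b): mu* >= 0
complementary = abs(mu_star * g) <= 1e-12         # (4.3b): g(x*)*mu* = 0
```

All four conditions hold, confirming (x*, µ*) as a KKT point of this example.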

5. Regularity and sensitivity

Exercise 5.1. Consider the minimization problem: min{−x_1} over x ∈ R² subject to x_1² ≤ x_2 and x_2 ≤ 0. Find the global solution of this problem and verify that there are no KKT points. Hint: consider the feasible set and check its regularity.

Exercise 5.2. For the problem: min{y − x} over R² subject to y ≥ (1 − x)³, x + y ≤ 1, y ≥ 0, verify that the objective function is convex and that the feasible set is convex and has a non-empty interior (Slater condition). This guarantees that the KKT point is the solution of the convex minimization problem. Find this solution. Check a second order sufficient condition as well.

Exercise 5.3. Consider the problem: min{4x_1 + x_2} over x ∈ R² subject to x_1² + x_2² ≥ 9, x_1 ≥ 0, x_2 ≥ 0. (a) Verify that the feasible set F is non-convex (hence, a KKT point may fail to be the solution of the minimization problem). (b) Since the objective function is linear, find the global solution of the problem at the boundary (the active set A) of F. (c) Find all solutions of the KKT system at A and determine the kind of extremum at each.

Exercise 5.4. Consider the following minimization problem: min{αx_2² − 2x_2 + 5x_1} over x ∈ R² subject to x_1 ≥ 0. Using first and second order optimality conditions find its solution in dependence of the parameter α ∈ R.

Exercise 5.5. Consider the problem: min{(x − 1)² + y²} over R² subject to x ≤ βy². For what values of the parameter β ∈ R is its KKT point a local solution? Hint: use a second order sufficient condition.
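Failure of the KKT conditions at a degenerate point can also be seen numerically. A brute-force sketch for the made-up problem min{−x_1} subject to x_1² − x_2 ≤ 0 and x_2 ≤ 0, whose only feasible point is (0, 0):

```python
import numpy as np

# At x* = (0, 0) the constraint gradients are (0, -1) and (0, 1): they are
# linearly dependent (regularity fails), and no mu >= 0 can cancel grad J.
grad_J = np.array([-1.0, 0.0])
grad_g1 = np.array([0.0, -1.0])   # gradient of x1^2 - x2 at (0, 0)
grad_g2 = np.array([0.0, 1.0])    # gradient of x2

# Scan a grid of nonnegative multipliers; the stationarity residual
# grad J + mu1*grad g1 + mu2*grad g2 never vanishes (its first component
# is always -1), so the KKT system has no solution.
mus = np.linspace(0.0, 100.0, 201)
min_residual = min(np.linalg.norm(grad_J + m1 * grad_g1 + m2 * grad_g2)
                   for m1 in mus for m2 in mus)
```

The scan is only illustrative; the analytic argument is that the first component of the residual is independent of the multipliers.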

6. Quadratic programs

We consider quadratic programs (QP) as NLP with the specific (quadratic) objective function J(x) = (1/2) x^T Q x + d^T x, where the matrix Q ∈ R^(n×n) is symmetric positive definite (spd) and d ∈ R^n.

Exercise 6.1. Verify that, for Q ∈ spd(R^(n×n)), the quadratic objective function J is convex by the definition: J(tx + (1 − t)y) ≤ tJ(x) + (1 − t)J(y). Hint: use the Cauchy–Schwarz inequality x^T Q y ≤ (1/2) x^T Q x + (1/2) y^T Q y.

Exercise 6.2. Consider the quadratic minimization problem:

(6.1) min{(1/2) x^T Q x + d^T x} subject to Ax = b,

where A ∈ R^(m×n) and b ∈ R^m. Prove that x* is a local solution of the problem if and only if it is a global solution.

Exercise 6.3. The problem of finding the shortest distance from a point x_0 ∈ R^n to the hyperplane {x ∈ R^n : Ax = b} can be formulated as

min{(1/2)(x − x_0)^T (x − x_0)} subject to Ax = b.

(a) Verify that the problem is of the form (6.1) and determine Q and d. (b) Show that the matrix AA^T is nonsingular if A has full rank. (c) Show that the stationary point is x* = x_0 − A^T λ* with λ* = (AA^T)^(−1)(Ax_0 − b). (d) Show that the stationary point is the optimal solution.

Exercise 6.4. For given data points (x_1, y_1), ..., (x_n, y_n) in R², the linear regression problem consists in fitting the line y = d + kx such as to minimize the residuals:

min{(1/2) Σ_{i=1}^n (d + kx_i − y_i)²} over (d, k) ∈ R².

(a) Rewrite the problem in the following form of an unconstrained QP: min{(1/2)(Az − b)^T (Az − b)} over z ∈ R². (b) Using optimality derive the normal equations: A^T A z = A^T b. (c) Solving these two equations verify the formulas:

k = (S_xy − x̄ ȳ)/(S_xx − x̄²), d = ȳ − k x̄,

written in the statistical terms S_xx = (1/n) Σ_{i=1}^n x_i², S_xy = (1/n) Σ_{i=1}^n x_i y_i, x̄ = (1/n) Σ_{i=1}^n x_i, ȳ = (1/n) Σ_{i=1}^n y_i.

Exercise 6.5. Consider the problem of finding the point on the parabola y = (1/5)(x − 1)² that is closest to (x_0, y_0) = (1, 2). This can be formulated as

min{(1/2)(x − 1)² + (1/2)(y − 2)²} subject to (x − 1)² = 5y.

Find the stationary point of the problem and show that it is the minimum point using a second order condition.
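The two routes of Exercise 6.4 (normal equations versus the statistical formulas) can be compared in a few lines of NumPy. The data below is made up and lies exactly on the line y = 1 + 2x, so both routes must return d = 1 and k = 2:

```python
import numpy as np

# Fit y = d + k*x by least squares via the normal equations A^T A z = A^T b.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])          # exactly y = 1 + 2x

A = np.column_stack([np.ones_like(x), x])   # design matrix, unknowns z = (d, k)
d, k = np.linalg.solve(A.T @ A, A.T @ y)

# Statistical form of the same solution (part (c) of the exercise).
S_xx, S_xy = np.mean(x * x), np.mean(x * y)
xbar, ybar = x.mean(), y.mean()
k_stat = (S_xy - xbar * ybar) / (S_xx - xbar**2)
d_stat = ybar - k_stat * xbar
```

Both computations agree, which is the content of part (c).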

7. Complementarity

Exercise 7.1. The Euclidean projection P_S x_0 of a point x_0 ∈ R^n onto the simplex S = {x ∈ R^n : x ≥ 0, x^T e = 1} (e = (1, ..., 1)^T) solves the problem:

min{(1/2)‖x − x_0‖²} over x ∈ R^n subject to x ∈ S.

Verify the formula P_S x_0 = max(0, x_0 − λe), where λ ∈ R is the Lagrange multiplier associated to the constraint x^T e = 1.

Exercise 7.2. For the maximization of entropy find the solution:

max{−(x_1 log(x_1) + ... + x_n log(x_n))} over x ∈ R^n subject to x ∈ S.

Exercise 7.3. φ : R² → R is called a nonlinear complementarity problem (NCP) function if it satisfies

φ(x, y) = 0 ⟺ x ≥ 0, y ≥ 0, xy = 0.

Verify that the following are NCP functions:

φ_min(x, y) = min(x, y), φ_max(x, y) = y − max(0, y − cx) with c > 0, φ_FB(x, y) = x + y − √(x² + y²) (Fischer–Burmeister).

Exercise 7.4. Consider an abstract inequality constrained minimization

(7.1) min J(x) over x ∈ R^n subject to Ax = b, x ≥ 0.

(a) Using the KKT conditions and an NCP function φ derive the NCP

(7.2) Φ(x*, λ*, µ*) := ( ∇J(x*) + A^T λ* − µ* ; Ax* − b ; φ(x_i*, µ_i*), i = 1, ..., n ) = 0.

(b) If there exists an index i such that both x_i* = µ_i* = 0, then prove that the matrix of derivatives D_(x,λ,µ) Φ(x*, λ*, µ*) is singular. (c) Using φ_max rewrite φ_max(x_i*, µ_i*) = 0 as the linear system

x_i* = 0 for i ∈ A(x*, µ*), µ_i* = 0 for i ∈ I(x*, µ*)

with respect to the primal-dual strictly active A and inactive I sets of indexes

A(x*, µ*) = {i : µ_i* − c x_i* > 0}, I(x*, µ*) = {i : µ_i* − c x_i* ≤ 0}.

(d) On this basis suggest iterations solving the nonlinear problem (7.2).

Exercise 7.5. Consider the inequality constrained optimization

min{(1/2)(x_1² + (x_2 − 2)² + (x_3 + 3)²)} subject to x_1 ≥ 0, x_2 ≥ 0, x_3 ≥ 0.

Initializing A^(−1) = ∅, I^(−1) = {1, 2, 3}, iterate the active and inactive sets

A^(k) = {i : µ_i^(k) − c x_i^(k) > 0}, I^(k) = {i : µ_i^(k) − c x_i^(k) ≤ 0}

together with x_i^(k) = 0 for i ∈ A^(k−1) and µ_i^(k) = 0 for i ∈ I^(k−1). From the iteration find a solution of the KKT system for this problem.
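The projection formula of Exercise 7.1 reduces to a scalar equation for λ, since max(0, x_0 − λe) must sum to one. One simple (by no means the only) way to find λ is bisection, since the sum is monotone in λ. A hedged sketch:

```python
import numpy as np

# Euclidean projection onto the simplex S = {x >= 0, sum(x) = 1} via the
# formula P_S x0 = max(0, x0 - lam*e), with lam located by bisection.
def project_simplex(x0, tol=1e-12):
    lo, hi = x0.min() - 1.0, x0.max()        # bracket: s(lo) > 1 > s(hi)
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        s = np.maximum(0.0, x0 - lam).sum()  # monotone decreasing in lam
        if s > 1.0:
            lo = lam
        else:
            hi = lam
    return np.maximum(0.0, x0 - 0.5 * (lo + hi))

p = project_simplex(np.array([0.9, 0.5, -0.2]))
```

For this made-up data the multiplier is λ = 0.2 and the projection is (0.7, 0.3, 0), which indeed lies on the simplex.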

8. Approximations

Exercise 8.1. Let x* = argmin{J(x)} over x ∈ R^n subject to e(x) = 0, g(x) ≤ 0. Verify that there exists a constant c > 0 such that x* is also a minimizer of the penalty problem

min{J(x) + c(‖e(x)‖_1 + ‖max(0, g(x))‖_1)} over x ∈ R^n.

Hint: use the fact that the associated Lagrangian L satisfies L(x*, λ*, µ*) ≤ L(x, λ*, µ*) for all x ∈ R^n.

Exercise 8.2. (a) For the constrained minimization problem

min{J(x) = (x_1 − 6)² + (x_2 − 7)²} over x ∈ R² subject to x_1 + x_2 ≤ 7,

find the solution x* and the associated Lagrange multiplier λ*. (b) Consider the penalty problem

min{J_α(x) = J(x) + α (max(0, x_1 + x_2 − 7))²} over x ∈ R²

and find its solution x^α in dependence of the parameter α > 0. Is x^α interior or exterior to the feasible set? (c) Verify that x^α → x* and 2α max(0, x_1^α + x_2^α − 7) → λ* as α → ∞.

Exercise 8.3. (a) Find a solution x* of the constrained minimization problem

min{J(x) = x_1² + 9x_2²} over x ∈ R² subject to x_1 + x_2 ≥ 4.

(b) Consider the associated inverse barrier function J_τ(x) = J(x) + τ/(x_1 + x_2 − 4) in dependence of the parameter τ > 0. Find the solution x^τ of the unconstrained problem: min{J_τ(x)} over x ∈ R², and compare x^τ with x*.

Exercise 8.4. For τ > 0 consider the log-barrier constrained minimization

(8.1) min{J(x) − τ Σ_{i=1}^n log(x_i)} over x ∈ R^n subject to Ax = b, x > 0.

(a) Verify that its KKT conditions lead to the interior point system

(8.2) ∇J(x) + A^T λ − ν = 0, Ax − b = 0, x_i ν_i = τ for i = 1, ..., n, x > 0, ν > 0.

(b) Verify that, conversely, if the interior point system (8.2) is solvable, then its solution satisfies the KKT system for (8.1).

Exercise 8.5. For the inequality constrained optimization problem

min{J(x) = 5x_1 + x_2} over x ∈ R² subject to x_1 ≥ 1, x_2 ≥ 0,

write the interior point system and find its solution in dependence of the parameter τ > 0. Passing τ → 0, find a solution of the KKT system for this problem.
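The penalty mechanism of Exercise 8.2 can be traced explicitly in one dimension. A made-up example: min{(x − 2)²} subject to x ≤ 1, whose solution is x* = 1 with multiplier λ* = 2; the penalized minimizer has a closed form:

```python
# Quadratic-penalty sketch for min (x-2)^2 subject to x <= 1.
# Minimizing (x-2)^2 + alpha*max(0, x-1)^2 gives, from the first order
# condition 2(x-2) + 2*alpha*(x-1) = 0 on the region x > 1,
#   x_alpha = (2 + alpha)/(1 + alpha).
def x_alpha(alpha):
    return (2.0 + alpha) / (1.0 + alpha)

alphas = [1.0, 10.0, 100.0, 1000.0]
xs = [x_alpha(a) for a in alphas]

# The scaled penalty slope 2*alpha*max(0, x_alpha - 1) estimates lambda*.
mus = [2.0 * a * max(0.0, x_alpha(a) - 1.0) for a in alphas]
```

The iterates x^α stay exterior to the feasible set and approach x* = 1, while the multiplier estimates approach λ* = 2, mirroring parts (b) and (c) of the exercise.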

9. Simplex method

For the LP: min{c^T x} over x ∈ R^n subject to Ax = b, x ≥ 0 with A ∈ R^(m×n), partitioning A = (B, D), x = (x_B, x_D), c = (c_B, c_D) implies the system

I x_B + B^(−1) D x_D = B^(−1) b, (c_D^T − c_B^T B^(−1) D) x_D = c^T x − c_B^T B^(−1) b,

expressed by the tableau

T := ( I   B^(−1)D   |   B^(−1)b
       0   c_D^T − c_B^T B^(−1)D =: r^T   |   −c_B^T B^(−1)b ),

and follows the simplex algorithm. Repeat until r ≥ 0: select

j = argmin_{k ∈ {1,...,n}} {r_k : r_k < 0},
i = argmin_{k ∈ {1,...,m}} {(B^(−1)b)_k / (B^(−1)D)_kj : (B^(−1)b)_k / (B^(−1)D)_kj ≥ 0},

and pivot on (B^(−1)D)_ij. If all (B^(−1)b)_k / (B^(−1)D)_kj < 0, then the problem is unbounded.

Exercise 9.1. Using the simplex algorithm, iterate x_B = B^(−1)b from T for

max{5x_1 + 2x_2} over x ∈ R² subject to 4x_1 + 3x_2 ≤ 12, x_1 + 3x_2 ≤ 6, x_1 ≥ 0, x_2 ≥ 0.

Sketch the feasible set and the path of the simplex steps stopping at the solution of this LP. Are the iterates x_B feasible?

Exercise 9.2. Using the simplex algorithm, solve the following LP:

max{x_1 + x_2} over x ∈ R² subject to x_1 + x_2 ≤ 1, x_1 − x_2 ≤ 1, x_1 ≥ 0, x_2 ≥ 0.

Sketch the feasible set and the path of the simplex steps.

Exercise 9.3. Find all solutions of the LP:

max{x_1 + x_2 + x_3} over x ∈ R³ subject to x_1 + x_2 + x_3 ≤ 2, x_1 + 4x_2 + x_3 ≤ 4, x_1 ≥ 0, ..., x_3 ≥ 0.

Exercise 9.4. Find an optimal solution of the problem:

min{−x_1 − 3x_2 − 5x_3} over x ∈ R³ subject to x_1 ≥ 0, ..., x_3 ≥ 0, 3x_1 − x_2 + 2x_3 ≤ 7, x_1 + 4x_2 ≤ 12, 4x_1 + 3x_2 + 3x_3 ≤ 22.

Exercise 9.5. Consider the following LP:

min{5x_1 + 3x_2} over x ∈ R³ subject to x_1 ≥ 0, ..., x_3 ≥ 0, x_1 − x_2 + 4x_3 ≥ 4, x_1 + x_2 + x_3 ≥ 5, x_1 − x_2 + x_3 ≥ 1.

Starting from the pivot element (3, 3), solve the problem with the (dual) simplex algorithm.
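One building block of the tableau, the reduced cost vector r = c_D − c_B^T B^(−1)D, is easy to compute with NumPy. A sketch for a small made-up standard-form LP: the slack-variable form of max{5x_1 + 2x_2} with 4x_1 + 3x_2 ≤ 12 and x_1 + 3x_2 ≤ 6, written as a minimization by negating the objective:

```python
import numpy as np

# Standard form min c.x, Ax = b, x >= 0 with slacks x3, x4 as initial basis.
A = np.array([[4.0, 3.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([12.0, 6.0])
c = np.array([-5.0, -2.0, 0.0, 0.0])   # negated, since we maximize 5x1 + 2x2

basis = [2, 3]                          # columns of the slack variables
B = A[:, basis]
D = A[:, [0, 1]]

x_B = np.linalg.solve(B, b)             # current basic solution B^{-1} b
r = c[[0, 1]] - c[basis] @ np.linalg.solve(B, D)   # reduced costs on x1, x2
entering = int(np.argmin(r))            # most negative reduced cost enters
```

At the all-slack basis both reduced costs are negative, and the most negative one (the column of x_1) determines the entering variable, exactly the selection rule stated above.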


Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

STAT 309: MATHEMATICAL COMPUTATIONS I FALL 2018 LECTURE 16

STAT 309: MATHEMATICAL COMPUTATIONS I FALL 2018 LECTURE 16 STAT 39: MATHEMATICAL COMPUTATIONS I FALL 218 LECTURE 16 1 why teratve methods f we have a lnear system Ax = b where A s very, very large but s ether sparse or structured (eg, banded, Toepltz, banded plus

More information

The Geometry of Logit and Probit

The Geometry of Logit and Probit The Geometry of Logt and Probt Ths short note s meant as a supplement to Chapters and 3 of Spatal Models of Parlamentary Votng and the notaton and reference to fgures n the text below s to those two chapters.

More information

Report on Image warping

Report on Image warping Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.

More information

PROBLEM SET 7 GENERAL EQUILIBRIUM

PROBLEM SET 7 GENERAL EQUILIBRIUM PROBLEM SET 7 GENERAL EQUILIBRIUM Queston a Defnton: An Arrow-Debreu Compettve Equlbrum s a vector of prces {p t } and allocatons {c t, c 2 t } whch satsfes ( Gven {p t }, c t maxmzes βt ln c t subject

More information

Solving the Quadratic Eigenvalue Complementarity Problem by DC Programming

Solving the Quadratic Eigenvalue Complementarity Problem by DC Programming Solvng the Quadratc Egenvalue Complementarty Problem by DC Programmng Y-Shua Nu 1, Joaqum Júdce, Le Th Hoa An 3 and Pham Dnh Tao 4 1 Shangha JaoTong Unversty, Maths Departement and SJTU-Parstech, Chna

More information

( ) 2 ( ) ( ) Problem Set 4 Suggested Solutions. Problem 1

( ) 2 ( ) ( ) Problem Set 4 Suggested Solutions. Problem 1 Problem Set 4 Suggested Solutons Problem (A) The market demand functon s the soluton to the followng utlty-maxmzaton roblem (UMP): The Lagrangean: ( x, x, x ) = + max U x, x, x x x x st.. x + x + x y x,

More information

Math1110 (Spring 2009) Prelim 3 - Solutions

Math1110 (Spring 2009) Prelim 3 - Solutions Math 1110 (Sprng 2009) Solutons to Prelm 3 (04/21/2009) 1 Queston 1. (16 ponts) Short answer. Math1110 (Sprng 2009) Prelm 3 - Solutons x a 1 (a) (4 ponts) Please evaluate lm, where a and b are postve numbers.

More information

w ). Then use the Cauchy-Schwartz inequality ( v w v w ).] = in R 4. Can you find a vector u 4 in R 4 such that the

w ). Then use the Cauchy-Schwartz inequality ( v w v w ).] = in R 4. Can you find a vector u 4 in R 4 such that the Math S-b Summer 8 Homework #5 Problems due Wed, July 8: Secton 5: Gve an algebrac proof for the trangle nequalty v+ w v + w Draw a sketch [Hnt: Expand v+ w ( v+ w) ( v+ w ) hen use the Cauchy-Schwartz

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Lecture 11. minimize. c j x j. j=1. 1 x j 0 j. +, b R m + and c R n +

Lecture 11. minimize. c j x j. j=1. 1 x j 0 j. +, b R m + and c R n + Topcs n Theoretcal Computer Scence May 4, 2015 Lecturer: Ola Svensson Lecture 11 Scrbes: Vncent Eggerlng, Smon Rodrguez 1 Introducton In the last lecture we covered the ellpsod method and ts applcaton

More information

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve

More information

Assortment Optimization under MNL

Assortment Optimization under MNL Assortment Optmzaton under MNL Haotan Song Aprl 30, 2017 1 Introducton The assortment optmzaton problem ams to fnd the revenue-maxmzng assortment of products to offer when the prces of products are fxed.

More information

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB

More information

Estimation: Part 2. Chapter GREG estimation

Estimation: Part 2. Chapter GREG estimation Chapter 9 Estmaton: Part 2 9. GREG estmaton In Chapter 8, we have seen that the regresson estmator s an effcent estmator when there s a lnear relatonshp between y and x. In ths chapter, we generalzed the

More information

e - c o m p a n i o n

e - c o m p a n i o n OPERATIONS RESEARCH http://dxdoorg/0287/opre007ec e - c o m p a n o n ONLY AVAILABLE IN ELECTRONIC FORM 202 INFORMS Electronc Companon Generalzed Quantty Competton for Multple Products and Loss of Effcency

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

A NEW ALGORITHM FOR FINDING THE MINIMUM DISTANCE BETWEEN TWO CONVEX HULLS. Dougsoo Kaown, B.Sc., M.Sc. Dissertation Prepared for the Degree of

A NEW ALGORITHM FOR FINDING THE MINIMUM DISTANCE BETWEEN TWO CONVEX HULLS. Dougsoo Kaown, B.Sc., M.Sc. Dissertation Prepared for the Degree of A NEW ALGORITHM FOR FINDING THE MINIMUM DISTANCE BETWEEN TWO CONVEX HULLS Dougsoo Kaown, B.Sc., M.Sc. Dssertaton Prepared for the Degree of DOCTOR OF PHILOSOPHY UNIVERSITY OF NORTH TEXAS May 2009 APPROVED:

More information

Lecture 20: November 7

Lecture 20: November 7 0-725/36-725: Convex Optmzaton Fall 205 Lecturer: Ryan Tbshran Lecture 20: November 7 Scrbes: Varsha Chnnaobreddy, Joon Sk Km, Lngyao Zhang Note: LaTeX template courtesy of UC Berkeley EECS dept. Dsclamer:

More information

Lecture 21: Numerical methods for pricing American type derivatives

Lecture 21: Numerical methods for pricing American type derivatives Lecture 21: Numercal methods for prcng Amercan type dervatves Xaoguang Wang STAT 598W Aprl 10th, 2014 (STAT 598W) Lecture 21 1 / 26 Outlne 1 Fnte Dfference Method Explct Method Penalty Method (STAT 598W)

More information

Support Vector Machines. Vibhav Gogate The University of Texas at dallas

Support Vector Machines. Vibhav Gogate The University of Texas at dallas Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest

More information

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence

More information

Maximal Margin Classifier

Maximal Margin Classifier CS81B/Stat41B: Advanced Topcs n Learnng & Decson Makng Mamal Margn Classfer Lecturer: Mchael Jordan Scrbes: Jana van Greunen Corrected verson - /1/004 1 References/Recommended Readng 1.1 Webstes www.kernel-machnes.org

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Kristin P. Bennett. Rensselaer Polytechnic Institute

Kristin P. Bennett. Rensselaer Polytechnic Institute Support Vector Machnes and Other Kernel Methods Krstn P. Bennett Mathematcal Scences Department Rensselaer Polytechnc Insttute Support Vector Machnes (SVM) A methodology for nference based on Statstcal

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

A New Refinement of Jacobi Method for Solution of Linear System Equations AX=b

A New Refinement of Jacobi Method for Solution of Linear System Equations AX=b Int J Contemp Math Scences, Vol 3, 28, no 17, 819-827 A New Refnement of Jacob Method for Soluton of Lnear System Equatons AX=b F Naem Dafchah Department of Mathematcs, Faculty of Scences Unversty of Gulan,

More information

CME 302: NUMERICAL LINEAR ALGEBRA FALL 2005/06 LECTURE 13

CME 302: NUMERICAL LINEAR ALGEBRA FALL 2005/06 LECTURE 13 CME 30: NUMERICAL LINEAR ALGEBRA FALL 005/06 LECTURE 13 GENE H GOLUB 1 Iteratve Methods Very large problems (naturally sparse, from applcatons): teratve methods Structured matrces (even sometmes dense,

More information

On the Global Linear Convergence of the ADMM with Multi-Block Variables

On the Global Linear Convergence of the ADMM with Multi-Block Variables On the Global Lnear Convergence of the ADMM wth Mult-Block Varables Tany Ln Shqan Ma Shuzhong Zhang May 31, 01 Abstract The alternatng drecton method of multplers ADMM has been wdely used for solvng structured

More information

Georgia Tech PHYS 6124 Mathematical Methods of Physics I

Georgia Tech PHYS 6124 Mathematical Methods of Physics I Georga Tech PHYS 624 Mathematcal Methods of Physcs I Instructor: Predrag Cvtanovć Fall semester 202 Homework Set #7 due October 30 202 == show all your work for maxmum credt == put labels ttle legends

More information

Bézier curves. Michael S. Floater. September 10, These notes provide an introduction to Bézier curves. i=0

Bézier curves. Michael S. Floater. September 10, These notes provide an introduction to Bézier curves. i=0 Bézer curves Mchael S. Floater September 1, 215 These notes provde an ntroducton to Bézer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of

More information

6) Derivatives, gradients and Hessian matrices

6) Derivatives, gradients and Hessian matrices 30C00300 Mathematcal Methods for Economsts (6 cr) 6) Dervatves, gradents and Hessan matrces Smon & Blume chapters: 14, 15 Sldes by: Tmo Kuosmanen 1 Outlne Defnton of dervatve functon Dervatve notatons

More information

Lecture 6: Support Vector Machines

Lecture 6: Support Vector Machines Lecture 6: Support Vector Machnes Marna Melă mmp@stat.washngton.edu Department of Statstcs Unversty of Washngton November, 2018 Lnear SVM s The margn and the expected classfcaton error Maxmum Margn Lnear

More information

CALCULUS CLASSROOM CAPSULES

CALCULUS CLASSROOM CAPSULES CALCULUS CLASSROOM CAPSULES SESSION S86 Dr. Sham Alfred Rartan Valley Communty College salfred@rartanval.edu 38th AMATYC Annual Conference Jacksonvlle, Florda November 8-, 202 2 Calculus Classroom Capsules

More information

Another converse of Jensen s inequality

Another converse of Jensen s inequality Another converse of Jensen s nequalty Slavko Smc Abstract. We gve the best possble global bounds for a form of dscrete Jensen s nequalty. By some examples ts frutfulness s shown. 1. Introducton Throughout

More information

Perfect Competition and the Nash Bargaining Solution

Perfect Competition and the Nash Bargaining Solution Perfect Competton and the Nash Barganng Soluton Renhard John Department of Economcs Unversty of Bonn Adenauerallee 24-42 53113 Bonn, Germany emal: rohn@un-bonn.de May 2005 Abstract For a lnear exchange

More information

ON A DETERMINATION OF THE INITIAL FUNCTIONS FROM THE OBSERVED VALUES OF THE BOUNDARY FUNCTIONS FOR THE SECOND-ORDER HYPERBOLIC EQUATION

ON A DETERMINATION OF THE INITIAL FUNCTIONS FROM THE OBSERVED VALUES OF THE BOUNDARY FUNCTIONS FOR THE SECOND-ORDER HYPERBOLIC EQUATION Advanced Mathematcal Models & Applcatons Vol.3, No.3, 2018, pp.215-222 ON A DETERMINATION OF THE INITIAL FUNCTIONS FROM THE OBSERVED VALUES OF THE BOUNDARY FUNCTIONS FOR THE SECOND-ORDER HYPERBOLIC EUATION

More information

form, and they present results of tests comparng the new algorthms wth other methods. Recently, Olschowka & Neumaer [7] ntroduced another dea for choo

form, and they present results of tests comparng the new algorthms wth other methods. Recently, Olschowka & Neumaer [7] ntroduced another dea for choo Scalng and structural condton numbers Arnold Neumaer Insttut fur Mathematk, Unverstat Wen Strudlhofgasse 4, A-1090 Wen, Austra emal: neum@cma.unve.ac.at revsed, August 1996 Abstract. We ntroduce structural

More information

Approximate D-optimal designs of experiments on the convex hull of a finite set of information matrices

Approximate D-optimal designs of experiments on the convex hull of a finite set of information matrices Approxmate D-optmal desgns of experments on the convex hull of a fnte set of nformaton matrces Radoslav Harman, Mára Trnovská Department of Appled Mathematcs and Statstcs Faculty of Mathematcs, Physcs

More information

FIRST AND SECOND ORDER NECESSARY OPTIMALITY CONDITIONS FOR DISCRETE OPTIMAL CONTROL PROBLEMS

FIRST AND SECOND ORDER NECESSARY OPTIMALITY CONDITIONS FOR DISCRETE OPTIMAL CONTROL PROBLEMS Yugoslav Journal of Operatons Research 6 (6), umber, 53-6 FIRST D SECOD ORDER ECESSRY OPTIMLITY CODITIOS FOR DISCRETE OPTIML COTROL PROBLEMS Boban MRIKOVIĆ Faculty of Mnng and Geology, Unversty of Belgrade

More information

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran

More information

Deriving the X-Z Identity from Auxiliary Space Method

Deriving the X-Z Identity from Auxiliary Space Method Dervng the X-Z Identty from Auxlary Space Method Long Chen Department of Mathematcs, Unversty of Calforna at Irvne, Irvne, CA 92697 chenlong@math.uc.edu 1 Iteratve Methods In ths paper we dscuss teratve

More information

Linear Feature Engineering 11

Linear Feature Engineering 11 Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19

More information

Additional Codes using Finite Difference Method. 1 HJB Equation for Consumption-Saving Problem Without Uncertainty

Additional Codes using Finite Difference Method. 1 HJB Equation for Consumption-Saving Problem Without Uncertainty Addtonal Codes usng Fnte Dfference Method Benamn Moll 1 HJB Equaton for Consumpton-Savng Problem Wthout Uncertanty Before consderng the case wth stochastc ncome n http://www.prnceton.edu/~moll/ HACTproect/HACT_Numercal_Appendx.pdf,

More information

A Local Variational Problem of Second Order for a Class of Optimal Control Problems with Nonsmooth Objective Function

A Local Variational Problem of Second Order for a Class of Optimal Control Problems with Nonsmooth Objective Function A Local Varatonal Problem of Second Order for a Class of Optmal Control Problems wth Nonsmooth Objectve Functon Alexander P. Afanasev Insttute for Informaton Transmsson Problems, Russan Academy of Scences,

More information

10-701/ Machine Learning, Fall 2005 Homework 3

10-701/ Machine Learning, Fall 2005 Homework 3 10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40

More information

CSCE 790S Background Results

CSCE 790S Background Results CSCE 790S Background Results Stephen A. Fenner September 8, 011 Abstract These results are background to the course CSCE 790S/CSCE 790B, Quantum Computaton and Informaton (Sprng 007 and Fall 011). Each

More information

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0 Bezer curves Mchael S. Floater August 25, 211 These notes provde an ntroducton to Bezer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of the

More information

REAL ANALYSIS I HOMEWORK 1

REAL ANALYSIS I HOMEWORK 1 REAL ANALYSIS I HOMEWORK CİHAN BAHRAN The questons are from Tao s text. Exercse 0.0.. If (x α ) α A s a collecton of numbers x α [0, + ] such that x α

More information

General viscosity iterative method for a sequence of quasi-nonexpansive mappings

General viscosity iterative method for a sequence of quasi-nonexpansive mappings Avalable onlne at www.tjnsa.com J. Nonlnear Sc. Appl. 9 (2016), 5672 5682 Research Artcle General vscosty teratve method for a sequence of quas-nonexpansve mappngs Cuje Zhang, Ynan Wang College of Scence,

More information