Interior Point Methods


1 Agenda
Interior Point Methods
1 Barrier functions
2 Analytic center
3 Central path
4 Barrier method
5 Primal-dual path-following algorithms
6 Nesterov-Todd scaling
7 Complexity analysis

2 Interior point methods

Primal (P): minimize c^T x subject to Gx + s = h, Ax = b, s ⪰_K 0
Dual (D): maximize −h^T z − b^T y subject to A^T y + G^T z + c = 0, z ⪰_{K*} 0

WLOG (why?), work with [G; A] of full column rank.

Interior point methods (IPMs): maintain primal and dual strict feasibility while working toward complementary slackness:
(x_k, s_k) primal feasible with s_k ≻ 0
(y_k, z_k) dual feasible with z_k ≻ 0
⟨z_k, s_k⟩ → 0
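
A concrete instance (our illustration, not on the slides): the standard-form LP minimize c^T x subject to Ax = b, x ≥ 0 fits this template with G = −I, h = 0 and K = K* = R^n_+, so the slack is s = x and the conic constraint s ⪰_K 0 is just x ≥ 0.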

3 Main idea

minimize t c^T x + ϕ(s)
subject to [G I; A 0][x; s] = [h; b]

ϕ is a barrier function defined on int(K) with the following properties:
strictly convex
analytic
self-concordant
blows up as s approaches the boundary of K

For each t, there is a unique minimizer (x(t), s(t)) [requires a tiny bit of thought]
Limit points as t → ∞ are primal-optimal solutions
The smooth curve (x(t), s(t)) is usually called the central path

4 Canonical cones and canonical barriers

1. K = R^n_+:
ϕ(x) = −∑_{i=1}^n log x_i
∇ϕ(x) = −(1/x_1, …, 1/x_n)
∇²ϕ(x) = diag(1/x_1², …, 1/x_n²)

2. K = L = {(x_1, x_2, …, x_n) : ‖(x_1, …, x_{n−1})‖_2 ≤ x_n}:
ϕ(x) = −(1/2) log(x^T J x), where J = [−I 0; 0 1]
∇ϕ(x) = −Jx/(x^T J x)
∇²ϕ(x) = −J/(x^T J x) + 2(Jx)(Jx)^T/(x^T J x)²
Note: [∇²ϕ(x)]^{−1} = 2xx^T − (x^T J x) J

5 3. K = S^n_+:
ϕ(X) = −log det X
∇ϕ(X) = −X^{−1}
∇²ϕ(X)[H] = X^{−1} H X^{−1}
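
As a quick numerical sanity check of the Lorentz-cone formulas above (our sketch, not from the slides; all names are ours), the following builds ∇²ϕ(x) and verifies the stated closed-form inverse [∇²ϕ(x)]^{−1} = 2xx^T − (x^T J x) J:

import numpy as np

n = 5
J = np.diag(np.concatenate([-np.ones(n - 1), [1.0]]))     # J = diag(-I, 1)

def hess(x):                       # ∇²ϕ(x) for ϕ(x) = -(1/2) log(x^T J x)
    q = x @ J @ x                  # x^T J x > 0 strictly inside the cone
    Jx = J @ x
    return -J / q + 2.0 * np.outer(Jx, Jx) / q**2

rng = np.random.default_rng(0)
xbar = rng.standard_normal(n - 1)
x = np.concatenate([xbar, [np.linalg.norm(xbar) + 1.0]])  # strictly inside: ||x̄|| < x_n

Hinv = 2.0 * np.outer(x, x) - (x @ J @ x) * J             # claimed closed-form inverse
print(np.allclose(hess(x) @ Hinv, np.eye(n)))             # True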

6 Self-concordance

The barriers (1)-(3) are strictly convex and self-concordant.
Implication of self-concordance: Newton's method is extremely effective at minimizing smooth, convex, self-concordant objectives.

7 Barrier function for composite cones

x = (x_1, …, x_m), x_i ∈ K_i
Product cone K = K_1 × ⋯ × K_m
Barrier: ϕ(x) = ∑_i ϕ_i(x_i); each ϕ_i self-concordant ⟹ ϕ self-concordant

8 Properties of barrier functions: generalized logarithm

(i) ϕ(tx) = ϕ(x) − θ(ϕ) log t, for t > 0
θ(ϕ) = n for R^n_+; θ(ϕ) = 1 for L; θ(ϕ) = n for S^n_+
Further properties following from (i):
(ii) ⟨∇ϕ(x), x⟩ = −θ(ϕ)
(iii) [∇²ϕ(x)] x = −∇ϕ(x)
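
A quick numerical check of (i)-(iii) for ϕ(x) = −∑ log x_i on R^n_+ (our sketch; names are ours):

import numpy as np

rng = np.random.default_rng(1)
n = 4
x = rng.random(n) + 0.1
t = 2.5
theta = n                                   # θ(ϕ) = n for R^n_+

phi = lambda v: -np.sum(np.log(v))
grad = -1.0 / x                             # ∇ϕ(x)
hess = np.diag(1.0 / x**2)                  # ∇²ϕ(x)

print(np.isclose(phi(t * x), phi(x) - theta * np.log(t)))   # (i)
print(np.isclose(grad @ x, -theta))                         # (ii)
print(np.allclose(hess @ x, -grad))                         # (iii)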

9 Barriers are self-dual

K: a cone product in which each component is an LP, SOCP or SDP cone (so K* = K)
For every x ∈ int(K), −∇ϕ(x) ∈ int(K)
The mapping x ↦ −∇ϕ(x) from int(K) to int(K) is self-inverse and homogeneous of degree −1:
−∇ϕ(−∇ϕ(x)) = x
∇ϕ(tx) = t^{−1} ∇ϕ(x) for x ∈ int(K), t > 0

10 Analytic center

minimize ϕ(s)
subject to [G I; A 0][x; s] = [h; b]

Convex program
Solution is strictly feasible
Unique solution (x*, s*)

11 Computing the analytic center

Newton's method + line search, applied to
(P') minimize f(x) (convex) subject to Ax = b

Pure Newton's method: sequence {x_k}, k = 0, 1, 2, …
Input: x_0 feasible
Repeat
x_{k+1} = argmin_{Ax=b} [ f(x_k) + ⟨∇f(x_k), x − x_k⟩ + (1/2)(x − x_k)^T [∇²f(x_k)](x − x_k) ]
until convergence

12 With v = x_{k+1} − x_k, this boils down to

minimize ⟨∇f(x_k), v⟩ + (1/2) v^T ∇²f(x_k) v
subject to Av = 0

Optimality conditions:
∇f(x_k) + ∇²f(x_k) v + A^T λ = 0, Av = 0
or in matrix form
[∇²f(x_k) A^T; A 0][v; λ] = [−∇f(x_k); 0]

13 Problem: the full Newton step can leave the feasible set (i.e., the domain of f; the equality constraints are preserved since Av = 0)

Solution: line search: x_{k+1} = x_k + t v
Exact line search: t̂ = argmin_t f(x_k + tv)
Backtracking line search: fix 0 < α, β < 1, initialize t = 1;
while f(x + tv) > f(x) + α t ⟨∇f(x), v⟩ do t = βt
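
A minimal NumPy sketch of these three slides (ours, not from the lecture; newton_eq and the other names are ours, and f is assumed to return np.inf outside its domain so the backtracking line search stays feasible):

import numpy as np

def newton_eq(f, grad_f, hess_f, A, x0, alpha=0.1, beta=0.5, tol=1e-10, iters=100):
    # Equality-constrained Newton with backtracking; x0 must satisfy A @ x0 = b.
    x = x0.copy()
    p, n = A.shape
    for _ in range(iters):
        g, H = grad_f(x), hess_f(x)
        K = np.block([[H, A.T], [A, np.zeros((p, p))]])          # KKT matrix
        v = np.linalg.solve(K, np.concatenate([-g, np.zeros(p)]))[:n]
        if -(g @ v) / 2.0 < tol:                                 # Newton decrement λ(x)²/2
            break
        t = 1.0
        while f(x + t * v) > f(x) + alpha * t * (g @ v):         # backtracking
            t *= beta
        x = x + t * v
    return x

# Example: analytic center of {x : Ax = b, x > 0}
A = np.array([[1.0, 1.0, 1.0]])                                  # b = [1]
f = lambda x: -np.sum(np.log(x)) if np.all(x > 0) else np.inf
grad_f = lambda x: -1.0 / x
hess_f = lambda x: np.diag(1.0 / x**2)
print(newton_eq(f, grad_f, hess_f, A, np.array([0.2, 0.3, 0.5])))  # ≈ [1/3, 1/3, 1/3]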

14 Complexity analysis

f: convex and self-concordant
Repeat until convergence:
(1) Compute the Newton direction v
(2) Compute t̂ from the line search
(3) Update x = x + t̂ v

Theorem. Assume ɛ < 1/2. Then f(x_k) − f* ≤ ɛ provided
k ≥ (f(x_0) − f*)/γ + log₂ log₂(1/ɛ),
where γ > 0 is a constant depending only on the line-search parameters α, β.
For practical purposes, log₂ log₂(1/ɛ) is constant, e.g. 5.

15 This lecture: K* = K (symmetry)

(P) minimize c^T x subject to Gx + s = h, Ax = b, s ⪰ 0
(D) maximize −h^T z − b^T y subject to A^T y + G^T z + c = 0, z ⪰ 0 (same cone)

16 Central path

minimize ⟨c, x⟩ + t^{−1} ϕ(s)
subject to Gx + s = h, Ax = b

Optimality conditions: (x, s) feasible with s ≻ 0, and there exist (y, z) with
c + A^T y + G^T z = 0
t^{−1} ∇ϕ(s) = −z
Important consequence: (y, z) is dual feasible with z ≻ 0

Optimality conditions (t = ∞):
(x, s) primal feasible, (y, z) dual feasible, (s, z) ⪰ 0
Complementary slackness: ⟨s, z⟩ = 0

Central path:
(x, s) primal feasible, (y, z) dual feasible, (s, z) ≻ 0
Relaxed complementary slackness: ⟨s, z⟩ = θ(K)/t

17 Dual central path

(D) maximize −h^T z − b^T y subject to A^T y + G^T z + c = 0, z ⪰ 0
(Dual CP) minimize h^T z + b^T y + t^{−1} ϕ(z) subject to A^T y + G^T z + c = 0

Theorem. The primal and dual central paths are linked via
z*(t) = −t^{−1} ∇ϕ(s*(t)), s*(t) = −t^{−1} ∇ϕ(z*(t))
There is only one central path (s(t), z(t)), and ⟨s(t), z(t)⟩ = θ(K)/t

18 Proof

(x, s) on the central path ⟺ Gx + s = h, Ax = b, s ≻ 0, and there exist (y, z) with
A^T y + G^T z + c = 0 and t^{−1} ∇ϕ(s) = −z

Dual central path: minimize h^T z + b^T y + t^{−1} ϕ(z) subject to A^T y + G^T z + c = 0
Lagrangian: h^T z + b^T y + (1/t) ϕ(z) − x^T (A^T y + G^T z + c)
Optimality conditions: (y, z) on the dual central path ⟺ A^T y + G^T z + c = 0, z ≻ 0, and there exists x with b − Ax = 0 and s := h − Gx = −t^{−1} ∇ϕ(z)

Unique central path, since by self-duality of the barrier
z = −t^{−1} ∇ϕ(s) ⟺ s = −t^{−1} ∇ϕ(z)

19 Finally, along the central path,
⟨s, z⟩ = ⟨s, −t^{−1} ∇ϕ(s)⟩ = θ(K)/t,
using property (ii) of the generalized logarithm.

20 Characterization of the central path

(CP 1) s*(t) strictly feasible
(CP 2) z*(t) strictly feasible
(CP 3) augmented complementary slackness: z*(t) = −t^{−1} ∇ϕ(s*(t))

In the case of SDP:
t Z*(t) = [S*(t)]^{−1} ⟹ Z*(t) S*(t) = t^{−1} Id ⟹ trace(Z*(t) S*(t)) = t^{−1} n

(CP 1)-(CP 2)-(CP 3) fully characterize the central path

21 Duality gap along the central path

c^T x + b^T y + h^T z = −y^T Ax − z^T Gx + b^T y + h^T z = (h − Gx)^T z = s^T z = θ(K)/t

Proposition. The duality gap along the central path is t^{−1} θ(K). In particular,
c^T x(t) − p* ≤ θ(K)/t and d* + b^T y(t) + h^T z(t) ≤ θ(K)/t
Therefore, as t → ∞:
(x(t), s(t)) → optimal primal solution
(y(t), z(t)) → optimal dual solution

22 Path-following algorithm

Start with t = t_0 and (x*(t_0), s*(t_0))
Increase t to t_1 > t_0 and compute (x(t_1), s(t_1)) using Newton's method with (x*(t_0), s*(t_0)) as the initial guess
Few Newton iterations are needed, because we may already be inside the region of quadratic convergence

23 Barrier method

minimize c^T x subject to Gx + s = h, Ax = b, s ⪰ 0

Given strictly feasible (x, s), t = t_0, µ > 1 and tol > 0, repeat:
1. Centering step: minimize t c^T x + ϕ(s) subject to Gx + s = h, Ax = b
2. Update (x, s) = (x(t), s(t))
3. Quit if θ(K)/t < tol
4. Increase t = µt
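
A sketch of this loop specialized to K = R^m_+ (so ϕ(s) = −∑ log s_i and θ(K) = m), reusing newton_eq from the earlier sketch; barrier_method_lp and the example data are ours, not from the slides:

import numpy as np

def barrier_method_lp(c, G, h, A, b, x0, t0=1.0, mu=10.0, tol=1e-8):
    # x0 must be strictly feasible: A @ x0 = b, h - G @ x0 > 0.
    x, t = x0.copy(), t0
    m = G.shape[0]
    while m / t >= tol:                       # duality-gap bound θ(K)/t
        f = lambda v, t=t: (t * (c @ v) - np.sum(np.log(h - G @ v))
                            if np.all(h - G @ v > 0) else np.inf)
        grad_f = lambda v, t=t: t * c + G.T @ (1.0 / (h - G @ v))
        hess_f = lambda v: G.T @ np.diag(1.0 / (h - G @ v) ** 2) @ G
        x = newton_eq(f, grad_f, hess_f, A, x)   # centering step
        t *= mu                               # increase the barrier parameter
    return x

# Example: minimize x_1 subject to x ≥ 0 (G = -I, h = 0) and 1^T x = 1
n = 3
c = np.array([1.0, 0.0, 0.0])
G, h = -np.eye(n), np.zeros(n)
A, b = np.ones((1, n)), np.array([1.0])
print(barrier_method_lp(c, G, h, A, b, np.full(n, 1.0 / 3)))   # x_1 → 0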

24 Primal-dual path-following methods

Closely related to barrier methods
Follow the central path to find approximate solutions
Steps are computed by linearizing the central path equations

Central path:
Gx + s = h
Ax = b
A^T y + G^T z + c = 0
(s, z) ≻ 0, z = −(1/t) ∇ϕ(s)
e.g. SDP: G_t(S, Z) := Z − (1/t) S^{−1} = 0

25 Main idea: from (t, s, z), update into (t₊, s₊, z₊)

(i) Rewrite the last central-path condition as an equivalent system Ḡ_t(s, z) = 0
(ii) Choose t₊ > t and linearize the equation:
Ḡ_{t₊}(s + Δs, z + Δz) ≈ Ḡ_{t₊}(s, z) + D_s Ḡ_{t₊} Δs + D_z Ḡ_{t₊} Δz = 0

26 Suppose the current guess is feasible

(iii) Solve the system
G Δx + Δs = 0
A Δx = 0
A^T Δy + G^T Δz = 0
D_s Ḡ_{t₊} Δs + D_z Ḡ_{t₊} Δz = −Ḡ_{t₊}(s, z)
and update
s₊ = s + α Δs
z₊ = z + β Δz

27 Symmetrization: how do we construct the system Ḡ_t(s, z) = 0?

SDP: Z = (1/t) S^{−1} ⟺ ZS = (1/t) Id ⟺ SZ = (1/t) Id
Popular approach: make the system symmetric in S and Z:
(1/2)(SZ + ZS) = (1/t) Id
Fact [requires some thought]: for (S, Z) ≻ 0,
(1/t) S^{−1} = Z ⟺ (1/2)(SZ + ZS) = (1/t) Id
Leads to the Alizadeh-Haeberly-Overton search direction and the SZ + ZS primal-dual path-following method

28 Other symmetrizations

LP: with (s, z) ≻ 0 and s ∘ z = (s_i z_i)_{i=1,…,m},
z = −(1/t) ∇ϕ(s) ⟺ s ∘ z = (1/t) 1

29 SOCP: K = {x = (x̄, x_n) ∈ R^n : ‖x̄‖ ≤ x_n}

ϕ(x) = −(1/2) log D_x, where D_x = x_n² − ‖x̄‖²
∇ϕ(x) = (1/D_x) [x̄; −x_n]

Then
z = −(1/t) ∇ϕ(s) ⟺ { t z̄ = −(1/D_s) s̄, t z_n = (1/D_s) s_n } ⟺ { z̄ s_n + z_n s̄ = 0, t z_n = (1/D_s) s_n }
where the second equivalence follows from 1/D_s = t z_n / s_n. Since
t ⟨z̄, s̄⟩ + t z_n s_n = (s_n² − ‖s̄‖²)/D_s = 1,
we have
z = −(1/t) ∇ϕ(s) ⟺ { z̄ s_n + z_n s̄ = 0, ⟨s̄, z̄⟩ + s_n z_n = 1/t }
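
A quick numerical check of the two SOCP relations just derived (our sketch; names are ours):

import numpy as np

rng = np.random.default_rng(4)
n, t = 5, 3.0
sbar = rng.standard_normal(n - 1)
s = np.concatenate([sbar, [np.linalg.norm(sbar) + 0.7]])   # strictly inside the cone
D = s[-1] ** 2 - sbar @ sbar                               # D_s = s_n² - ||s̄||²
z = np.concatenate([-sbar, [s[-1]]]) / (t * D)             # z = -(1/t) ∇ϕ(s)
print(np.allclose(z[:-1] * s[-1] + z[-1] * sbar, 0.0))     # z̄ s_n + z_n s̄ = 0
print(np.isclose(s @ z, 1.0 / t))                          # ⟨s, z⟩ = 1/t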

30 Scaling

Idea for SDP: for nonsingular Q,
Z = (1/t) S^{−1} ⟺ SZ = (1/t) Id ⟺ Q S Z Q^{−1} = (1/t) Id
⟺ ZS = (1/t) Id ⟺ Q^{−1} Z S Q = (1/t) Id
⟹ (1/2)[Q S Z Q^{−1} + Q^{−1} Z S Q] = (1/t) Id
Complete freedom in choosing Q
Q can vary from one iteration to the next

31 Change of coordinates

(1/2)[(QSQ)(Q^{−1}ZQ^{−1}) + (Q^{−1}ZQ^{−1})(QSQ)] = (1/t) Id
Change of coordinates (Q = Q^T ≻ 0):
S̃ = Q S Q ≻ 0
Z̃ = Q^{−1} Z Q^{−1} ≻ 0
Preserves the positive definite cone
Preserves the central path
The convergence analysis is simplified considerably when, at each iteration, Q is chosen such that S̃ and Z̃ commute, where (S, Z) are the iterates to be updated

32 Nesterov-Todd scaling: S̃ = Z̃

General scaling: (S, Z) ≻ 0, G_t(S, Z) = 0
W is a scaling matrix if multiplications with W and W^{−T} preserve the cone and preserve the central path:
(S, Z) on CP ⟺ (W S, W^{−T} Z) on CP

Example K = S^n_+, nonsingular Q:
W(S) = Q S Q^T, W^T(S) = Q^T S Q, W^{−T}(S) = Q^{−T} S Q^{−1}
W is a scaling matrix: it preserves the cone and the central path
Positive scaling Q ≻ 0: W(S) = Q S Q, W^{−T}(Z) = Q^{−1} Z Q^{−1}

33 Nesterov-Todd scaling

Used in SeDuMi and SDPT3
W associated with (Ŝ, Ẑ) such that W^{−T} Ŝ = W Ẑ = λ, which implies ⟨Ŝ, Ẑ⟩ = ‖λ‖²
W^T W = [∇²ϕ(w)]^{−1}, where w is the unique point obeying ∇²ϕ(w) Ŝ = Ẑ

34 NT scaling for S^n_+

Positive scaling Q ≻ 0: W(S) = Q S Q, W^{−T}(Z) = Q^{−1} Z Q^{−1}, W^T W(S) = Q² S Q²
Since [∇²ϕ(P)] S = P^{−1} S P^{−1}, take Q = P^{1/2}, where P^{−1} Ŝ P^{−1} = Ẑ, i.e.
P = Ŝ^{1/2} (Ŝ^{1/2} Ẑ Ŝ^{1/2})^{−1/2} Ŝ^{1/2}
Can be computed by Cholesky or eigenvalue/SVD computations
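
A sketch (ours) computing the scaling point P from strictly feasible Ŝ, Ẑ and verifying P^{−1} Ŝ P^{−1} = Ẑ; we use an eigendecomposition square root, one of the routes mentioned above:

import numpy as np

def psd_sqrt(X):                      # symmetric PSD square root via eigendecomposition
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n)); S = M @ M.T + np.eye(n)   # Ŝ ≻ 0
M = rng.standard_normal((n, n)); Z = M @ M.T + np.eye(n)   # Ẑ ≻ 0

S12 = psd_sqrt(S)
P = S12 @ np.linalg.inv(psd_sqrt(S12 @ Z @ S12)) @ S12     # scaling point
print(np.allclose(np.linalg.solve(P, S) @ np.linalg.inv(P), Z))   # P⁻¹ Ŝ P⁻¹ = Ẑ: True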

35 NT scaling for R^n_+

Positive diagonal scaling: W = diag(w_i) with
(1/w_i) ŝ_i = w_i ẑ_i ⟹ w_i = √(ŝ_i/ẑ_i)
λ = W^{−1} ŝ = W ẑ = {√(ŝ_i ẑ_i)}_i
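
This case is small enough to check in two lines (our sketch; names are ours):

import numpy as np

rng = np.random.default_rng(3)
s = rng.random(5) + 0.1
z = rng.random(5) + 0.1
w = np.sqrt(s / z)                 # W = diag(w)
lam = np.sqrt(s * z)
print(np.allclose(s / w, lam), np.allclose(w * z, lam))    # W⁻¹ŝ = Wẑ = λ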

36 NT scaling for the Lorentz cone: Ben-Tal and Nemirovski, Chapter 6.8

W = β (2vv^T − J)
v = (w + e_n) / √(2(w_n + 1))
w = (1/(2γ)) [ ŝ/√(ŝ^T J ŝ) + J ẑ/√(ẑ^T J ẑ) ]
γ = [ (1 + ŝ^T ẑ / √(ŝ^T J ŝ · ẑ^T J ẑ)) / 2 ]^{1/2}
β = [ ŝ^T J ŝ / ẑ^T J ẑ ]^{1/4}

37 Basic primal-dual update

Current iterates (x̂, ŝ) and (ŷ, ẑ) with ŝ ≻ 0, ẑ ≻ 0
1. Set t such that ŝ^T ẑ = (1/t) θ(K), and compute the scaling W for (ŝ, ẑ)
2. Choose t₊ = µt (µ > 1)
3. Solve the KKT system obtained by linearizing the CP equations
Gx + s = h
Ax = b
A^T y + G^T z + c = 0
Ḡ_{t₊}(W^{−T} s, W z) = 0
around (ŝ, ẑ)
4. Update
(s₊, x₊) = (ŝ, x̂) + α_p (Δs, Δx)
(z₊, y₊) = (ẑ, ŷ) + α_d (Δz, Δy)
such that positivity is preserved: s₊, z₊ ≻ 0

38 Linearized CP equations

Residual: r := (Gx̂ + ŝ − h, Ax̂ − b, A^T ŷ + G^T ẑ + c) (r = 0 if strictly feasible)
Linear system:
(G Δx + Δs, A Δx, A^T Δy + G^T Δz) = −r
and
Ḡ_{t₊}(W^{−T} ŝ, W ẑ) + DḠ_{t₊} [W^{−T} Δs + W Δz] = 0
where the scaling obeys W^{−T} ŝ = W ẑ = λ

39 The linearized equation

LP: Ḡ_t = s ∘ z − (1/t) e, with s ∘ z = (s_i z_i)_i and e = 1
SDP: Ḡ_t = (1/2)[SZ + ZS] − (1/t) Id = s ∘ z − (1/t) e, with s ∘ z := (1/2)[SZ + ZS] and e = Id
SOCP: s ∘ z := (s_n z̄ + z_n s̄, s^T z) and e = (0, …, 0, 1), so s ∘ z = (1/t) e recovers the two conditions derived earlier

Ḡ_{t₊}(λ, λ) = λ ∘ λ − (1/t₊) e and DḠ_{t₊}[W^{−T} Δs + W Δz] = λ ∘ [W^{−T} Δs + W Δz], so the linearized equation reads
λ ∘ [W^{−T} Δs + W Δz] = (1/t₊) e − λ ∘ λ
(For LP this is z ∘ Δs + s ∘ Δz = (1/t₊) 1 − s ∘ z, the familiar linearization of s_i z_i = 1/t₊.)

40 Path-following algorithm

Choose starting points ŝ ≻ 0, ẑ ≻ 0, x̂ and ŷ
1. Compute residuals and evaluate the stopping criteria:
r = (Gx̂ + ŝ − h, Ax̂ − b, A^T ŷ + G^T ẑ + c)
Terminate if r and ŝ^T ẑ are sufficiently small
2. Compute the scaling matrix W:
λ = W^{−T} ŝ = W ẑ, 1/t := ŝ^T ẑ / θ(K)

41 3. Compute the affine scaling directions: solve
(G Δx_a + Δs_a, A Δx_a, A^T Δy_a + G^T Δz_a) = −r
λ ∘ [W Δz_a + W^{−T} Δs_a] = −λ ∘ λ
4. Select the barrier parameter:
σ = [ (ŝ + α_p Δs_a)^T (ẑ + α_d Δz_a) / (ŝ^T ẑ) ]^δ, t₊ = t/σ
δ: algorithm parameter (typical value is δ = 3)
α_p = sup {α ∈ [0, 1] : ŝ + α Δs_a ⪰ 0}
α_d = sup {α ∈ [0, 1] : ẑ + α Δz_a ⪰ 0}

42 5. Compute the search direction: solve
(G Δx + Δs, A Δx, A^T Δy + G^T Δz) = −r
λ ∘ [W Δz + W^{−T} Δs] = (1/t₊) e − λ ∘ λ
6. Update the iterates:
(x̂, ŝ) = (x̂, ŝ) + min{1, 0.99 α_p} (Δx, Δs)
(ŷ, ẑ) = (ŷ, ẑ) + min{1, 0.99 α_d} (Δy, Δz)
α_p = sup {α ≥ 0 : ŝ + α Δs ⪰ 0}
α_d = sup {α ≥ 0 : ẑ + α Δz ⪰ 0}
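
The step-to-the-boundary quantities α_p, α_d are easy to compute explicitly for K = R^m_+; a sketch (ours, names ours):

import numpy as np

def max_step(s, ds):
    # sup{α ≥ 0 : s + α ds ≥ 0} for the nonnegative orthant
    neg = ds < 0
    return np.inf if not neg.any() else np.min(-s[neg] / ds[neg])

s = np.array([1.0, 2.0, 0.5])
ds = np.array([-2.0, 1.0, -0.25])
a = max_step(s, ds)
print(a, np.all(s + 0.99 * min(1.0, a) * ds > 0))          # 0.5 True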

43 Interpretation

Step 3: the affine scaling directions solve the linearized CP equations with the (1/t₊) e term set to zero
Step 4: a heuristic for updating t → t₊ based on an estimate of the quality of the affine scaling direction; σ is small if a step in the affine scaling direction yields a large reduction in ŝ^T ẑ
Step 5: the system has the same coefficient matrix as in step 3
With a direct method we solve two systems for the cost of one (i.e., we can reuse the matrix factorization)

44 Mehrotra correction

Step 5: solve the same system but with the right-hand side of the linearized equation replaced by
(1/t₊) e − λ ∘ λ − [W^{−T} Δs_a] ∘ [W Δz_a]
The extra term is an approximation of the second-order term in
W^{−T}(ŝ + Δs) ∘ W(ẑ + Δz) = (1/t₊) e
Typically saves a few iterations

45 Newton equations

Eliminating Δs reduces the system to
[0 A^T G^T; A 0 0; G 0 −W^T W] [Δx; Δy; Δz] = RHS
Eliminating Δz:
[G^T W^{−1} W^{−T} G A^T; A 0] [Δx; Δy] = RHS
Because W^T W = [∇²ϕ(w)]^{−1} (NT scaling),
G^T W^{−1} W^{−T} G = G^T ∇²ϕ(w) G,
the Hessian of the barrier ϕ(h − Gx) evaluated at the scaling point w

46 Complexity analysis: SDP

Short-step path-following methods based on commutative scalings (e.g. NT)
Neighborhood of the central path:
(t̂, Ŝ, Ẑ) ∈ N_{0.1} ⟺ ‖t̂ Ŝ^{1/2} Ẑ Ŝ^{1/2} − I‖_F ≤ 0.1, with Ŝ, Ẑ strictly feasible
1. Choose a new value of t: t₊ = (1 − χ/√n)^{−1} t̂, χ: parameter
2. Solve the linearized CP equations with a commutative scaling

47 Key result

Theorem. If χ ≤ 0.1, then Ŝ₊, Ẑ₊ are strictly feasible, ⟨Ŝ₊, Ẑ₊⟩ = n/t₊, and (t̂₊, Ŝ₊, Ẑ₊) ∈ N_{0.1}

Same proximity to the central path
The centrality parameter grows by a factor 1 + O(1)/√n
Once we reach N_{0.1}, we trace the primal-dual central path by staying in N_{0.1} and increasing the parameter by an absolute constant factor every O(√n) steps

48 In general, t₊ = (1 + O(1)/√θ(K)) t

Once we have managed to get close to the central path, every O(√θ(K)) steps of the scheme improve the quality of the approximation by an absolute constant factor
In particular, it takes no more than
O(1) √θ(K) log(1 + θ(K)/(t_0 ɛ))
steps to generate a strictly feasible ɛ-solution

49 References

1. A. Ben-Tal and A. Nemirovski, Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications, MPS-SIAM Series on Optimization
2. S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press
3. L. Vandenberghe, EE236C lecture notes (Spring 2011), UCLA
