A Constraint-Reduced MPC Algorithm for Convex Quadratic Programming, with a Modified Active-Set Identification Scheme
1 A Constraint-Reduced MPC Algorithm for Convex Quadratic Programming, with a Modified Active-Set Identification Scheme

M. Paul Laiu^1 and (presenter) André L. Tits^2
^1 Oak Ridge National Laboratory, laiump@ornl.gov
^2 Department of ECE and ISR, University of Maryland, College Park, andre@umd.edu

ISMP, Bordeaux, July 1–6, 2018
2 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
3 Convex Quadratic Program (CQP)

(P)  minimize_{x ∈ R^n}  f(x) := (1/2) x^T H x + c^T x   subject to  Ax ≥ b

(D)  maximize_{x ∈ R^n, λ ∈ R^m}  −(1/2) x^T H x + b^T λ   subject to  Hx + c − A^T λ = 0,  λ ≥ 0

Here x ∈ R^n, c ∈ R^n, H = H^T ⪰ 0, λ ∈ R^m, A ∈ R^{m×n}, b ∈ R^m.

We are mostly interested in the case when most of the primal inequality constraints are inactive at the solution (e.g., m ≫ n).

(x, λ) solves (P)–(D) iff it satisfies the KKT system
  Hx − A^T λ + c = 0
  Ax − b − s = 0
  Sλ = 0
  s, λ ≥ 0,
where S := diag(s).
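As a concrete check of these optimality conditions, here is a minimal NumPy sketch (the function name and interface are ours, not from the talk) that evaluates the three KKT residuals for a candidate pair (x, λ):

```python
import numpy as np

def kkt_residuals(H, c, A, b, x, lam):
    """Residuals of the KKT system above: stationarity, primal feasibility
    via the slack s = Ax - b (feasibility needs s >= 0), and complementarity."""
    s = A @ x - b
    stationarity = H @ x - A.T @ lam + c
    complementarity = s * lam            # S*lambda, zero at a solution
    return stationarity, s, complementarity
```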
4 MPC for CQP

Given (x, λ), with s := Ax − b > 0 and λ > 0:

Compute the Newton (affine-scaling) search direction by solving
  [ H  −A^T   0 ] [ Δx^a ]      [ Hx − A^T λ + c ]
  [ A    0   −I ] [ Δλ^a ]  = − [       0        ]
  [ 0    S    Λ ] [ Δs^a ]      [       Sλ       ]
where Λ := diag(λ).

Set μ := s^T λ / m and σ := (1 − α_a)^3, with
  α_a := argmax{α ∈ [0, 1] : s + α Δs^a ≥ 0, λ + α Δλ^a ≥ 0}.

Compute a centering/corrector direction by solving
  [ H  −A^T   0 ] [ Δx^c ]   [         0         ]
  [ A    0   −I ] [ Δλ^c ] = [         0         ]
  [ 0    S    Λ ] [ Δs^c ]   [  σμ1 − ΔS^a Δλ^a  ]

Update:
  (x^+, λ^+) = (x, λ) + (α_p (Δx^a + Δx^c), α_d (Δλ^a + Δλ^c)),
with α_p, α_d ∈ (0, 1] such that s^+ := Ax^+ − b > 0 and λ^+ > 0.
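Put together, one iteration can be prototyped directly from the two 3×3 block systems above. A hedged NumPy sketch (dense linear algebra, no safeguards; the fraction-to-the-boundary factor tau is our choice, not from the slides):

```python
import numpy as np

def mpc_step(H, c, A, b, x, lam, tau=0.995):
    """One MPC iteration for (P), assuming s = Ax - b > 0 and lam > 0."""
    m, n = A.shape
    s = A @ x - b

    def max_step(v, dv):
        """Largest alpha in [0, 1] with v + alpha*dv >= 0 (given v > 0)."""
        neg = dv < 0
        return min(1.0, (-v[neg] / dv[neg]).min()) if neg.any() else 1.0

    K = np.block([
        [H,                -A.T,              np.zeros((n, m))],
        [A,                 np.zeros((m, m)), -np.eye(m)],
        [np.zeros((m, n)),  np.diag(s),        np.diag(lam)],
    ])
    # Affine-scaling (predictor) direction.
    rhs_a = -np.concatenate([H @ x - A.T @ lam + c, np.zeros(m), s * lam])
    d = np.linalg.solve(K, rhs_a)
    dx_a, dlam_a, ds_a = d[:n], d[n:n + m], d[n + m:]
    alpha_a = min(max_step(s, ds_a), max_step(lam, dlam_a))
    mu = s @ lam / m
    sigma = (1.0 - alpha_a) ** 3
    # Centering/corrector direction: same matrix, new right-hand side.
    rhs_c = np.concatenate([np.zeros(n), np.zeros(m),
                            sigma * mu * np.ones(m) - ds_a * dlam_a])
    d = np.linalg.solve(K, rhs_c)
    dx_c, dlam_c, ds_c = d[:n], d[n:n + m], d[n + m:]
    # Damped update keeping s^+ > 0 and lam^+ > 0.
    dx, dlam = dx_a + dx_c, dlam_a + dlam_c
    alpha_p = tau * max_step(s, A @ dx)
    alpha_d = tau * max_step(lam, dlam)
    return x + alpha_p * dx, lam + alpha_d * dlam
```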
5 Toward Constraint Reduction

By block Gaussian elimination, the Newton system becomes
  M Δx^a = −(Hx + c),   Δs^a = A Δx^a,   Δλ^a = −λ − S^{-1} Λ Δs^a,
where
  M := H + A^T S^{-1} Λ A = H + Σ_{i=1}^m (λ_i / s_i) a_i a_i^T,
and similarly (same M) for the computation of the centering/corrector direction.

When m > n, the main cost in computing the search direction is in forming M: approximately m n^2 / 2 multiplications per iteration.

If m ≫ n and we limit the sum to q appropriately selected terms, the cost per iteration will be reduced by a factor of m/q (!)
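The rank-one form of M is what makes the cost structure visible. A short sketch (function names are ours) of the eliminated computation:

```python
import numpy as np

def normal_matrix(H, A, s, lam):
    """M = H + A^T S^{-1} Lambda A, accumulated as m rank-one terms
    (lam_i / s_i) a_i a_i^T; forming M costs ~ m n^2 / 2 multiplications,
    the dominant per-iteration work when m >> n."""
    M = np.array(H, dtype=float)
    for a_i, s_i, l_i in zip(A, s, lam):
        M += (l_i / s_i) * np.outer(a_i, a_i)
    return M

def affine_direction(M, H, c, A, x, s, lam):
    """Affine-scaling direction from the block-eliminated system."""
    dx = np.linalg.solve(M, -(H @ x + c))
    ds = A @ dx
    dlam = -lam - (lam / s) * ds     # = -lambda - S^{-1} Lambda ds
    return dx, ds, dlam
```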
6 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
7 Constraint Reduction (CR) for CQP: Basic Ideas

Observation: For CQPs with m ≫ n, it is typical that most of the constraints are irrelevant or redundant.

Idea: At each iteration, try to guess a small subset Q of constraints to compute the search direction.

[Figure: a two-dimensional example (n = 2, m = 13) marking irrelevant, redundant, and active constraints; ignoring many constraints leaves a working set of size |Q| = 6.]
8 CR for CQP: Reduced Normal Matrix

At each iteration, select a working set Q of constraints and obtain an MPC search direction, from the current (x^k, λ^k), for the problem
  minimize_{x ∈ R^n}  (1/2) x^T H x + c^T x   subject to  A_Q x ≥ b_Q,
with A_Q a submatrix of A and b_Q a subvector of b.

Normal matrix for the reduced problem:
  M^(Q) := H + A_Q^T S_Q^{-1} Λ_Q A_Q = H + Σ_{i ∈ Q} (λ_i / s_i) a_i a_i^T

Cost of forming M^(Q) reduces from m n^2 / 2 to |Q| n^2 / 2 multiplications.

Step sizes are still computed using all m constraints. (But the CPU cost of this is small compared to that of computing the search direction.)
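Only the summation range changes relative to M. A minimal sketch (names ours):

```python
import numpy as np

def reduced_normal_matrix(H, A, s, lam, Q):
    """M^(Q) = H + A_Q^T S_Q^{-1} Lambda_Q A_Q: the same sum as for M,
    truncated to the |Q| working-set terms, so forming it costs
    ~ |Q| n^2 / 2 instead of ~ m n^2 / 2 multiplications."""
    AQ = A[Q]                                # rows indexed by the working set
    return H + AQ.T @ ((lam[Q] / s[Q])[:, None] * AQ)
```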
9–12 CR-MPC for CQP

At each iteration: Given (x, λ), with s := Ax − b > 0 and λ > 0, select working set Q.

Compute the Newton (affine-scaling) search direction by solving
  M^(Q) Δx^a = −(Hx + c),
and set Δs^a = A Δx^a, Δλ^a_Q = −λ_Q − S_Q^{-1} Λ_Q Δs^a_Q.

Set μ^(Q) := s_Q^T λ_Q / q and σ := (1 − α_a)^3, with
  α_a := argmax{α ∈ [0, 1] : s + α Δs^a ≥ 0, λ + α Δλ^a ≥ 0}.

Compute the centering/corrector direction by solving
  M^(Q) Δx^c = A_Q^T S_Q^{-1} (σμ^(Q) 1 − ΔS^a_Q Δλ^a_Q),
and set Δs^c = A Δx^c, Δλ^c_Q = S_Q^{-1} (−Λ_Q Δs^c_Q + σμ^(Q) 1 − ΔS^a_Q Δλ^a_Q).

Set the mixing parameter γ ∈ (0, 1] (see next slide). Set
  (Δx, Δλ_Q) = (Δx^a, Δλ^a_Q) + γ (Δx^c, Δλ^c_Q).

Update. With α_p, α_d ∈ (0, 1] such that s^+ > 0 and λ^+_Q > 0:
  (x^+, λ^+_Q) = (x, λ_Q) + (α_p Δx, α_d Δλ_Q),
  s^+ = Ax^+ − b,
  λ^+_i = ((s^+_Q)^T (λ^+_Q) / |Q|) / s^+_i,   i ∉ Q.
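A hedged end-to-end sketch of this iteration (slides 9–12), with the working set Q and mixing parameter γ supplied externally and the fraction-to-the-boundary factor tau our own placeholder; the published algorithm's safeguards are omitted:

```python
import numpy as np

def crmpc_step(H, c, A, b, x, lam, Q, gamma=1.0, tau=0.995):
    """One constraint-reduced MPC iteration; Q is an index array."""
    m, _ = A.shape
    q = len(Q)
    s = A @ x - b
    AQ, sQ, lamQ = A[Q], s[Q], lam[Q]

    def max_step(v, dv):
        neg = dv < 0
        return min(1.0, (-v[neg] / dv[neg]).min()) if neg.any() else 1.0

    # Reduced normal matrix and affine-scaling direction.
    MQ = H + AQ.T @ ((lamQ / sQ)[:, None] * AQ)
    dx_a = np.linalg.solve(MQ, -(H @ x + c))
    ds_a = A @ dx_a                          # slacks for ALL m constraints
    dlamQ_a = -lamQ - (lamQ / sQ) * ds_a[Q]
    alpha_a = min(max_step(s, ds_a), max_step(lamQ, dlamQ_a))
    mu_Q = sQ @ lamQ / q
    sigma = (1.0 - alpha_a) ** 3
    # Centering/corrector direction (same reduced matrix M^(Q)).
    w = sigma * mu_Q * np.ones(q) - ds_a[Q] * dlamQ_a
    dx_c = np.linalg.solve(MQ, AQ.T @ (w / sQ))
    ds_c = A @ dx_c
    dlamQ_c = (-lamQ * ds_c[Q] + w) / sQ
    # Mixed direction and damped update.
    dx = dx_a + gamma * dx_c
    dlamQ = dlamQ_a + gamma * dlamQ_c
    alpha_p = tau * max_step(s, A @ dx)
    alpha_d = tau * max_step(lamQ, dlamQ)
    x_new = x + alpha_p * dx
    s_new = A @ x_new - b
    lam_new = np.empty(m)
    lam_new[Q] = lamQ + alpha_d * dlamQ
    not_Q = np.setdiff1d(np.arange(m), Q)
    # Multipliers outside Q: lam_i^+ = (s_Q^+ . lam_Q^+ / |Q|) / s_i^+.
    lam_new[not_Q] = (s_new[Q] @ lam_new[Q] / q) / s_new[not_Q]
    return x_new, lam_new
```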
13 CR-MPC for CQP with Convergence Safeguards

The CR-MPC search direction is given by
  (Δx, Δλ) = (Δx^a, Δλ^a) + γ (Δx^c, Δλ^c),
where the mixing parameter γ
- guarantees that Δx is a descent direction for the primal objective function f (indeed, in the CR context, this is critical to convergence);
- limits the effect of a too-large Δx^c:
  γ := min{ γ_1, τ ‖Δx^a‖ / ‖Δx^c‖, τ ‖Δx^a‖ / (σμ) },
with
  γ_1 := max{ γ ∈ [0, 1] : f(x) − f(x + Δx^a + γ Δx^c) ≥ ζ (f(x) − f(x + Δx^a)) },
where f is the primal objective function and τ, ζ ∈ (0, 1).
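A sketch of one way to realize this rule; the backtracking search for γ_1 and the small numerical floors are our choices, and the third cap follows our reading of the garbled slide formula:

```python
import numpy as np

def mixing_parameter(f, x, dx_a, dx_c, sigma_mu, tau=0.5, zeta=0.5):
    """gamma = min{gamma_1, tau*||dx_a||/||dx_c||, tau*||dx_a||/(sigma*mu)},
    with gamma_1 found by a simple halving search over [0, 1]."""
    target = zeta * (f(x) - f(x + dx_a))   # required fraction of the decrease
    gamma1 = 1.0
    while gamma1 > 1e-12 and f(x) - f(x + dx_a + gamma1 * dx_c) < target:
        gamma1 *= 0.5
    na, nc = np.linalg.norm(dx_a), np.linalg.norm(dx_c)
    return min(1.0, gamma1,
               tau * na / max(nc, 1e-16),
               tau * na / max(sigma_mu, 1e-16))
```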
14 Regularized Normal Matrix

M is invertible iff [H  A^T] has full rank, which can be guaranteed by pre-processing. HOWEVER, nonsingularity of M^(Q) (and indeed unique solvability of the reduced linear systems) requires that the reduced matrix [H  A_Q^T] have full (numerical) rank, which is far from guaranteed.

Regularization: replace M^(Q) with
  M̃^(Q) := W + A_Q^T S_Q^{-1} Λ_Q A_Q,  where W := H + ϱR, with R ≻ 0,
and let ϱ > 0 go to zero as a solution of the optimization problem is approached. The choice ϱ := min{1, E(x, λ)/Ē} and R = I turns out to be adequate. Here E(x, λ) is a measure of the distance to optimality:
  E(x, λ) := ‖(v(x, λ), w(x, λ))‖,
where
  v(x, λ) = Hx + c − A^T λ,
  w_i(x, λ) := min{s_i, λ_i},  i = 1, …, m.
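Both the error measure E and the regularized matrix are direct to code. A sketch, assuming (as the algorithm maintains) s > 0 and λ > 0 at iterates, and with Ē a fixed positive constant:

```python
import numpy as np

def error_measure(H, c, A, b, x, lam):
    """E(x, lam) = ||(v, w)||, v = Hx + c - A^T lam, w_i = min{s_i, lam_i}."""
    s = A @ x - b
    v = H @ x + c - A.T @ lam
    w = np.minimum(s, lam)
    return float(np.linalg.norm(np.concatenate([v, w])))

def regularized_normal_matrix(H, A, s, lam, Q, E_val, E_bar):
    """M~^(Q) = (H + rho*I) + A_Q^T S_Q^{-1} Lambda_Q A_Q with
    rho = min{1, E/E_bar}, so the regularization vanishes near a solution."""
    rho = min(1.0, E_val / E_bar)
    AQ = A[Q]
    W = H + rho * np.eye(H.shape[0])         # R = I
    return W + AQ.T @ ((lam[Q] / s[Q])[:, None] * AQ)
```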
15 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
16 How to select Q?

Various constraint-selection rules have been used in the past. We propose a CONDITION on the selection rule that guarantees convergence of the regularized CR-MPC algorithm. This condition is met by all existing selection rules we are aware of.

Condition (CSR). The constraint-selection rule should be such that:
1. when {(x^k, λ^k)} is bounded away from optimality, Q^k eventually includes every active (primal) constraint at limit points x̄ of {x^k} when such limit points are approached;
2. when {x^k} converges to a primal solution point x*, Q^k eventually includes every active constraint at x*.
17 Some Previously Used Constraint-Selection Rules

Rule JOT [Jung, O'Leary, ALT: Adaptive constraint reduction for training support vector machines, Electronic Transactions on Numerical Analysis, Vol. 31 (2008)]:
  Q := {i : s_i ≤ η},
where η is the q-th smallest slack s_i, q being a certain decreasing function of the duality measure μ that saturates at q = n.

Rule FFK-CWH (for general NLP), proposed in [Chen, Wang, He: A feasible active set QP-free method for nonlinear programming, SIAM J. Optimization, 17(2) (2006)]:
  Q := {i : s_i ≤ E(x, λ)},
based on a result in [Facchinei, Fischer, Kanzow: On the accurate identification of active constraints, SIAM J. Optimization, 9(1) (1998)]:
  ‖(x − x*, λ − λ*)‖ / E(x, λ) is bounded in a neighborhood of (x*, λ*).
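Both rules reduce to one-liners on the slack vector. A sketch (names ours):

```python
import numpy as np

def rule_jot(s, q):
    """Rule JOT: the q constraints with smallest slack (eta = q-th smallest)."""
    eta = np.partition(s, q - 1)[q - 1]
    return np.flatnonzero(s <= eta)

def rule_ffk_cwh(s, E_val):
    """Rule FFK-CWH: constraints whose slack is at most E(x, lam)."""
    return np.flatnonzero(s <= E_val)
```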
18 Proposed Constraint-Selection Rule: Rule R

Parameters: δ > 0, 0 < β < θ < 1.
Input: iteration k; slack variable s^k; error E_k := E(x^k, λ^k); E_min (value of the error E when δ was last reduced); threshold δ_{k−1}.
Output: working set Q^k; threshold δ_k; error E_min.

  if k = 0:
    δ_0 := δ;  E_min := E_0
  else if E_k ≤ β E_min:
    δ_k := θ δ_{k−1};  E_min := E_k
  else:
    δ_k := δ_{k−1}
  Select Q^k := {i ≤ m : s^k_i ≤ δ_k}.

Theorem: Rule R satisfies Condition CSR.
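Rule R keeps a little state (δ_k and E_min) across iterations. A sketch, with β, θ, δ values chosen arbitrarily for illustration:

```python
import numpy as np

class RuleR:
    """Shrink the slack threshold by theta whenever the error has
    dropped below beta * E_min (slide 18)."""
    def __init__(self, delta=1.0, beta=0.5, theta=0.8):
        assert delta > 0 and 0 < beta < theta < 1
        self.delta, self.beta, self.theta = delta, beta, theta
        self.E_min = None

    def select(self, s, E_k):
        if self.E_min is None:                  # iteration k = 0
            self.E_min = E_k
        elif E_k <= self.beta * self.E_min:     # sufficient error decrease
            self.delta *= self.theta
            self.E_min = E_k
        return np.flatnonzero(s <= self.delta)  # Q_k = {i : s_i <= delta_k}
```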
19 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
20–21 Convergence Theorem

Assumptions:
1. The primal strictly feasible set is non-empty; the primal solution set F_P is non-empty and bounded. At every feasible point x, A_{A(x)} has full row rank. (A(x) denotes the active set at x.)
2. There exists a (unique) x* where SOSC with strict complementarity holds, with (unique) multiplier λ*.

Theorem. Suppose that Condition CSR and Assumption 1 hold. Then {x^k} → F_P. Suppose that, in addition, Assumption 2 holds. Then Q^k contains A(x*) for k large enough. [With Rule R or Rule FFK-CWH, Q^k = A(x*) for k large enough.] Further, convergence is Q-quadratic. Specifically, there exists C > 0 such that, given any initial point (x^0, λ^0), there exists k̄ such that, for all k > k̄,
  ‖(x^{k+1} − x*, λ^{k+1} − λ*)‖ ≤ C ‖(x^k − x*, λ^k − λ*)‖^2.
22 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
23 Randomly Generated Problems

Problem setting:
  minimize_{x ∈ R^n}  (1/2) x^T H x + c^T x   subject to  Ax ≥ b.

Entries of A ~ N(0, 1), c ~ N(0, 1), x^0 ~ U(0, 1), and s^0 ~ U(1, 2); b := Ax^0 − s^0.
m := …, and n between 10 and 500.

We consider the following two classes of Hessian matrices:
1. Strongly convex quadratic program: diagonal H, diag(H) ~ U(0, 1).
2. Linear program: H = 0.

We solved 50 randomly generated problems for each class of H and for each problem size, and report the results averaged over the 50 problems.
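The generator is easy to reproduce. A sketch (the slide's fixed value of m did not survive transcription, so m is a parameter here):

```python
import numpy as np

def random_cqp(m, n, lp=False, seed=0):
    """Random instance per slide 23: b := A x0 - s0 makes x0 strictly
    feasible with slack s0 in (1, 2)."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((m, n))
    c = rng.standard_normal(n)
    x0 = rng.uniform(0.0, 1.0, n)
    s0 = rng.uniform(1.0, 2.0, m)
    b = A @ x0 - s0
    # Diagonal H ~ U(0,1) for the strongly convex QP class; H = 0 for the LP class.
    H = np.zeros((n, n)) if lp else np.diag(rng.uniform(0.0, 1.0, n))
    return H, c, A, b, x0
```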
24 Randomly Generated Problems

[Figures: iteration count, size of working set, and computation time versus problem size, for (a) strongly convex QP and (b) LP.]
25 Data Fitting Problems

Regularized minimax data fitting problem:
  minimize_{x̄ ∈ R^n}  ‖Ā x̄ − b̄‖_∞ + (1/(2ᾱ)) x̄^T H̄ x̄,
equivalently
  minimize_{x̄ ∈ R^n, u ∈ R}  u + (1/(2ᾱ)) x̄^T H̄ x̄
  subject to  Ā x̄ − b̄ ≤ u1,  −(Ā x̄ − b̄) ≤ u1.

b̄: noisy data measurement from a target function g.
Ā: trigonometric basis; x̄: expansion coefficients.
H̄: regularization matrix; ᾱ: penalty parameter.
m = …, n from 10 to 500.

For each choice of g and for each problem size, we solved the problem 50 times and report the results averaged over the 50 problems.
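The epigraph reformulation maps mechanically into the (P) format with z = (x̄, u). A sketch (our naming), using the Ax ≥ b convention:

```python
import numpy as np

def minimax_to_cqp(Abar, bbar, Hbar, alpha_bar):
    """Recast min ||Abar x - bbar||_inf + (1/(2 alpha_bar)) x^T Hbar x
    as a CQP in z = (x, u): min u + (1/(2 alpha_bar)) x^T Hbar x
    s.t. u*1 - (Abar x - bbar) >= 0 and u*1 + (Abar x - bbar) >= 0."""
    mbar, n = Abar.shape
    ones = np.ones((mbar, 1))
    A_hat = np.vstack([np.hstack([-Abar, ones]),
                       np.hstack([Abar, ones])])
    b_hat = np.concatenate([-bbar, bbar])
    H_hat = np.zeros((n + 1, n + 1))
    H_hat[:n, :n] = Hbar / alpha_bar       # (1/2) z^T H_hat z = x^T Hbar x / (2 alpha_bar)
    c_hat = np.zeros(n + 1)
    c_hat[n] = 1.0                         # objective contributes the term u
    return H_hat, c_hat, A_hat, b_hat
```

Note the doubled constraint count: each data point yields two inequalities, so m = 2 m̄, which is exactly the many-constraints regime that CR targets.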
26 Data Fitting Problems

[Figures: iteration count, size of working set, and computation time, for (a) g(t) = sin(10t) cos(25t²) and (b) g(t) = sin(5t³) cos²(10t).]
27 Outline
- Mehrotra's Predictor/Corrector (MPC) for CQP
- Constraint-Reduced MPC for CQP
- Constraint Selection
- Convergence Theorem
- Numerical Results
- Conclusions
28–32 Conclusions

- A convergent, constraint-reduced (CR) variant of Mehrotra's Predictor/Corrector for convex quadratic programming was stated and analyzed.
- A regularization scheme was used to account for CR-triggered rank deficiency away from solutions.
- A class of constraint-selection rules was defined by means of a sufficient condition (Condition CSR) that guarantees strong convergence properties for the resulting algorithm.
- A new selection rule was proposed, based on a modified version of an active-constraint identification function due to Facchinei et al.
- Numerical results were reported that show the benefit of CR on problems with many inequality constraints, and the power of the newly proposed selection rule.

The slides are available from andre