Nonlinear Optimization Solvers

ICE05, Argonne, July 19, 2005

Nonlinear Optimization Solvers

Sven Leyffer, leyffer@mcs.anl.gov
Mathematics & Computer Science Division, Argonne National Laboratory

Outline: Nonlinear Optimization Methods / Optimization Software / Troubleshooting / Conclusions

NONLINEAR OPTIMIZATION METHODS

Overview: Nonlinear Optimization Methods

Smooth nonlinear programming (NLP) problem:

  minimize_x  f(x)          (objective)
  subject to  c(x) = 0      (constraints)
              x >= 0        (variables)

Solution methods:
1. active-set methods: SQP & projected gradient
2. interior-point methods

... both converge fast near a solution x*
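
As a concrete illustration (not part of the original slides), the standard form above can be handed to any general NLP solver. The sketch below uses SciPy's SLSQP, an SQP-type method, on an assumed toy problem with one equality constraint and nonnegativity bounds; the objective, constraint, and data are invented for the example.

import numpy as np
from scipy.optimize import minimize

def f(x):                          # objective f(x)
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

def c(x):                          # equality constraint, c(x) = 0
    return np.array([x[0] + x[1] - 2.0])

x0 = np.array([0.5, 0.5])          # starting point
res = minimize(f, x0, method="SLSQP",
               constraints=[{"type": "eq", "fun": c}],
               bounds=[(0.0, None)] * 2)   # x >= 0
print(res.x, res.fun)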

Sequential Quadratic Programming (SQP)

General NLP:  minimize_x f(x)  subject to  c(x) = 0  &  x >= 0

REPEAT
  1. Solve the QP for (s, y_{k+1}, z_{k+1}):
       minimize_s   grad f_k^T s + (1/2) s^T H_k s
       subject to   c_k + grad c_k^T s = 0     (multipliers y_{k+1})
                    x_k + s >= 0               (multipliers z_{k+1} >= 0)
  2. Set x_{k+1} = x_k + s  and  k = k + 1
UNTIL convergence

... estimates the active set A_k := {i : x_i = 0}
... quadratic convergence near x* once A* is identified
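
A minimal sketch (not from the slides) of the local SQP iteration for an assumed toy problem, ignoring the bounds x >= 0: with only equality constraints each QP subproblem reduces to a linear KKT system, and taking the full step gives the fast local Newton-like convergence mentioned above. All functions and data below are invented for the example.

import numpy as np

def f(x):        return x[0]**2 + x[1]**2              # objective
def grad_f(x):   return np.array([2*x[0], 2*x[1]])
def c(x):        return np.array([x[0] + x[1] - 1.0])  # c(x) = 0
def jac_c(x):    return np.array([[1.0, 1.0]])         # one row per constraint
def hess_L(x, y): return np.diag([2.0, 2.0])           # Hessian of the Lagrangian, H_k

x, y = np.array([5.0, -3.0]), np.zeros(1)
for k in range(20):
    H, A = hess_L(x, y), jac_c(x)
    # QP optimality (KKT) system:  [H  A^T; A  0] [s; y_new] = [-grad f; -c]
    K   = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    rhs = np.concatenate([-grad_f(x), -c(x)])
    sol = np.linalg.solve(K, rhs)
    s, y = sol[:2], sol[2:]
    x = x + s                       # full step; globalization is discussed later
    if np.linalg.norm(s) < 1e-10:
        break
print(x)                            # converges to (0.5, 0.5)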

Interior Point Methods (IPM)

General NLP:  minimize_x f(x)  subject to  c(x) = 0  &  x >= 0

Perturbed (µ > 0) optimality conditions, with x, z >= 0:

                  [ grad f(x) - grad c(x)^T y - z ]
  F_µ(x, y, z) =  [ c(x)                          ]  =  0
                  [ X z - µ e                     ]

... primal-dual formulation
... central path {x(µ), y(µ), z(µ) : µ > 0}
... apply Newton's method for a sequence µ -> 0

Interior Point Methods (IPM)

REPEAT
  1. Choose µ_k -> 0
  2. Solve F_{µ_k}(x, y, z) = 0 to accuracy O(µ_k), i.e. take Newton steps

       [ grad^2 L_k     -grad c(x_k)^T   -I  ] [ dx ]
       [ grad c(x_k)         0            0  ] [ dy ]  =  -F_{µ_k}(x_k, y_k, z_k)
       [ Z_k                 0           X_k ] [ dz ]

     where X_k is the diagonal matrix formed from x_k (and Z_k from z_k)
  3. Set x_{k+1} = x_k + dx, ..., and k = k + 1
UNTIL convergence

... SQP & IPM are variants of Newton's method
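
A minimal sketch (not from the slides) of this primal-dual Newton iteration on an assumed toy problem with one equality constraint and x >= 0, using a simple fraction-to-the-boundary rule to keep x and z strictly positive; a real IPM would control µ and the step far more carefully.

import numpy as np

def grad_f(x):    return 2.0 * (x - 2.0)               # f(x) = ||x - 2||^2
def c(x):         return np.array([x[0] + x[1] - 1.0])
def jac_c(x):     return np.array([[1.0, 1.0]])
def hess_L(x, y): return np.diag([2.0, 2.0])

n, m = 2, 1
x, y, z = np.array([0.8, 0.8]), np.zeros(m), np.ones(n)
mu = 1.0
for k in range(40):
    # residual F_mu(x, y, z) = [grad f - (jac c)^T y - z;  c(x);  X z - mu e]
    F = np.concatenate([grad_f(x) - jac_c(x).T @ y - z, c(x), x * z - mu])
    X, Z = np.diag(x), np.diag(z)
    J = np.block([[hess_L(x, y), -jac_c(x).T,       -np.eye(n)      ],
                  [jac_c(x),      np.zeros((m, m)),  np.zeros((m, n))],
                  [Z,             np.zeros((n, m)),  X               ]])
    d = np.linalg.solve(J, -F)
    dx, dy, dz = d[:n], d[n:n + m], d[n + m:]
    # fraction-to-the-boundary: keep x and z strictly positive
    alpha = 1.0
    for v, dv in ((x, dx), (z, dz)):
        neg = dv < 0
        if neg.any():
            alpha = min(alpha, 0.995 * np.min(-v[neg] / dv[neg]))
    x, y, z = x + alpha * dx, y + alpha * dy, z + alpha * dz
    mu = max(0.2 * mu, 1e-12)       # drive the barrier parameter towards 0
print(x)                            # roughly (0.5, 0.5)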

GLOBAL CONVERGENCE

Overview: Global Convergence

SQP & IPM converge quadratically near a solution ... but can diverge far from the solution.

Convergence from remote starting points:
1. merit / penalty function to measure progress
2. enforce descent in the merit function by ...
   2.1 line search: backtrack on the SQP step, x_k + α dx for α = 1, 1/2, ...
   2.2 trust region: restrict the step, ||dx|| <= ρ_k

... interfere as little as possible with SQP/IPM
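
A minimal sketch (not from the slides) of option 2.1, assuming the l1 merit function Φ(x; π) = f(x) + π ||c(x)||_1 and a plain decrease test; a practical code would use a sufficient-decrease condition instead.

import numpy as np

def merit(x, f, c, pi):
    """l1 merit function Phi(x; pi) = f(x) + pi * ||c(x)||_1."""
    return f(x) + pi * np.linalg.norm(c(x), 1)

def backtrack(x, dx, f, c, pi, alpha0=1.0, shrink=0.5, max_halvings=30):
    """Halve alpha until the merit function decreases; return the accepted alpha."""
    phi0 = merit(x, f, c, pi)
    alpha = alpha0
    for _ in range(max_halvings):
        if merit(x + alpha * dx, f, c, pi) < phi0:   # plain decrease test
            return alpha
        alpha *= shrink                              # alpha = 1, 1/2, 1/4, ...
    return 0.0                                       # no acceptable step found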

Penalty & Merit Functions

Augmented Lagrangian (LANCELOT & MINOS):

  minimize_x  L(x, y_k, ρ_k) = f(x) - y_k^T c(x) + (1/2) ρ_k ||c(x)||^2

... as y_k -> y*: x_k -> x* for ρ_k > ρ̄ (a finite threshold)
... no ill-conditioning, improves convergence rate

Exact penalty function, with penalty parameter π > 0:

  minimize_x  Φ(x, π) = f(x) + π ||c(x)||

... equivalence of optimality: exact for π > ||y*||_D
... nonsmooth => Sl1QP method
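
A minimal sketch (not from the slides) of the augmented-Lagrangian idea on an assumed toy problem, using SciPy's BFGS as the inner unconstrained solver and the first-order multiplier update y_{k+1} = y_k - ρ_k c(x_{k+1}); this shows only the flavour of what LANCELOT/MINOS do, not their actual algorithms.

import numpy as np
from scipy.optimize import minimize

def f(x): return x[0]**2 + x[1]**2                  # objective
def c(x): return np.array([x[0] + x[1] - 1.0])      # c(x) = 0

def aug_lag(x, y, rho):
    """L(x, y, rho) = f(x) - y^T c(x) + (rho/2) ||c(x)||^2."""
    cx = c(x)
    return f(x) - y @ cx + 0.5 * rho * cx @ cx

x, y, rho = np.array([3.0, -1.0]), np.zeros(1), 10.0
for k in range(10):
    x = minimize(lambda v: aug_lag(v, y, rho), x, method="BFGS").x
    y = y - rho * c(x)              # first-order multiplier update
    if np.linalg.norm(c(x)) < 1e-6:
        break
    rho *= 2.0                      # increase the penalty if still infeasible
print(x, y)                         # x -> (0.5, 0.5), y -> 1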

Trust Region Methods

Globalize SQP (and IPM) by adding a trust region with radius Δ_k > 0:

  minimize_s   grad f_k^T s + (1/2) s^T H_k s
  subject to   c_k + A_k^T s = 0,   x_k + s >= 0,   ||s|| <= Δ_k

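A minimal sketch (not from the slides) of the usual trust-region bookkeeping that goes with this subproblem: compare actual and predicted reduction of the merit function and adjust the radius. The thresholds below are common but assumed values.

import numpy as np

def tr_update(phi, phi_new, pred_red, s, delta, eta=0.1):
    """Return (accept_step, new_radius) from the ratio of actual to predicted reduction."""
    rho = (phi - phi_new) / max(pred_red, 1e-16)
    if rho < eta:                                      # model and merit disagree
        return False, 0.5 * np.linalg.norm(s)          # reject step, shrink radius
    if rho > 0.75 and np.linalg.norm(s) >= 0.99 * delta:
        return True, 2.0 * delta                       # very good step: expand radius
    return True, delta                                 # accept, keep radius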

OPTIMIZATION SOFTWARE

Overview: Some Optimization Software

Augmented Lagrangian:
- LANCELOT: bound constrained; trust-region
- MINOS: linearly constrained; line-search
- PENNON: line-search or trust-region

Sequential quadratic programming:
- FILTER: trust-region; no penalty function
- KNITRO: trust-region; SLP-EQP ... option "alg=3"
- SNOPT: line-search; l1 exact penalty function

Interior point methods:
- KNITRO: trust-region; SQP on barrier problem
- LOQO: line-search; diagonal perturbation
- IPOPT: line-search; no penalty function

Rule of Thumb for Solver Choice

Subjective rules of thumb ...
- trust-region more robust than line-search
- IPM faster than SQP ... but ... SQP more robust than IPM
- projected gradient is best for bound-constrained optimization
- IPM may fail if the feasible set is empty

Solver Options in AMPL

1. Setting solver options in AMPL:
     option snopt_options "outlev=2 \
       timing=1";
2. Useful solver options:
     outlev      output level
     timing      report solution times
     maxiter     max. number of iterations
     mu          initial barrier parameter
     compl_frm   form of the complementarity equation
3. Check with each solver for option names, e.g.
     knitro -=
   (from a unix prompt) to see the list of options

Life Cycle Savings

param T integer;           # number of periods
param beta, default 0.9;   # utility parameter
param r, default 0.2;      # interest rate
param w{1..T} >= 0;        # wages in period t
var S{0..T} >= 0;          # savings in period t
var c{1..T} >= 0;          # consumption in period t

maximize utility: sum{t in 1..T} beta^t * (-exp(-c[t]));

subject to budget{t in 0..T-1}: S[t+1] = (1+r)*S[t] + w[t+1] - c[t+1];
Cinit: S[0] = 0;
Cend:  S[T] = 0;

Solver Output for SNOPT

 Major Minors    Step   nobj  Feasible   Optimal       Objective     ns
     0     14            1              3.0E-02   -2.7667695E+00     14 r
     1      2  6.8E-01   2              1.7E-02   -2.7087027E+00     13 n rl
     2      1  1.0E+00   3              1.1E-02   -2.6701545E+00     13 s
     8      1  1.0E+00   9             (2.3E-07)  -2.6404137E+00     13

 EXIT -- optimal solution found

 Problem name                 problem
 No. of iterations                 28   Objective value      -2.6404137441E+00
 No. of major iterations            8   Linear objective      0.0000000000E+00
 Penalty parameter          0.000E+00   Nonlinear objective  -2.6404137441E+00
 No. of calls to funobj            10   No. of calls to funcon             0
 No. of superbasics                13   No. of basic nonlinears           20
 No. of degenerate steps            5   Percentage                     17.86
 Norm of x   (scaled)         6.4E+00   Norm of pi  (scaled)         1.0E+00
 Norm of x                    3.6E+00   Norm of pi                   1.0E+00
 Max Prim inf(scaled)       0 0.0E+00   Max Dual inf(scaled)      25 5.4E-07
 Max Primal infeas          0 0.0E+00   Max Dual infeas           25 5.4E-07

Solver Output for LOQO

LOQO 6.06: outlev=2
variables:   non-neg 49,  free 0,  bdd 0,  total 49
constraints: eq 25,  ineq 0,  ranged 0,  total 25
nonzeros:    A 73,  Q 25
-----------------------------------------------------------------------------
                 Primal                  Dual
  Iter     Obj Value    Infeas     Obj Value    Infeas   Sig Fig  Status P M
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    1   -8.353892e+00  2.7e-01  -8.353892e+00  2.9e-01   60
nonzeros: L 151, arith_ops 880
    2   -4.629009e+00  1.3e-01  -1.193999e+01  1.5e-01
   10   -2.642555e+00  3.5e-06  -2.638877e+00  1.1e-04    3
   11   -2.640877e+00  4.7e-07  -2.640111e+00  2.4e-05    4   PF
   14   -2.640414e+00  6.5e-10  -2.640412e+00  1.2e-07    6   PF DF
   15   -2.640414e+00  3.6e-11  -2.640414e+00  1.2e-08    7   PF DF
----------------------
OPTIMAL SOLUTION FOUND; Finished call

LOQO 6.06: optimal solution (15 iterations, 15 evaluations)
primal objective -2.640413785
  dual objective -2.640413644

TROUBLESHOOTING

Something Under the Bed is Drooling

1. floating point (IEEE) exceptions
2. unbounded problems
   2.1 unbounded objective
   2.2 unbounded multipliers
3. (locally) inconsistent problems
4. suboptimal solutions

... identify the problem & suggest remedies

Floating Point (IEEE) Exceptions

Bad example: minimize a barrier function

param mu default 1;
var x{1..2} >= -10, <= 10;
var s;
minimize barrier: x[1]^2 + x[2]^2 - mu*log(s);
subject to cons: s = x[1] + x[2]^2 - 1;

... results in an error message like "Cannot evaluate objective at start"
... remedy: change the initialization of s:  var s := 1;
... more difficult if the IEEE exception occurs during the solve ...

Unbounded Objective

Penalized MPEC with π = 1:

  minimize_x  x_1^2 + x_2^2 - 4 x_1 x_2 + π x_1 x_2
  subject to  x_1, x_2 >= 0

... unbounded below for all π < 2
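
To see why (a short check, not on the original slide), evaluate the objective along the ray x_1 = x_2 = t >= 0:

  f(t, t) = t^2 + t^2 - 4 t^2 + π t^2 = (π - 2) t^2  ->  -∞  as  t -> ∞  whenever  π < 2.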

Unbounded Multipliers for MPECs

A nasty MPEC example ... multipliers do not exist ...

var z{1..2} >= 0;
var z3;
minimize objf: z[1] + z[2] - z3;
subject to lin1: -4 * z[1] + z3 <= 0;
lin2: -4 * z[2] + z3 <= 0;
compl: z[1]*z[2] <= 0;

... solution: (z[1], z[2], z3) = (0, 0, 0) with objf = 0

Unbounded Multipliers: LOQO

LOQO 6.06: outlev=2
                Primal                  Dual
  Iter     Obj Value    Infeas     Obj Value    Infeas
- - - - - - - - - - - - - - - - - - - - - - - - - - - -
    1    1.000000e+00  0.0e+00   0.000000e+00  1.1e+00
    2    6.902180e-01  2.2e-01  -2.672676e-01  2.6e-01
    3    2.773222e-01  1.6e-01  -3.051049e-01  1.1e-01
  292   -8.213292e-05  1.7e-09  -4.106638e-05  9.1e-07
  293   -8.202525e-05  1.7e-09  -4.101255e-05  9.1e-07
  294    nan            nan      nan            nan
  500    nan            nan      nan            nan

nan = not a number ... disaster ... switch solver

Unbounded Multipliers: FILTER

  iter |   rho      |    ||d||     |    f / hj       |  ||c|| / hjt
 ------+------------+--------------+-----------------+----------------
   0:0    10.0000      0.00000        1.0000000         0.0000000
   1:1    10.0000      1.00000        0.0000000         0.0000000
  24:1    0.156250     0.196695E-05  -0.98347664E-06    0.24180657E-12
  25:1    0.156250     0.983477E-06  -0.49173832E-06    0.60451644E-13

 Norm of KKT residual ......... 0.471404521
 max( lam_i * a_i ) ........... 2.06155281
 Largest modulus multiplier ... 2711469.25

... large multiplier on z[1]*z[2] <= 0: the multipliers are unbounded
... NLP solvers then converge only linearly
... luckily this behaviour is very rare

Unbounded Complementarity Multiplier

Multiplier ξ of X_1 x_2 <= 0 becomes large => MPEC multipliers (B-stationary)

Remedy:
1. check the residuals of the constraints (if large, then the solve failed)
2. find the degenerate indices D := {i : x_1i + x_2i < ɛ}
3. fix x_1i = 0 or x_2i = 0 for all i in D

In the previous example:  solve;  fix z[1] := 0;  solve;

Prove optimality: examine all 2^|D| combinations of x_1i = 0 or x_2i = 0
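
A minimal sketch (not from the slides) of steps 2-3 of the remedy, assuming the two sides of the complementarity pairs are available as NumPy arrays x1, x2 from a previous solve; each branch would then be fixed in the model and re-solved.

import itertools
import numpy as np

def degenerate_indices(x1, x2, eps=1e-6):
    """Indices i with x1_i + x2_i < eps, i.e. both sides (near) zero."""
    return np.flatnonzero(x1 + x2 < eps)

def branches(D):
    """All 2^|D| ways of fixing x1_i = 0 or x2_i = 0 for i in D."""
    for choice in itertools.product((1, 2), repeat=len(D)):
        yield list(zip(D, choice))      # (i, 1): fix x1_i = 0;  (i, 2): fix x2_i = 0

# For each branch: fix the chosen variables to zero in the model (as in
# "fix z[1] := 0;" above), re-solve, and keep the best feasible solution.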

Locally Inconsistent Problems

NLP may have no feasible point.
[Figure: the feasible set is the intersection of two circles, which here is empty]

Locally Inconsistent Problems

NLP may have no feasible point:

var x{1..2} >= -1;
minimize objf: -1000*x[2];
subject to con1: (x[1]+2)^2 + x[2]^2 <= 1;
con2: (x[1]-2)^2 + x[2]^2 <= 1;

... not all solvers recognize this
... finding a feasible point is a global optimization problem

Locally Inconsistent Problems: LOQO

                Primal                  Dual
  Iter     Obj Value    Infeas     Obj Value    Infeas
- - - - - - - - - - - - - - - - - - - - - - - - - - - -
    1   -1.000000e+03  4.2e+00  -6.000000e+00  1.0e-00
 [...]
  500    2.312535e-04  7.9e-01   1.715213e+12  1.5e-01

LOQO 6.06: iteration limit

... fails to converge ... not useful for the user
... dual unbounded => primal infeasible

Locally Inconsistent Problems: FILTER

  iter |   rho    |    ||d||      |   f / hj    |  ||c|| / hjt
 ------+----------+---------------+-------------+----------------
   0:0   10.0000    0.00000        -1000.0000     16.000000
   1:1   10.0000    2.00000        -1000.0000     8.0000000
 [...]
   8:2   2.00000    0.320001E-02    7.9999693     0.10240052E-04
   9:2   2.00000    0.512000E-05    8.0000000     0.26214586E-10

filtersqp: Nonlinear constraints locally infeasible

... fast convergence to a minimum of the infeasibility
... identify the blocking constraints ... modify the model/data

Locally Inconsistent Problems

Remedies for locally infeasible problems:
1. check your model: print constraints & residuals, e.g.
     solve;
     display _conname, _con.lb, _con.body, _con.ub;
     display _varname, _var.lb, _var, _var.ub;
   ... look at violated and active constraints
2. try different nonlinear solvers (easy with AMPL)
3. build up the model from a few constraints at a time
4. try different starting points ... global optimization

Inconsistent Problems: Sequential Approach

Example heart6.mod; SNOPT: nonlinear infeasibilities minimized ... after 549 iters

model heart6.mod;            # load model
problem heart614: obj,       # dummy objective
  a, c, t, u, v, w,          # variables
  cons1, cons2, cons3,       # partial ...
  cons4;                     # ... constraints
solve heart614;              # solve partial model

... partial solution after 6 iterations

Inconsistent Problems: Sequential Approach

Example heart6.mod; SNOPT: nonlinear infeasibilities minimized ... after 549 iters

# ... continue from previous slide
problem heart616: obj,       # dummy objective
  a, c, t, u, v, w,          # variables
  cons1, cons2, cons3,       # all ...
  cons4, cons5, cons6;       # ... constraints
solve heart616;              # solve full model

... optimal solution after 6+20 iterations ... excellent

Suboptimal Solution & Multi-start

Problems can have many local minimizers.
The default start point converges to a local minimizer.

Suboptimal Solution & Multi-start

Problems can have many local minimizers:

param pi := 3.1416;
param n integer, >= 0, default 2;
set N := 1..n;
var x{N} >= 0, <= 32*pi, := 1;
minimize objf: - sum{i in N} x[i]*sin(sqrt(x[i]));

... the default start point converges to a local minimizer

Suboptimal Solution & Multi-start

model schwefel.mod;            # load model
param nD := 5;                 # discretization
set D := 1..nD;
param hd := 32*pi/(nD-1);
param optval{D,D};

for {i in D} {
  let x[1] := (i-1)*hd;
  for {j in D} {
    let x[2] := (j-1)*hd;
    solve;
    let optval[i,j] := objf;
  };  # end for
};  # end for

Suboptimal Solution & Multi-start

display optval;
optval [*,*]
:        1          2           3          4          5        :=
1        0         24.003     -36.29     -50.927     56.909
2       24.003     -7.8906    -67.580    -67.580    -67.580
3      -36.29     -67.5803   -127.27    -127.27    -127.27
4      -50.927    -67.5803   -127.27    -127.27    -127.27
5       56.909    -67.5803   -127.27    -127.27    -127.27
;

... there exist better multi-start procedures

CONCLUSIONS

Conclusions

Nonlinear optimization is much harder than LP:
- there is no best solver; the best solver is problem dependent
- learning a little about solvers/methods helps with troubleshooting
- multi-start & model build-up are easy to script in AMPL
- please write good models ...