Hot-Starting NLP Solvers
1 Hot-Starting NLP Solvers

Andreas Wächter, Department of Industrial Engineering and Management Sciences, Northwestern University
2014 Mixed Integer Programming Workshop, Ohio State University, Columbus, OH, July 21, 2014

Andreas Wächter (NU) Hot-starting NLP MIP 2014 1 / 33
4 Motivation: Branch and Bound

MILP
- Very efficient enumeration of LP nodes with Simplex
- Strong branching and diving
- Change one bound at a time
- Dual Simplex can rapidly solve the new instance
- Reason: an existing factorization of the basis matrix can be reused!
- Can speed up MILP branch-and-bound significantly

MINLP
- Generally, NLP solvers cannot use hot starts
- Derivative matrices change with each iterate
- Factorization of old matrices useless?

Question: Can we hot-start NLP solvers, reusing existing factorizations?
5 Sequential Quadratic Programming (SQP)

    min_{x ∈ R^n} f(x)   s.t.  c(x) = 0,  x ≥ 0

At (x_k, λ_k), the step d_k is computed from the quadratic/linear model QP(x_k, λ_k):

    min_{d ∈ R^n}  ∇f(x_k)^T d + ½ d^T W_k d
    s.t.  c_k + ∇c(x_k)^T d = 0    (multipliers λ_{k+1})
          x_k + d ≥ 0

where W_k = ∇²f(x_k) + Σ_j λ_k^(j) ∇²c^(j)(x_k). (Assume W_k ≻ 0.)

x_{k+1} = x_k + d_k, with λ_{k+1} from the QP (usually combined with a line search).
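The basic SQP step above can be sketched in a few lines. The following toy example (hypothetical, not from the talk) minimizes the distance from the point (2, 1) to the unit circle; bounds are ignored, so each SQP step is a Newton step on the KKT conditions, obtained by solving the QP's KKT system directly.

```python
# Toy SQP iteration for min (x1-2)^2 + (x2-1)^2  s.t.  x1^2 + x2^2 - 1 = 0.
# Each step solves the KKT system of QP(x_k, lambda_k); bounds are omitted.

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][-1] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

x, lam = [1.0, 1.0], 0.0
for _ in range(30):
    g = [2.0 * (x[0] - 2.0), 2.0 * (x[1] - 1.0)]   # gradient of f
    c = x[0] ** 2 + x[1] ** 2 - 1.0                # constraint value
    a = [2.0 * x[0], 2.0 * x[1]]                   # gradient of c
    w = 2.0 + 2.0 * lam                            # Hessian of Lagrangian = w * I
    K = [[w, 0.0, a[0]],                           # [ W_k  A_k^T ]
         [0.0, w, a[1]],                           # [ A_k   0    ]
         [a[0], a[1], 0.0]]
    d0, d1, lam = solve_linear(K, [-g[0], -g[1], -c])
    x = [x[0] + d0, x[1] + d1]                     # x_{k+1} = x_k + d_k
    if max(abs(d0), abs(d1)) < 1e-12:
        break

# x converges to (2, 1)/sqrt(5), lam to sqrt(5) - 1
```

The KKT matrix rebuilt here in every iteration is exactly the object whose refactorization the rest of the talk tries to avoid.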
7 Solving QPs in SQP

    QP(x_k, λ_k):  min_{d ∈ R^n} ½ d^T W_k d + g_k^T d
                   s.t.  A_k d + c_k = 0,  d ≥ l_k

A_k = ∇c(x_k)^T and W_k = ∇²L(x_k, λ_k) depend on the iterate (x_k, λ_k).
However, for closely related NLPs and subsequent SQP iterations, the QP data (specifically A_k and W_k) and the active set do not change much.

Goal: Develop a method for solving QP(x_k, λ_k) (approximately) that efficiently reuses information from a previously solved QP(x̄, λ̄).
8 Active Set QP Solver

    min_{d ∈ R^n} ½ d^T W_k d + g_k^T d   s.t.  A_k d + c_k = 0

Compute the solution from the linear system (assume W_k ≻ 0):

    [ W_k  A_k^T ] ( d )   ( -g_k )
    [ A_k   0    ] ( λ ) = ( -c_k )
11 Active Set QP Solver

    min_{d ∈ R^n} ½ d^T W_k d + g_k^T d   s.t.  A_k d + c_k = 0,  d ≥ l_k

A ⊆ {1, ..., n}: guess of the active set, i.e., d^A = l_k^A.
Compute the solution from the linear system (assume W_k ≻ 0):

    [ W_k  A_k^T  (I_A)^T ] ( d   )   ( -g_k  )
    [ A_k   0       0     ] ( λ   ) = ( -c_k  )
    [ I_A   0       0     ] ( μ^A )   ( l_k^A )

A is optimal if d ≥ l_k and μ^A ≥ 0.
If A is not optimal, add or remove a variable from A and update the factorization of the KKT matrix.
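The optimality test for a guessed active set can be illustrated on a toy problem (hypothetical, with the sign convention W d + g + A^T λ − Σ_{j∈A} μ_j e_j = 0 and μ ≥ 0 at optimality). This brute-force check is not the solver's updating scheme; it only shows how one guess is solved and verified.

```python
# Check a guessed active set for min 0.5*||d||^2 + g^T d  s.t. d1 + d2 = 0, d >= 0.

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][-1] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

g = [-2.0, 1.0]          # QP gradient
l = [0.0, 0.0]           # lower bounds
a = [1.0, 1.0]           # single equality row A, with c = 0

def try_active_set(active):
    """Solve the extended KKT system with d_j = l_j fixed for j in `active`."""
    m = 3 + len(active)              # d (2) + lam (1) + one mu per active bound
    K = [[0.0] * m for _ in range(m)]
    rhs = [0.0] * m
    for i in range(2):               # stationarity: d_i + a_i*lam - mu_i = -g_i
        K[i][i] = 1.0                # W = identity
        K[i][2] = a[i]
        rhs[i] = -g[i]
    K[2][0], K[2][1] = a[0], a[1]    # equality constraint row (c = 0)
    for k, j in enumerate(active):
        K[j][3 + k] = -1.0           # -mu_j enters stationarity row j
        K[3 + k][j] = 1.0            # fix d_j = l_j
        rhs[3 + k] = l[j]
    sol = solve_linear(K, rhs)
    d, lam, mu = sol[:2], sol[2], sol[3:]
    ok = all(d[j] >= l[j] - 1e-9 for j in range(2)) and all(v >= -1e-9 for v in mu)
    return ok, d, lam, mu

print(try_active_set([]))    # empty guess: d = (1.5, -1.5) violates d2 >= 0
print(try_active_set([1]))   # bound on d2 active: optimal, d = (0, 0), mu = 3
```

In the real algorithm the guess is updated one variable at a time, with a corresponding update of the KKT factorization rather than a fresh solve.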
15 Warm Start vs. Hot Start

Warm Start:

    min ½ d^T W̄ d + ḡ^T d  s.t. Ā d + c̄ = 0, d ≥ l̄
    →  min ½ d^T W_k d + g_k^T d  s.t. A_k d + c_k = 0, d ≥ l_k

Use the previous active set as starting guess; factorize with the new matrix data.

Hot Start:

    min ½ d^T W̄ d + ḡ^T d  s.t. Ā d + c̄ = 0, d ≥ l̄
    →  min ½ d^T W̄ d + g_k^T d  s.t. Ā d + c_k = 0, d ≥ l_k

Use the previous active set as starting guess; reuse the factorization, since the matrix data is unchanged.
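The payoff of a hot start is that only right-hand sides change, so the expensive factorization is computed once. A minimal sketch (toy data; Doolittle LU without pivoting, which assumes nonzero pivots and is a simplification of what a real KKT solver does):

```python
# Hot-start sketch: factor the reference KKT matrix once, then reuse the
# factors for new right-hand sides (new g_k, c_k with the same matrix data).

def lu_factor(A):
    """Doolittle LU without pivoting; assumes nonzero pivots (toy setting)."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U

def lu_solve(L, U, b):
    n = len(b)
    y = [0.0] * n
    for i in range(n):                    # forward substitution L y = b
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):        # back substitution U x = y
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

# Reference KKT matrix [W A^T; A 0] for W = 2*I, A = [1 1]
K = [[2.0, 0.0, 1.0],
     [0.0, 2.0, 1.0],
     [1.0, 1.0, 0.0]]
L, U = lu_factor(K)                       # O(n^3), done once

# Each hot start costs only two triangular solves, O(n^2):
d0, d1, lam = lu_solve(L, U, [2.0, 2.0, 1.0])    # g = (-2,-2), c = -1
# -> d = (0.5, 0.5), lam = 1
d0, d1, lam = lu_solve(L, U, [4.0, 2.0, 0.0])    # g = (-4,-2), c = 0
# -> d = (0.5, -0.5), lam = 3
```

A warm start, by contrast, would call `lu_factor` again because W_k and A_k have changed.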
20 Hot-Started NLP Solver in Strong Branching

1. Solve the root node NLP, and store the state of QP(x̄, λ̄)
2. SQP for each NLP_i^±:
   - restore the QP(x̄, λ̄) state
   - solve each QP_i^±(x_k, λ_k) using several hot starts of QP(x̄, λ̄)

Assumption: Several hot starts of QP(x̄, λ̄) are less work than a warm start for QP_i^±(x_k, λ_k).

[Diagram: branching on x_i at the root node creates NLP_i^− and NLP_i^+; each is solved by SQP, generating the subproblems QP_i^±(x_0, λ_0), QP_i^±(x_1, λ_1), ..., QP_i^±(x_k, λ_k).]
23 Solve Sequence of Closely Related QPs

Want an algorithm for solving a sequence of closely related QPs,

    min ½ d^T W̄ d + ḡ^T d  s.t. Ā d + c̄ = 0, d ≥ l̄     (reference QP)
    min ½ d^T W d + g^T d   s.t. A d + c = 0,  d ≥ l      (new QP)

using hot starts.

- Iterative; computes increasingly accurate solutions
- The NLP algorithm needs to be able to handle inexact solutions
- E.g., SLQP solver with inexact QP solves [Curtis, Johnson, Robinson, Wächter 2014]

Other application: real-time optimal control
- Solve a QP or NLP at each time step
- Data for the process changes only slowly
24 Equality Constrained QPs

Recall: the solution of  min_{d ∈ R^n} ½ d^T W d + g^T d  s.t. A d + c = 0
is computed from the linear system

    [ W  A^T ] ( d )   ( -g )
    [ A   0  ] ( λ ) = ( -c )

Assumptions: W is positive definite on the null space of A, and A has full rank. Then the KKT matrix is non-singular.
27 Iterative Refinement for Linear Systems

Want: M d = b. Have a factorization of M̄, so solving with M̄ is cheap.

Compute a correction p:
- Want: M(d + p) = b, equivalently M p = b − M d
- Cheap: solve M̄ p = b − M d
- Update: d ← d + p

Iterative Refinement Algorithm:
- Choose d_0
- For i = 0, 1, ...
  - Solve M̄ p_i = b − M d_i
  - Update d_{i+1} = d_i + p_i

This fixed-point iteration converges if σ(I − M̄^{-1} M) < 1.
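The algorithm above can be sketched directly. In this toy example (hypothetical data) the "factorized" reference matrix M̄ is simply the diagonal of M, so solves with M̄ are trivial, and the contraction condition holds because the spectral radius of I − M̄⁻¹M is about 0.29:

```python
# Fixed-point iterative refinement: solve M d = b using only cheap solves
# with an approximation M_bar (here the diagonal of M, standing in for a
# factorized reference KKT matrix).

def refine(M, solve_approx, b, iters=40):
    n = len(b)
    d = [0.0] * n
    for _ in range(iters):
        # residual r = b - M d
        r = [b[i] - sum(M[i][j] * d[j] for j in range(n)) for i in range(n)]
        p = solve_approx(r)               # cheap solve with M_bar
        d = [d[i] + p[i] for i in range(n)]
    return d

M = [[4.0, 1.0],
     [1.0, 3.0]]
solve_diag = lambda r: [r[0] / 4.0, r[1] / 3.0]   # "factorization" of diag(M)

d = refine(M, solve_diag, [1.0, 2.0])
# converges to the exact solution (1/11, 7/11)
```

Each iteration costs one matrix-vector product with M and one cheap solve with M̄, which is exactly the cost profile the hot-started QP method aims for.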
29 Iterative Refinement Applied to Optimality Conditions

Apply to

    [ W  A^T ] ( d )   ( -g )
    [ A   0  ] ( λ ) = ( -c )

In each iteration, solve

    [ W̄  Ā^T ] ( p_i   )   ( -g )   ( W d_i + A^T λ_i )
    [ Ā   0  ] ( p_i^λ ) = ( -c ) − ( A d_i           )

and update (d_{i+1}, λ_{i+1}) = (d_i, λ_i) + (p_i, p_i^λ).

Equivalently, solve a QP to get (p_i, p_i^λ):

    min_p  ½ p^T W̄ p + (g + W d_i + A^T λ_i)^T p
    s.t.   Ā p + (c + A d_i) = 0    (multipliers p_i^λ)
32 Iterative Refinement for QPs

Solve

    min_d ½ d^T W d + g^T d   s.t.  A d + c = 0,  d ≥ l

by computing a sequence of iterative refinement steps from

    min_p  ½ p^T W̄ p + (g + W d_i + A^T λ_i)^T p
    s.t.   Ā p + (c + A d_i) = 0    (multipliers p_i^λ)
           p + d_i ≥ l

Update (d_{i+1}, λ_{i+1}) = (d_i, λ_i) + (p_i, p_i^λ).
37 Comments

    min_p  ½ p^T W̄ p + (g + W d_i + A^T λ_i)^T p
    s.t.   Ā p + (c + A d_i) = 0,  p + d_i ≥ l    (multipliers p_i^λ)

- Uses hot starts for QP(W̄, Ā) and products with W and A
- If strict complementarity holds and (d_i, λ_i) → (d*, λ*):
  - the QP eventually identifies the correct active set
  - the method becomes iterative refinement for the equality-only case
- No need to track bound multipliers
  - Key consequence: the QP solver is responsible for handling the active set
  - Can use a black-box QP solver
- Locally convergent (contraction, strict complementarity, ...)
- But: slow linear convergence rate
40 Accelerated Iterative Linear Solver

Can apply an iterative linear solver (e.g., SQMR) to solve

    [ W  A^T ] ( d )   ( -g )
    [ A   0  ] ( λ ) = ( -c )

Application of the preconditioner requires the solution of

    [ W̄  Ā^T ] ( z_i   )   ( -r_i^g )
    [ Ā   0  ] ( z_i^λ ) = ( -r_i^c )

Preconditioner QP (hot start!):

    min_z  ½ z^T W̄ z + (r_i^g)^T z
    s.t.   Ā z + r_i^c = 0    (multipliers z_i^λ)
41 Inequality Constraints and SQMR

- An iterative linear solver cannot handle inequality constraints
- Monitor the active set A_i = {j : d_i^(j) = l^(j)}
- If A_i = A_{i−1} =: A during iterative refinement, set F = A^C
- Apply SQMR in the space of free variables (d_i^A = l^A fixed):

    [ W^FF  (A^F)^T ] ( d^F )   ( -g^F )
    [ A^F     0     ] ( λ   ) = ( -c   )

- The preconditioner requires the solution of

    [ W̄^FF  (Ā^F)^T ] ( z   )   ( -r_g^F )
    [ Ā^F     0     ] ( z^λ ) = ( -r_c   )
44 Preconditioner QP

    [ W^FF  (A^F)^T ] ( d^F )   ( -g^F )
    [ A^F     0     ] ( λ   ) = ( -c   )

Preconditioner:

    [ W̄^FF  (Ā^F)^T ] ( z   )   ( -r_g^F )
    [ Ā^F     0     ] ( z^λ ) = ( -r_c   )

Could solve:

    min_z  ½ z^T W̄ z + (r_g^F)^T z^F
    s.t.   Ā z + r_c = 0    (multipliers z_i^λ)
           z^(j) = 0 for j ∈ A

Issues:
- As SQMR converges, we might get d^F violating l^F; stop SQMR if d_i ≥ l fails
- Negative bound multipliers if the guess of A is too large
48 Preconditioner QP

    min_z  ½ z^T W̄ z + (r_g^F)^T z^F + (g^A + W^{A,·} d_i + (A^A)^T λ_i)^T z^A
    s.t.   Ā z + r_c = 0
           z^(j) ≥ 0 for j ∈ A

If the guess A is too large:
- then eventually z^(j) > 0 for some j ∈ A
- in that case, stop SQMR

For fixed A:
- Convergence does not depend on a contraction condition
- Convergence in a finite number of iterations
52 Algorithm Summary

Two phases:
1. Iterative refinement
   - Solve the iterative refinement QP
   - Switch if the active set is the same in two consecutive iterations
2. Accelerated linear solver (SQMR)
   - Solve the preconditioning QP
   - Switch back if z^(j) > 0 for some j ∈ A, or d_{i+1} fails d ≥ l

- Requires one hot-started QP solve per iteration
- Requires only matrix-vector products with the new matrices
- Can use any black-box parametric QP solver
- Locally convergent, assuming:
  - strict complementarity
  - a starting point close to (d*, λ*)
  - all QPs are feasible
  - all KKT matrices are non-singular
  - (no contraction condition needed with SQMR)
53 Numerical Experiments

- Matlab implementation: iqp
- QP solver: qpOASES [Ferreau et al., 2008; Potschka et al., 2010]
  - Open-source parametric QP solver in C++
  - Originally dense linear algebra
  - Added a sparse Schur complement approach (e.g., [Gill et al., 1990]) (with Dennis Janka)
- Comparing iqp with qpOASES
- CPU time measurements only include qpOASES time
- Caveat: the sparse version is not yet optimized
54 Experiment 1: Randomly Generated Convex NLPs

Convex NLP with random perturbation (dense matrices):

    min_{x ∈ R^n}  ½ x^T H^0 x + (q^0)^T x + r^0
    s.t.  ½ x^T H^j x + (q^j)^T x + r^j ≤ 0,  j = 1, ..., m
          x ≥ 0

- n ∈ {50, 200, 500, 1000}
- m ∈ {0.2n, 0.5n, 0.8n, 1.5n}
- σ ∈ {0.01, 0.05, 0.1, 0.2}: standard deviation of the perturbation
- Solve sequences of 10 perturbed NLPs (3 for n = 1000)
- Solve the first NLP, store the final QP as the reference QP
- SLQP implemented in p-sqp [Curtis]
- QPs solved to tight tolerance (10^-8 to 10^-12)
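A generator for such test problems can be sketched as follows. This is an illustrative reconstruction, not the talk's actual code: H = B^T B + I guarantees a symmetric positive definite Hessian, and the perturbation multiplies each entry by (1 + σ·N(0,1)) and re-symmetrizes; for small σ the perturbed Hessian typically, though not provably, stays positive definite.

```python
# Hypothetical sketch of a random convex test-problem generator with
# multiplicative perturbation of standard deviation sigma.
import random

def random_convex_quadratic(n, seed=0):
    rng = random.Random(seed)
    B = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(n)]
    # H = B^T B + I is symmetric positive definite: x^T H x = ||Bx||^2 + ||x||^2
    H = [[sum(B[k][i] * B[k][j] for k in range(n)) + (1.0 if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    q = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return H, q

def perturb(H, q, sigma, seed=1):
    rng = random.Random(seed)
    n = len(q)
    Hp = [[H[i][j] * (1.0 + sigma * rng.gauss(0.0, 1.0)) for j in range(n)]
          for i in range(n)]
    # re-symmetrize; small sigma keeps the perturbed problem close to the reference
    Hp = [[0.5 * (Hp[i][j] + Hp[j][i]) for j in range(n)] for i in range(n)]
    qp = [qi * (1.0 + sigma * rng.gauss(0.0, 1.0)) for qi in q]
    return Hp, qp

H, q = random_convex_quadratic(4)
Hp, qp = perturb(H, q, sigma=0.05)
```

Each perturbed problem is then solved by SQP, hot-starting from the stored reference QP of the first problem.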
55 Randomly Generated Convex NLPs: Results

                                       Size of perturbation σ
                                     0.01     0.05     0.1      0.2
    Successfully solved             99.3%    99.3%    97.7%    42.9%
    Avrg # iqp iters per SQP iter   [values not recoverable]
    Avrg change in # active ineq     5.3%     9.0%    25.7%    32.4%
    Avrg change in # active bounds   3.4%     9.5%    13.8%    19.2%
56 Randomly Generated Convex NLPs: Results

[Figure: log-scale charts of the ratios CPU qpOASES / CPU iqp and MatVec qpOASES / MatVec iqp, for m = 0.2n and m = 0.8n, over σ ∈ {0.01, 0.05, 0.1, 0.2} and n ∈ {50, 200, 500, 1000}.]
57 Experiment 2: Model Predictive Control

Controlling a hanging chain:
- From the starting position (left)
- Control the red ball to bring the chain to the steady state (right) as quickly as possible
58 Experiment 2: Model Predictive Control

Controlling a hanging chain (DAE model):

    min_{x(·), u(·)}  ∫_0^T ( Σ_{i=1}^{N_PM} w_v ‖v_i(t)‖² + w_x ‖x_{N_PM}(t) − x_e‖² + w_u ‖u(t)‖² + w_sl u_sl(t)² ) dt
    s.t.  ẋ_i(t) = v_i(t)                                  t ∈ [0, T], i < N_PM
          v̇_i(t) = (F_{i+1}(t) − F_i(t)) N_PM / m − g      t ∈ [0, T], i < N_PM
          ẋ_{N_PM}(t) = u(t)                               t ∈ [0, T]
          x(0) = x̂_0
          u(t) ∈ [−1, 1]³                                  t ∈ [0, T]
          x_low ≤ x(t) ≤ x_high,  x_{3,low} − u_sl(t) ≤ x_3(t),  u_sl(t) ≥ 0    t ∈ [0, T]

- Solve one QP per time step
- Reference QP obtained from the steady state
- Multiple shooting approach:
  - Integrate the differential equations over subintervals
  - Computation of the full derivative matrices is expensive
- Here, the performance metric is the number of matrix-vector products
59 Model Predictive Control: Results

5 multiple shooting intervals:

[Table: for each chain length N_PM, the QP size (n, m), success rate, and the min/mean/max number of matrix-vector products per solve versus a full evaluation of A_k; the numeric entries were not recoverable from the transcription.]

10 multiple shooting intervals:

[Table: same layout for the longer horizon; numeric entries not recoverable.]
60 Experiment 3: MINLP Strong Branching

- Strong branching on the root node
- Basic SQP method (no line search)
- Get approximate solutions from iqp
- Fails if a QP or NLP is infeasible

Instances from the IBM-CMU collection:
- Convex problems, but singular Hessians
- Few nonlinearities
- Very sparse

Limitations of the current qpOASES code:
- Only a heuristic for handling Hessians that are not positive definite
- Failures due to numerical issues
- Refactorization triggered by ill-conditioned KKT matrices
- Too many failures for the CLay and FLay instances

Results are very preliminary.
61 Strong Branching Results: Batch (scaled)

- 3 instances total
- 0 instances excluded because no NLP was solved successfully
- 0 instances excluded because more than 30% of the NLP subproblems required refactorization for iqp
- 3 instances in the averages below

[Table: % solved, % both solved, % factorized, SQP iterations, inner iterations (6.36), and QP pivots for iqp and qpOASES; most entries not recoverable.]

Ratios iqp / qpOASES (averaged over all NLP subproblems of an instance):

    CPU 1.07          QP pivots 2.9           Fact 0.0
    total CPU 1.36    total QP pivots 1.92    total Fact 0.03
62 Strong Branching Results: RSyn

- 48 instances total
- 2 instances excluded because no NLP was solved successfully
- 27 instances excluded because more than 30% of the NLP subproblems required refactorization for iqp
- 19 instances in the averages below

[Table: % solved, % both solved, % factorized, SQP iterations, inner iterations (4.82), and QP pivots for iqp and qpOASES; most entries not recoverable.]

Ratios iqp / qpOASES (averaged over all NLP subproblems of an instance):

    CPU 1.03          QP pivots 1.58          Fact 0.7
    total CPU 0.84    total QP pivots 1.34    total Fact 0.6
63 Strong Branching Results: SLay

- 4 instances total
- 0 instances excluded because no NLP was solved successfully
- 0 instances excluded because more than 30% of the NLP subproblems required refactorization for iqp
- 4 instances in the averages below

[Table: % solved, % both solved, % factorized, SQP iterations, inner iterations (1.00), and QP pivots for iqp and qpOASES; most entries not recoverable.]

Ratios iqp / qpOASES (averaged over all NLP subproblems of an instance):

    CPU 0.73          QP pivots 0.88          Fact 0.00
    total CPU 0.62    total QP pivots 0.8     total Fact 0.00
64 Strong Branching Results: Syn

- 48 instances total
- 9 instances excluded because no NLP was solved successfully
- 6 instances excluded because more than 30% of the NLP subproblems required refactorization for iqp
- 33 instances in the averages below

[Table: % solved, % both solved, % factorized, SQP iterations, inner iterations (2.93), and QP pivots for iqp and qpOASES; most entries not recoverable.]

Ratios iqp / qpOASES (averaged over all NLP subproblems of an instance):

    CPU 1.96          QP pivots 3.20          Fact 0.04
    total CPU 2.2     total QP pivots 3.45    total Fact 0.06
65 Experiments Summary

- Dense random NLPs
  - Factorization time dominates solve time
  - Significant speedup
- Optimal control
  - Reduces expensive derivative calculations
- Strong branching
  - Results very preliminary
  - Sparse version of qpOASES not yet mature and robust; an advanced, tailored implementation would be faster
  - Encouraging observations: solved many of the NLP subproblems; when numerical problems are limited, similar performance
- Other MINLP problems
  - More speedup for denser problems?
  - Still robust for problems with more nonlinearities?
66 Conclusion

- New QP solver, using repeated hot starts for:
  - iterative refinement
  - preconditioning in an accelerated linear solver
- Somewhat encouraging preliminary results for MINLP
- Potential improvements:
  - Employ a robust parametric QP solver
  - Use the matrix data of the new QP when updating the factorization, or when refactorization is necessary
- In the MINLP context: makes NLP-based branch-and-bound more attractive?
- Useful in other situations?
67 THANK YOU!
More informationLecture 13: Constrained optimization
2010-12-03 Basic ideas A nonlinearly constrained problem must somehow be converted relaxed into a problem which we can solve (a linear/quadratic or unconstrained problem) We solve a sequence of such problems
More informationMS&E 318 (CME 338) Large-Scale Numerical Optimization
Stanford University, Management Science & Engineering (and ICME) MS&E 318 (CME 338) Large-Scale Numerical Optimization 1 Origins Instructor: Michael Saunders Spring 2015 Notes 9: Augmented Lagrangian Methods
More informationAn SR1/BFGS SQP algorithm for nonconvex nonlinear programs with block-diagonal Hessian matrix
Math. Prog. Comp. (2016) 8:435 459 DOI 10.1007/s12532-016-0101-2 FULL LENGTH PAPER An SR1/BFGS SQP algorithm for nonconvex nonlinear programs with block-diagonal Hessian matrix Dennis Janka 1 Christian
More informationNumerical Optimal Control Overview. Moritz Diehl
Numerical Optimal Control Overview Moritz Diehl Simplified Optimal Control Problem in ODE path constraints h(x, u) 0 initial value x0 states x(t) terminal constraint r(x(t )) 0 controls u(t) 0 t T minimize
More informationConstrained optimization: direct methods (cont.)
Constrained optimization: direct methods (cont.) Jussi Hakanen Post-doctoral researcher jussi.hakanen@jyu.fi Direct methods Also known as methods of feasible directions Idea in a point x h, generate a
More information5.5 Quadratic programming
5.5 Quadratic programming Minimize a quadratic function subject to linear constraints: 1 min x t Qx + c t x 2 s.t. a t i x b i i I (P a t i x = b i i E x R n, where Q is an n n matrix, I and E are the
More informationSF2822 Applied Nonlinear Optimization. Preparatory question. Lecture 9: Sequential quadratic programming. Anders Forsgren
SF2822 Applied Nonlinear Optimization Lecture 9: Sequential quadratic programming Anders Forsgren SF2822 Applied Nonlinear Optimization, KTH / 24 Lecture 9, 207/208 Preparatory question. Try to solve theory
More information23. Cutting planes and branch & bound
CS/ECE/ISyE 524 Introduction to Optimization Spring 207 8 23. Cutting planes and branch & bound ˆ Algorithms for solving MIPs ˆ Cutting plane methods ˆ Branch and bound methods Laurent Lessard (www.laurentlessard.com)
More informationSurvey of NLP Algorithms. L. T. Biegler Chemical Engineering Department Carnegie Mellon University Pittsburgh, PA
Survey of NLP Algorithms L. T. Biegler Chemical Engineering Department Carnegie Mellon University Pittsburgh, PA NLP Algorithms - Outline Problem and Goals KKT Conditions and Variable Classification Handling
More informationOptimization Problems with Constraints - introduction to theory, numerical Methods and applications
Optimization Problems with Constraints - introduction to theory, numerical Methods and applications Dr. Abebe Geletu Ilmenau University of Technology Department of Simulation and Optimal Processes (SOP)
More informationMultidisciplinary System Design Optimization (MSDO)
Multidisciplinary System Design Optimization (MSDO) Numerical Optimization II Lecture 8 Karen Willcox 1 Massachusetts Institute of Technology - Prof. de Weck and Prof. Willcox Today s Topics Sequential
More informationIE418 Integer Programming
IE418: Integer Programming Department of Industrial and Systems Engineering Lehigh University 2nd February 2005 Boring Stuff Extra Linux Class: 8AM 11AM, Wednesday February 9. Room??? Accounts and Passwords
More informationOn the Implementation of an Interior-Point Algorithm for Nonlinear Optimization with Inexact Step Computations
On the Implementation of an Interior-Point Algorithm for Nonlinear Optimization with Inexact Step Computations Frank E. Curtis Lehigh University Johannes Huber University of Basel Olaf Schenk University
More informationThe use of second-order information in structural topology optimization. Susana Rojas Labanda, PhD student Mathias Stolpe, Senior researcher
The use of second-order information in structural topology optimization Susana Rojas Labanda, PhD student Mathias Stolpe, Senior researcher What is Topology Optimization? Optimize the design of a structure
More informationDevelopment of the new MINLP Solver Decogo using SCIP - Status Report
Development of the new MINLP Solver Decogo using SCIP - Status Report Pavlo Muts with Norman Breitfeld, Vitali Gintner, Ivo Nowak SCIP Workshop 2018, Aachen Table of contents 1. Introduction 2. Automatic
More informationPenalty and Barrier Methods General classical constrained minimization problem minimize f(x) subject to g(x) 0 h(x) =0 Penalty methods are motivated by the desire to use unconstrained optimization techniques
More informationSolving LP and MIP Models with Piecewise Linear Objective Functions
Solving LP and MIP Models with Piecewise Linear Obective Functions Zonghao Gu Gurobi Optimization Inc. Columbus, July 23, 2014 Overview } Introduction } Piecewise linear (PWL) function Convex and convex
More informationAn Introduction to Algebraic Multigrid (AMG) Algorithms Derrick Cerwinsky and Craig C. Douglas 1/84
An Introduction to Algebraic Multigrid (AMG) Algorithms Derrick Cerwinsky and Craig C. Douglas 1/84 Introduction Almost all numerical methods for solving PDEs will at some point be reduced to solving A
More informationInexact Newton-Type Optimization with Iterated Sensitivities
Inexact Newton-Type Optimization with Iterated Sensitivities Downloaded from: https://research.chalmers.se, 2019-01-12 01:39 UTC Citation for the original published paper (version of record: Quirynen,
More informationConvex Optimization. Newton s method. ENSAE: Optimisation 1/44
Convex Optimization Newton s method ENSAE: Optimisation 1/44 Unconstrained minimization minimize f(x) f convex, twice continuously differentiable (hence dom f open) we assume optimal value p = inf x f(x)
More informationInteger Programming. Wolfram Wiesemann. December 6, 2007
Integer Programming Wolfram Wiesemann December 6, 2007 Contents of this Lecture Revision: Mixed Integer Programming Problems Branch & Bound Algorithms: The Big Picture Solving MIP s: Complete Enumeration
More informationCE 191: Civil and Environmental Engineering Systems Analysis. LEC 05 : Optimality Conditions
CE 191: Civil and Environmental Engineering Systems Analysis LEC : Optimality Conditions Professor Scott Moura Civil & Environmental Engineering University of California, Berkeley Fall 214 Prof. Moura
More informationA GLOBALLY CONVERGENT STABILIZED SQP METHOD
A GLOBALLY CONVERGENT STABILIZED SQP METHOD Philip E. Gill Daniel P. Robinson July 6, 2013 Abstract Sequential quadratic programming SQP methods are a popular class of methods for nonlinearly constrained
More informationInterior-Point Methods as Inexact Newton Methods. Silvia Bonettini Università di Modena e Reggio Emilia Italy
InteriorPoint Methods as Inexact Newton Methods Silvia Bonettini Università di Modena e Reggio Emilia Italy Valeria Ruggiero Università di Ferrara Emanuele Galligani Università di Modena e Reggio Emilia
More informationBlock Structured Preconditioning within an Active-Set Method for Real-Time Optimal Control
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Block Structured Preconditioning within an Active-Set Method for Real-Time Optimal Control Quirynen, R.; Knyazev, A.; Di Cairano, S. TR2018-081
More informationMODIFYING SQP FOR DEGENERATE PROBLEMS
PREPRINT ANL/MCS-P699-1097, OCTOBER, 1997, (REVISED JUNE, 2000; MARCH, 2002), MATHEMATICS AND COMPUTER SCIENCE DIVISION, ARGONNE NATIONAL LABORATORY MODIFYING SQP FOR DEGENERATE PROBLEMS STEPHEN J. WRIGHT
More informationLine Search Methods for Unconstrained Optimisation
Line Search Methods for Unconstrained Optimisation Lecture 8, Numerical Linear Algebra and Optimisation Oxford University Computing Laboratory, MT 2007 Dr Raphael Hauser (hauser@comlab.ox.ac.uk) The Generic
More informationOn Perspective Functions, Vanishing Constraints, and Complementarity Programming
On Perspective Functions, Vanishing Constraints, and Complementarity Programming Fast Mixed-Integer Nonlinear Feedback Control Christian Kirches 1, Sebastian Sager 2 1 Interdisciplinary Center for Scientific
More information4TE3/6TE3. Algorithms for. Continuous Optimization
4TE3/6TE3 Algorithms for Continuous Optimization (Algorithms for Constrained Nonlinear Optimization Problems) Tamás TERLAKY Computing and Software McMaster University Hamilton, November 2005 terlaky@mcmaster.ca
More informationLecture 15: SQP methods for equality constrained optimization
Lecture 15: SQP methods for equality constrained optimization Coralia Cartis, Mathematical Institute, University of Oxford C6.2/B2: Continuous Optimization Lecture 15: SQP methods for equality constrained
More informationCHAPTER 2: QUADRATIC PROGRAMMING
CHAPTER 2: QUADRATIC PROGRAMMING Overview Quadratic programming (QP) problems are characterized by objective functions that are quadratic in the design variables, and linear constraints. In this sense,
More informationNumerical optimization
Numerical optimization Lecture 4 Alexander & Michael Bronstein tosca.cs.technion.ac.il/book Numerical geometry of non-rigid shapes Stanford University, Winter 2009 2 Longest Slowest Shortest Minimal Maximal
More informationInterior Point Algorithms for Constrained Convex Optimization
Interior Point Algorithms for Constrained Convex Optimization Chee Wei Tan CS 8292 : Advanced Topics in Convex Optimization and its Applications Fall 2010 Outline Inequality constrained minimization problems
More informationOn sequential optimality conditions for constrained optimization. José Mario Martínez martinez
On sequential optimality conditions for constrained optimization José Mario Martínez www.ime.unicamp.br/ martinez UNICAMP, Brazil 2011 Collaborators This talk is based in joint papers with Roberto Andreani
More informationSequential Quadratic Programming Method for Nonlinear Second-Order Cone Programming Problems. Hirokazu KATO
Sequential Quadratic Programming Method for Nonlinear Second-Order Cone Programming Problems Guidance Professor Masao FUKUSHIMA Hirokazu KATO 2004 Graduate Course in Department of Applied Mathematics and
More informationTime-Optimal Automobile Test Drives with Gear Shifts
Time-Optimal Control of Automobile Test Drives with Gear Shifts Christian Kirches Interdisciplinary Center for Scientific Computing (IWR) Ruprecht-Karls-University of Heidelberg, Germany joint work with
More informationNonlinear optimization
Nonlinear optimization Anders Forsgren Optimization and Systems Theory Department of Mathematics Royal Institute of Technology (KTH) Stockholm, Sweden evita Winter School 2009 Geilo, Norway January 11
More informationIP-PCG An interior point algorithm for nonlinear constrained optimization
IP-PCG An interior point algorithm for nonlinear constrained optimization Silvia Bonettini (bntslv@unife.it), Valeria Ruggiero (rgv@unife.it) Dipartimento di Matematica, Università di Ferrara December
More informationALADIN An Algorithm for Distributed Non-Convex Optimization and Control
ALADIN An Algorithm for Distributed Non-Convex Optimization and Control Boris Houska, Yuning Jiang, Janick Frasch, Rien Quirynen, Dimitris Kouzoupis, Moritz Diehl ShanghaiTech University, University of
More informationNumerical optimization. Numerical optimization. Longest Shortest where Maximal Minimal. Fastest. Largest. Optimization problems
1 Numerical optimization Alexander & Michael Bronstein, 2006-2009 Michael Bronstein, 2010 tosca.cs.technion.ac.il/book Numerical optimization 048921 Advanced topics in vision Processing and Analysis of
More informationComputational Finance
Department of Mathematics at University of California, San Diego Computational Finance Optimization Techniques [Lecture 2] Michael Holst January 9, 2017 Contents 1 Optimization Techniques 3 1.1 Examples
More informationMS&E 318 (CME 338) Large-Scale Numerical Optimization
Stanford University, Management Science & Engineering (and ICME) MS&E 318 (CME 338) Large-Scale Numerical Optimization Instructor: Michael Saunders Spring 2015 Notes 11: NPSOL and SNOPT SQP Methods 1 Overview
More informationComputational Optimization. Augmented Lagrangian NW 17.3
Computational Optimization Augmented Lagrangian NW 17.3 Upcoming Schedule No class April 18 Friday, April 25, in class presentations. Projects due unless you present April 25 (free extension until Monday
More informationLecture 16: October 22
0-725/36-725: Conve Optimization Fall 208 Lecturer: Ryan Tibshirani Lecture 6: October 22 Scribes: Nic Dalmasso, Alan Mishler, Benja LeRoy Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer:
More informationNetwork Flows. 6. Lagrangian Relaxation. Programming. Fall 2010 Instructor: Dr. Masoud Yaghini
In the name of God Network Flows 6. Lagrangian Relaxation 6.3 Lagrangian Relaxation and Integer Programming Fall 2010 Instructor: Dr. Masoud Yaghini Integer Programming Outline Branch-and-Bound Technique
More informationSEQUENTIAL QUADRATIC PROGAMMING METHODS FOR PARAMETRIC NONLINEAR OPTIMIZATION
SEQUENTIAL QUADRATIC PROGAMMING METHODS FOR PARAMETRIC NONLINEAR OPTIMIZATION Vyacheslav Kungurtsev Moritz Diehl July 2013 Abstract Sequential quadratic programming (SQP) methods are known to be efficient
More informationCONSTRAINED NONLINEAR PROGRAMMING
149 CONSTRAINED NONLINEAR PROGRAMMING We now turn to methods for general constrained nonlinear programming. These may be broadly classified into two categories: 1. TRANSFORMATION METHODS: In this approach
More information12. Interior-point methods
12. Interior-point methods Convex Optimization Boyd & Vandenberghe inequality constrained minimization logarithmic barrier function and central path barrier method feasibility and phase I methods complexity
More informationBlock Condensing with qpdunes
Block Condensing with qpdunes Dimitris Kouzoupis Rien Quirynen, Janick Frasch and Moritz Diehl Systems control and optimization laboratory (SYSCOP) TEMPO summer school August 5, 215 Dimitris Kouzoupis
More informationQuiz Discussion. IE417: Nonlinear Programming: Lecture 12. Motivation. Why do we care? Jeff Linderoth. 16th March 2006
Quiz Discussion IE417: Nonlinear Programming: Lecture 12 Jeff Linderoth Department of Industrial and Systems Engineering Lehigh University 16th March 2006 Motivation Why do we care? We are interested in
More informationMPC Infeasibility Handling
MPC Handling Thomas Wiese, TU Munich, KU Leuven supervised by H.J. Ferreau, Prof. M. Diehl (both KUL) and Dr. H. Gräb (TUM) October 9, 2008 1 / 42 MPC General MPC Strategies 2 / 42 Linear Discrete-Time
More informationA Trust-region-based Sequential Quadratic Programming Algorithm
Downloaded from orbit.dtu.dk on: Oct 19, 2018 A Trust-region-based Sequential Quadratic Programming Algorithm Henriksen, Lars Christian; Poulsen, Niels Kjølstad Publication date: 2010 Document Version
More informationISM206 Lecture Optimization of Nonlinear Objective with Linear Constraints
ISM206 Lecture Optimization of Nonlinear Objective with Linear Constraints Instructor: Prof. Kevin Ross Scribe: Nitish John October 18, 2011 1 The Basic Goal The main idea is to transform a given constrained
More informationInterior Point Methods. We ll discuss linear programming first, followed by three nonlinear problems. Algorithms for Linear Programming Problems
AMSC 607 / CMSC 764 Advanced Numerical Optimization Fall 2008 UNIT 3: Constrained Optimization PART 4: Introduction to Interior Point Methods Dianne P. O Leary c 2008 Interior Point Methods We ll discuss
More informationThe Lifted Newton Method and Its Use in Optimization
The Lifted Newton Method and Its Use in Optimization Moritz Diehl Optimization in Engineering Center (OPTEC), K.U. Leuven, Belgium joint work with Jan Albersmeyer (U. Heidelberg) ENSIACET, Toulouse, February
More informationTrajectory Planning and Collision Detection for Robotics Applications
Trajectory Planning and Collision Detection for Robotics Applications joint work with Rene Henrion, Dietmar Homberg, Chantal Landry (WIAS) Institute of Mathematics and Applied Computing Department of Aerospace
More informationFrom structures to heuristics to global solvers
From structures to heuristics to global solvers Timo Berthold Zuse Institute Berlin DFG Research Center MATHEON Mathematics for key technologies OR2013, 04/Sep/13, Rotterdam Outline From structures to
More informationMS&E 318 (CME 338) Large-Scale Numerical Optimization
Stanford University, Management Science & Engineering (and ICME) MS&E 318 (CME 338) Large-Scale Numerical Optimization Instructor: Michael Saunders Spring 2015 Notes 4: The Primal Simplex Method 1 Linear
More informationDELFT UNIVERSITY OF TECHNOLOGY
DELFT UNIVERSITY OF TECHNOLOGY REPORT -09 Computational and Sensitivity Aspects of Eigenvalue-Based Methods for the Large-Scale Trust-Region Subproblem Marielba Rojas, Bjørn H. Fotland, and Trond Steihaug
More informationLinear algebra issues in Interior Point methods for bound-constrained least-squares problems
Linear algebra issues in Interior Point methods for bound-constrained least-squares problems Stefania Bellavia Dipartimento di Energetica S. Stecco Università degli Studi di Firenze Joint work with Jacek
More informationarxiv:cs/ v1 [cs.ms] 13 Aug 2004
tsnnls: A solver for large sparse least squares problems with non-negative variables Jason Cantarella Department of Mathematics, University of Georgia, Athens, GA 30602 Michael Piatek arxiv:cs/0408029v
More informationImproved quadratic cuts for convex mixed-integer nonlinear programs
Improved quadratic cuts for convex mixed-integer nonlinear programs Lijie Su a,b, Lixin Tang a*, David E. Bernal c, Ignacio E. Grossmann c a Institute of Industrial and Systems Engineering, Northeastern
More informationDEPARTMENT OF STATISTICS AND OPERATIONS RESEARCH OPERATIONS RESEARCH DETERMINISTIC QUALIFYING EXAMINATION. Part I: Short Questions
DEPARTMENT OF STATISTICS AND OPERATIONS RESEARCH OPERATIONS RESEARCH DETERMINISTIC QUALIFYING EXAMINATION Part I: Short Questions August 12, 2008 9:00 am - 12 pm General Instructions This examination is
More informationProcess Model Formulation and Solution, 3E4
Process Model Formulation and Solution, 3E4 Section B: Linear Algebraic Equations Instructor: Kevin Dunn dunnkg@mcmasterca Department of Chemical Engineering Course notes: Dr Benoît Chachuat 06 October
More informationLinear Programming: Simplex
Linear Programming: Simplex Stephen J. Wright 1 2 Computer Sciences Department, University of Wisconsin-Madison. IMA, August 2016 Stephen Wright (UW-Madison) Linear Programming: Simplex IMA, August 2016
More informationNumerical Methods for Embedded Optimization and Optimal Control. Exercises
Summer Course Numerical Methods for Embedded Optimization and Optimal Control Exercises Moritz Diehl, Daniel Axehill and Lars Eriksson June 2011 Introduction This collection of exercises is intended to
More informationProximal Newton Method. Ryan Tibshirani Convex Optimization /36-725
Proximal Newton Method Ryan Tibshirani Convex Optimization 10-725/36-725 1 Last time: primal-dual interior-point method Given the problem min x subject to f(x) h i (x) 0, i = 1,... m Ax = b where f, h
More information