Introducing PENLAB a MATLAB code for NLP-SDP
1 Introducing PENLAB, a MATLAB code for NLP-SDP
Michal Kočvara, School of Mathematics, The University of Birmingham
jointly with
Jan Fiala, Numerical Algorithms Group
Michael Stingl, University of Erlangen-Nürnberg
Toulouse, July 2013
2 PENNON collection
PENNON (PENalty methods for NONlinear optimization): a collection of codes for NLP, SDP and BMI; one algorithm to rule them all.
READY:
- PENNLP: AMPL, MATLAB, C/Fortran
- PENSDP: MATLAB/YALMIP, SDPA, C/Fortran
- PENBMI: MATLAB/YALMIP, C/Fortran
(relatively) NEW:
- PENNON (NLP + SDP): extended AMPL, MATLAB, C/Fortran
3 The problem
Optimization problems with nonlinear objective subject to nonlinear inequality and equality constraints and semidefinite bound constraints:

$$\begin{aligned}
\min_{x \in \mathbb{R}^n,\; Y_1 \in \mathbb{S}^{p_1}, \ldots, Y_k \in \mathbb{S}^{p_k}} \quad & f(x, Y) \\
\text{subject to} \quad & g_i(x, Y) \le 0, \quad i = 1, \ldots, m_g \\
& h_i(x, Y) = 0, \quad i = 1, \ldots, m_h \\
& \underline{\lambda}_i I \preceq Y_i \preceq \overline{\lambda}_i I, \quad i = 1, \ldots, k
\end{aligned} \tag{NLP-SDP}$$
4 The algorithm
Based on penalty/barrier functions $\varphi_g : \mathbb{R} \to \mathbb{R}$ and $\Phi_P : \mathbb{S}^p \to \mathbb{S}^p$:

$$g_i(x) \le 0 \iff p_i\,\varphi_g(g_i(x)/p_i) \le 0, \qquad Z \preceq 0 \iff \Phi_P(Z) \preceq 0, \quad Z \in \mathbb{S}^p.$$

Augmented Lagrangian of (NLP-SDP):

$$F(x,Y,u,U,\overline{U},p) = f(x,Y) + \sum_{i=1}^{m_g} u_i p_i\,\varphi_g(g_i(x,Y)/p_i) + \sum_{i=1}^{k} \big\langle U_i, \Phi_P(\underline{\lambda}_i I - Y_i)\big\rangle + \sum_{i=1}^{k} \big\langle \overline{U}_i, \Phi_P(Y_i - \overline{\lambda}_i I)\big\rangle;$$

here $u \in \mathbb{R}^{m_g}$ and $U_i$, $\overline{U}_i$ are Lagrange multipliers.
5 The algorithm
A generalized Augmented Lagrangian algorithm (based on R. Polyak '92, Ben-Tal and Zibulevsky '94, Stingl '05):

Given $x^1, Y^1, u^1, U^1, \overline{U}^1$; $p_i^1 > 0$, $i = 1,\ldots,m_g$, and $P^1 > 0$. For $k = 1, 2, \ldots$ repeat until a stopping criterion is reached:

(i) Find $x^{k+1}$ and $Y^{k+1}$ such that $\big\|\nabla_x F(x^{k+1}, Y^{k+1}, u^k, U^k, \overline{U}^k, p^k)\big\| \le K$

(ii) $u_i^{k+1} = u_i^k\,\varphi_g'(g_i(x^{k+1})/p_i^k)$, $i = 1,\ldots,m_g$
$U_i^{k+1} = D_A \Phi_P\big((\underline{\lambda}_i I - Y_i); U_i^k\big)$, $i = 1,\ldots,k$
$\overline{U}_i^{k+1} = D_A \Phi_P\big((Y_i - \overline{\lambda}_i I); \overline{U}_i^k\big)$, $i = 1,\ldots,k$

(iii) $p_i^{k+1} < p_i^k$, $i = 1,\ldots,m_g$; $P^{k+1} < P^k$.
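To make the update scheme concrete, here is a minimal one-dimensional sketch in Python (not PENLAB code): it runs steps (i)-(iii) with the exponential penalty $\varphi_g(t) = e^t - 1$, one admissible choice of penalty function (the actual PENNON penalties differ), on the toy problem min x² s.t. x ≥ 1, whose KKT solution is x* = 1 with multiplier u* = 2.

```python
import math

def inner_min(u, p, x):
    # (i) minimize F(x) = x^2 + u*p*(exp((1-x)/p) - 1) by Newton on F'(x) = 0;
    # here g(x) = 1 - x <= 0 encodes the constraint x >= 1
    for _ in range(200):
        t = min((1 - x)/p, 50.0)          # clamp the exponent for safety
        g = 2*x - u*math.exp(t)           # F'(x)
        h = 2 + (u/p)*math.exp(t)         # F''(x) > 0
        x -= g/h
        if abs(g) < 1e-12:
            break
    return x

u, p, x = 1.0, 1.0, 0.0
for _ in range(30):
    x = inner_min(u, p, x)                # (i) approximate minimizer of F
    u = u*math.exp(min((1 - x)/p, 50.0))  # (ii) u <- u * phi_g'(g(x)/p)
    p = 0.7*p                             # (iii) decrease the penalty parameter
print(round(x, 4), round(u, 4))           # approaches the KKT point x* = 1, u* = 2
```

Note that the multiplier update in step (ii) is exactly the derivative formula from the slide; the penalty function and the update factor 0.7 are illustrative assumptions.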
6 Interfaces
How to enter the data, i.e., the functions and their derivatives?
- MATLAB interface
- AMPL interface
- C/Fortran interface
Key point: matrix variables are treated as vectors.
7 What's new
PENNON is being implemented in the NAG (The Numerical Algorithms Group) library. The first routines should appear in the NAG Fortran Library, Mark 24 (Autumn 2013).
By-product: PENLAB, a free, open, fully functional version of PENNON coded in MATLAB.
8 PENLAB
PENLAB: a free, open, fully functional version of PENNON coded in MATLAB.
- Open source, all in MATLAB (one MEX function)
- The basic algorithm is identical
- Some data handling routines not (yet?) implemented
- PENLAB runs just as PENNON but is slower
Pre-programmed procedures for
- standard NLP (with AMPL input!)
- linear SDP (reading SDPA input files)
- bilinear SDP (= BMI)
- SDP with polynomial MI (PMI)
- easy to add more (QP, robust QP, SOF, TTO, ...)
9 PENLAB
The problem:

$$\begin{aligned}
\min_{x \in \mathbb{R}^n,\; Y_1 \in \mathbb{S}^{p_1}, \ldots, Y_k \in \mathbb{S}^{p_k}} \quad & f(x, Y) \\
\text{subject to} \quad & g_i(x, Y) \le 0, \quad i = 1, \ldots, m_g \\
& h_i(x, Y) = 0, \quad i = 1, \ldots, m_h \\
& \mathcal{A}_i(x, Y) \preceq 0, \quad i = 1, \ldots, m_{\mathcal{A}} \\
& \underline{\lambda}_i I \preceq Y_i \preceq \overline{\lambda}_i I, \quad i = 1, \ldots, k
\end{aligned} \tag{NLP-SDP}$$

$\mathcal{A}_i(x, Y)$ ... nonlinear matrix operators
10 PENLAB
Solving a problem: prepare a structure penm containing basic problem data, then

>> prob = penlab(penm);   % MATLAB class containing all data
>> solve(prob);           % results stored in class prob

The user has to provide MATLAB functions for function values, gradients and Hessians (for nonlinear functions) of all f, g, A.
11 Structure penm and f/g/h functions
Example:

$$\min\ x_1 + x_2 \quad \text{s.t.} \quad x_1^2 + x_2^2 \le 1,\ \ x_1 \ge -0.5$$

penm = [];
penm.nx = 2;
penm.lbx = [-0.5; -Inf];
penm.ngnln = 1;
penm.ubg = [1];
penm.objfun  = @(x,Y) deal(x(1) + x(2));
penm.objgrad = @(x,Y) deal([1; 1]);
penm.confun  = @(x,Y) deal([x(1)^2 + x(2)^2]);
penm.congrad = @(x,Y) deal([2*x(1); 2*x(2)]);
penm.conhess = @(x,Y,k) deal([2 0; 0 2]);
% set starting point
penm.xinit = [2,1];
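As a quick sanity check of this toy problem (independent of PENLAB), a brute-force grid search in Python over the feasible set confirms that the optimum is x* = (-0.5, -sqrt(3)/2), with objective value -0.5 - sqrt(3)/2:

```python
# Brute-force check of: min x1 + x2  s.t.  x1^2 + x2^2 <= 1, x1 >= -0.5.
import math

best = (float('inf'), None)
n = 801                                # 801 x 801 grid over [-1, 1]^2
for i in range(n):
    x1 = -1 + 2*i/(n - 1)
    for j in range(n):
        x2 = -1 + 2*j/(n - 1)
        if x1*x1 + x2*x2 <= 1 and x1 >= -0.5:   # feasibility
            f = x1 + x2
            if f < best[0]:
                best = (f, (x1, x2))
print(best)   # objective close to -0.5 - sqrt(3)/2, point close to (-0.5, -0.866)
```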
12 Toy NLP-SDP example

$$\min_{x \in \mathbb{R}^2}\ -\tfrac{1}{2}(x_1^2 + x_2^2) \quad \text{subject to} \quad \mathcal{A}(x) := B + A_1 x_1 + A_2 x_2 \succeq 0$$

(D. Noll, 2007)
13 Structure penm and f/g/h functions

B = [1 -1 0; ; 0 0 1];
A{1} = [0 1 0; 1 0 0; 0 0 0];
A{2} = [0 0 0; 0 0 1; 0 1 0];
penm = [];
penm.nx = 2;
penm.nalin = 1;
penm.lba = zeros(1,1);
penm.objfun  = @(x,Y) deal(-.5*(x(1)^2 + x(2)^2));
penm.objgrad = @(x,Y) deal(-[x(1); x(2)]);
penm.objhess = @(x,Y) deal(-eye(2,2));
penm.mconfun  = @(x,Y,k) deal(B + A{1}*x(1) + A{2}*x(2));
penm.mcongrad = @(x,Y,k,i) deal(A{i});
14 Example: nearest correlation matrix
Find a nearest correlation matrix:

$$\begin{aligned}
\min_{X} \quad & \sum_{i,j=1}^{n} (X_{ij} - H_{ij})^2 \\
\text{subject to} \quad & X_{ii} = 1, \quad i = 1, \ldots, n \\
& X \succeq 0
\end{aligned} \tag{1}$$
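For intuition, the 2x2 instance of problem (1) can be solved by hand, and the closed form makes a small worked example (this shortcut exists only in the 2x2 case; the general problem needs a solver):

```python
# 2x2 case: X = [[1, t], [t, 1]] is positive semidefinite iff -1 <= t <= 1
# (its eigenvalues are 1 + t and 1 - t), so minimizing (t - H12)^2 over that
# interval just clamps the off-diagonal entry of H.
def nearest_corr_2x2(h12):
    t = max(-1.0, min(1.0, h12))
    return [[1.0, t], [t, 1.0]]

print(nearest_corr_2x2(1.3))   # -> [[1.0, 1.0], [1.0, 1.0]]
```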
15 Example: nearest correlation matrix
The condition number of the nearest correlation matrix must be bounded by $\kappa$. Using the transformation of the variable $X$, $X = z\widetilde{X}$ with $I \preceq \widetilde{X} \preceq \kappa I$, the new problem (writing $X$ for $\widetilde{X}$) is:

$$\begin{aligned}
\min_{z,\,X} \quad & \sum_{i,j=1}^{n} (z X_{ij} - H_{ij})^2 \\
\text{subject to} \quad & z X_{ii} = 1, \quad i = 1, \ldots, n \\
& I \preceq X \preceq \kappa I
\end{aligned} \tag{2}$$
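The reason the substitution enforces the bound: if $I \preceq X \preceq \kappa I$ then the eigenvalues of X lie in [1, kappa], and multiplying by z > 0 scales all eigenvalues equally, so cond(zX) = cond(X) <= kappa. A numeric check in Python on a hypothetical 2x2 example:

```python
import math

def eig2(a, b, c):
    # eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]
    m = (a + c)/2
    d = math.sqrt(((a - c)/2)**2 + b*b)
    return m - d, m + d

kappa = 3.5
a, b, c = 2.0, 0.5, 3.0        # I <= X <= kappa*I holds for this X
z = 0.25
lo, hi = eig2(a, b, c)         # eigenvalues of X, inside [1, kappa]
lo_z, hi_z = eig2(z*a, z*b, z*c)
print(hi/lo <= kappa, abs(hi/lo - hi_z/lo_z) < 1e-9)   # -> True True
```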
16 Structure penm and f/g/h functions

function [f, userdata] = objfun(x, Y, userdata)
YH = svec2(x(1).*Y{1} - userdata.H);
f = YH(:)'*YH(:);

function [df, userdata] = objgrad(x, Y, userdata)
YH = svec2(x(1).*Y{1} - userdata.H);
df(1) = sum(2*svec2(Y{1}).*YH);
df(2:length(YH)+1) = 2*x(1).*YH;

function [ddf, userdata] = objhess(x, Y, userdata)
YH = svec2(x(1).*Y{1} - userdata.H);
yy = svec2(Y{1});
n = length(yy);
ddf = zeros(n+1,n+1);
ddf(1,1) = 2*sum(yy.^2);
ddf(1,2:n+1) = 2.*(x(1).*yy + YH)';
ddf(2:n+1,1) = 2.*(x(1).*yy + YH);
for i=1:n, ddf(i+1,i+1) = 2*x(1)^2; end
17 NLP with AMPL input
Pre-programmed. All you need to do:

>> penm = nlp_define('datafiles/chain100.nl');
>> prob = penlab(penm);
>> prob.solve();
18 NLP with AMPL input
[Table comparing PENNON and PENLAB (solution time in seconds, iteration counts) by problem type on AMPL test problems: chain, pinene, channel, torsion, lane_emd, dirichlet, henon, minsurf, gasoil, duct, marine, steering, methanol; the numeric entries are not recoverable.]
19 Linear SDP with SDPA input
Pre-programmed. All you need to do:

>> sdpdata = readsdpa('datafiles/arch0.dat-s');
>> penm = sdp_define(sdpdata);
>> prob = penlab(penm);
>> prob.solve();
20 Bilinear matrix inequalities (BMI)
Pre-programmed. All you need to do:

>> bmidata = define_my_problem;   % matrices A, K, ...
>> penm = bmi_define(bmidata);
>> prob = penlab(penm);
>> prob.solve();

$$\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & c^T x \\
\text{s.t.} \quad & A_0^i + \sum_{k=1}^{n} x_k A_k^i + \sum_{k=1}^{n} \sum_{l=1}^{n} x_k x_l K_{kl}^i \preceq 0, \quad i = 1, \ldots, m
\end{aligned}$$
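To illustrate what a BMI constraint means at a given point, the following Python sketch (with small hypothetical 2x2 data, unrelated to any real control problem) assembles A(x) = A0 + sum_k x_k A_k + sum_{k,l} x_k x_l K_kl and checks feasibility, i.e. that the largest eigenvalue of A(x) is nonpositive:

```python
import math

def eigmax2(M):
    # largest eigenvalue of a symmetric 2x2 matrix
    a, b, c = M[0][0], M[0][1], M[1][1]
    return (a + c)/2 + math.sqrt(((a - c)/2)**2 + b*b)

def bmi_value(x, A0, A, K):
    # A(x) = A0 + sum_k x[k]*A[k] + sum_{k,l} x[k]*x[l]*K[k][l]
    n = len(x)
    M = [row[:] for row in A0]
    for k in range(n):                       # linear part
        for i in range(2):
            for j in range(2):
                M[i][j] += x[k]*A[k][i][j]
    for k in range(n):                       # bilinear part
        for l in range(n):
            for i in range(2):
                for j in range(2):
                    M[i][j] += x[k]*x[l]*K[k][l][i][j]
    return M

A0 = [[-2.0, 0.0], [0.0, -2.0]]
A  = [[[1.0, 0.0], [0.0, 0.0]], [[0.0, 1.0], [1.0, 0.0]]]
Z  = [[0.0, 0.0], [0.0, 0.0]]
K  = [[[[0.5, 0.0], [0.0, 0.5]], Z], [Z, Z]]
M = bmi_value([1.0, 0.5], A0, A, K)
print(eigmax2(M) <= 0)   # -> True: this x satisfies the BMI
```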
21 Polynomial matrix inequalities (PMI)
Pre-programmed. All you need to do:

>> load datafiles/example_pmidata;
>> penm = pmi_define(pmidata);
>> problem = penlab(penm);
>> problem.solve();

with

$$\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & \tfrac{1}{2} x^T H x + c^T x \\
\text{subject to} \quad & b_{\mathrm{low}} \le Bx \le b_{\mathrm{up}} \\
& \mathcal{A}_i(x) \preceq 0, \quad i = 1, \ldots, m
\end{aligned}$$

$$\mathcal{A}(x) = \sum_i x^{\kappa(i)} Q_i$$

where $\kappa(i)$ is a multi-index with possibly repeated entries and $x^{\kappa(i)}$ is a product of elements with indices in $\kappa(i)$.
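The multi-index notation is the only subtle point here; a two-line Python sketch (hypothetical values) shows how a repeated entry yields a higher power:

```python
# The monomial x^{kappa(i)}: a multi-index with repeated entries, e.g.
# kappa = (1, 1, 2), denotes the product x1*x1*x2 (indices are 1-based).
def monomial(x, kappa):
    p = 1.0
    for idx in kappa:
        p *= x[idx - 1]
    return p

print(monomial([2.0, 3.0], (1, 1, 2)))   # -> 12.0, i.e. x1^2 * x2
```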
22 Other pre-programmed modules
- Nearest correlation matrix
- Truss topology optimization (stability constraints)
- Static output feedback (input from COMPlib, formulated as PMI)
- Robust QP
23 Availability PENNON: Free time-limited academic version of the code available PENLAB: Free open MATLAB version available from NAG
24 What's missing?
SOCP (Second-Order Cone Programming): nonlinear, integrated in PENLAB (and PENNON).
Postdoctoral research position in Birmingham (sponsored by NAG):
- development of an NL-SOCP algorithm (compatible with the PENNON algorithm)
- implementation in PENLAB and PENNON
Alain Zemkoho, started April 2013.
25 Sensor network localization (Euclidean distance matrix completion, graph realization)
We have (in $\mathbb{R}^2$, or $\mathbb{R}^d$):
- $n$ points $a_i$: anchors with known location
- $m$ points $x_i$: sensors with unknown location
- known Euclidean distances between close points:
$$d_{ij} = \|x_i - x_j\|, \ (i,j) \in I_x, \qquad d_{kj} = \|a_k - x_j\|, \ (k,j) \in I_a$$

Goal: find the positions of the sensors!

Find $x_1, \ldots, x_m \in \mathbb{R}^2$ such that
$$\|x_i - x_j\|^2 = d_{ij}^2, \quad (i,j) \in I_x, \qquad \|a_k - x_j\|^2 = d_{kj}^2, \quad (k,j) \in I_a$$
26 Sensor network localization Example, 4 anchors, 36 sensors
27 Sensor network localization
Applications:
- wireless sensor network localization: habitat monitoring system on Great Duck Island, detecting volcano eruptions, industrial control in semiconductor manufacturing plants, structural health monitoring, military and civilian surveillance, moving object tracking, asset location
- molecule conformation
- ...
28 Sensor network localization
Solve the least-squares problem

$$\min_{x_1, \ldots, x_m} \ \sum_{(i,j) \in I_x} \Big(\|x_i - x_j\|^2 - d_{ij}^2\Big)^2 + \sum_{(k,j) \in I_a} \Big(\|a_k - x_j\|^2 - d_{kj}^2\Big)^2$$

to global minimum. This is an NP-hard problem.
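The structure of this least-squares objective can be seen with plain gradient descent on a tiny hypothetical instance (one sensor, three anchors in the plane). This is only a local method started near the answer; as the slide says, the full problem is NP-hard:

```python
# One sensor at unknown position, three anchors; minimize the sum of squared
# residuals of the squared distances by gradient descent (hypothetical data).
anchors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
x_true = (0.3, 0.4)
d2 = [(ax - x_true[0])**2 + (ay - x_true[1])**2 for ax, ay in anchors]

x = [0.5, 0.5]                    # starting guess
for _ in range(5000):
    g = [0.0, 0.0]
    for (ax, ay), dk2 in zip(anchors, d2):
        r = (x[0] - ax)**2 + (x[1] - ay)**2 - dk2   # squared-distance residual
        g[0] += 4*r*(x[0] - ax)   # gradient of r^2 w.r.t. x
        g[1] += 4*r*(x[1] - ay)
    x[0] -= 0.05*g[0]
    x[1] -= 0.05*g[1]
print(x)   # close to the true position (0.3, 0.4)
```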
29 SDP relaxation (P. Biswas and Y. Ye, '04)
Let $X = [x_1\ x_2\ \ldots\ x_n]$ be a $d \times n$ unknown matrix. Then

$$\|x_i - x_j\|^2 = (e_i - e_j)^T X^T X (e_i - e_j)$$
$$\|a_k - x_j\|^2 = (a_k; -e_j)^T \begin{bmatrix} I_d \\ X^T \end{bmatrix} [\,I_d\ \ X\,]\, (a_k; -e_j)$$

and the problem becomes

$$(e_i - e_j)^T X^T X (e_i - e_j) = d_{ij}^2$$
$$(a_k; -e_j)^T Z (a_k; -e_j) = d_{kj}^2$$
$$Z = \begin{bmatrix} I_d & X \\ X^T & X^T X \end{bmatrix}$$
$$Z_{1:d,1:d} = I_d, \quad Z \succeq 0, \quad Z \text{ has rank } d$$
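The first lifting identity is easy to verify numerically. A small Python check (hypothetical 2x3 data) compares the direct squared distance with the quadratic form in the Gram matrix G = X^T X:

```python
# Check ||x_i - x_j||^2 = (e_i - e_j)^T X^T X (e_i - e_j) for d = 2, n = 3.
X = [[0.3, 1.0, -0.2],
     [0.4, 0.0,  0.7]]        # columns are the points x_1, x_2, x_3
i, j = 0, 2
direct = sum((X[r][i] - X[r][j])**2 for r in range(2))
# Gram matrix G = X^T X; the quadratic form with e_i - e_j picks out
# G_ii - 2 G_ij + G_jj
G = [[sum(X[r][a]*X[r][b] for r in range(2)) for b in range(3)]
     for a in range(3)]
lifted = G[i][i] - 2*G[i][j] + G[j][j]
print(abs(direct - lifted) < 1e-9)   # -> True
```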
30 SDP relaxation
Now relax

$$Z_{1:d,1:d} = I_d, \quad Z \succeq 0, \quad Z \text{ has rank } d$$

to

$$Z_{1:d,1:d} = I_d, \quad Z \succeq 0.$$

Relaxed problem:

$$\begin{aligned}
\min \quad & 0 \\
\text{subject to} \quad & (0;\, e_i - e_j)^T Z (0;\, e_i - e_j) = d_{ij}^2, \quad (i,j) \in I_x \\
& (a_k; -e_j)^T Z (a_k; -e_j) = d_{kj}^2, \quad (k,j) \in I_a \\
& Z_{1:d,1:d} = I_d \\
& Z \succeq 0
\end{aligned}$$

Full SDP relaxation, FSDP (linear SDP)
31 SDP relaxation
Equivalent formulation:

$$\begin{aligned}
\min \quad & \sum_{(i,j) \in I_x} \Big((0;\, e_i - e_j)^T Z (0;\, e_i - e_j) - d_{ij}^2\Big)^2 + \sum_{(k,j) \in I_a} \Big((a_k; -e_j)^T Z (a_k; -e_j) - d_{kj}^2\Big)^2 \\
\text{subject to} \quad & Z_{1:d,1:d} = I_d \\
& Z \succeq 0
\end{aligned}$$

Full SDP relaxation, FSDP (nonlinear SDP)
32 SDP relaxation
For larger problems, FSDP is not solvable numerically:
- many variables (number of sensors)
- large and full matrix constraint (although low-rank)
Can we exploit the sparsity of $I_x$ and $I_a$ at the relaxation modelling level?
Recently several approaches: Wolkowicz; Toh; Kojima; Su.
Simple, yet powerful way: edge-based relaxation (ESDP, Wang-Zheng-Ye-Boyd, '08).
33 Example, 16 anchors, 455 sensors
34 Example, 16 anchors, 455 sensors
35 Example, 16 anchors, 455 sensors
36 Example, 16 anchors, 455 sensors
[Table: rmsd and outlier counts (out-3, out-2) for the E-linear and E-quadratic relaxations; the numeric entries are not recoverable.]
SDP: 6714 variables, 5349 (4x4) LMIs
37 Solution refinement
Take the SDP solution as the initial approximation for the original unconstrained nonconvex problem. Solve both by PENNON.
38 Example, 16 anchors, 455 sensors
[Table: rmsd and outlier counts (out-3, out-2) for E-linear and for the original problem refined from the linear relaxation ("orig from lin"); the numeric entries are not recoverable.]
39 Example, 16 anchors, 455 sensors
[Table: rmsd and outlier counts (out-3, out-2) for E-linear and E-quadratic and for the original problem refined from each ("orig from lin", "orig from qua"); the "orig from qua" row reads 0 0 0, the remaining entries are not recoverable.]
40 Happy Birthday,
More informationTrust Region Problems with Linear Inequality Constraints: Exact SDP Relaxation, Global Optimality and Robust Optimization
Trust Region Problems with Linear Inequality Constraints: Exact SDP Relaxation, Global Optimality and Robust Optimization V. Jeyakumar and G. Y. Li Revised Version: September 11, 2013 Abstract The trust-region
More informationResearch Reports on Mathematical and Computing Sciences
ISSN 1342-284 Research Reports on Mathematical and Computing Sciences Exploiting Sparsity in Linear and Nonlinear Matrix Inequalities via Positive Semidefinite Matrix Completion Sunyoung Kim, Masakazu
More informationA General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones
Research Reports on Mathematical and Computing Sciences Series B : Operations Research Department of Mathematical and Computing Sciences Tokyo Institute of Technology 2-12-1 Oh-Okayama, Meguro-ku, Tokyo
More informationHomework 5. Convex Optimization /36-725
Homework 5 Convex Optimization 10-725/36-725 Due Tuesday November 22 at 5:30pm submitted to Christoph Dann in Gates 8013 (Remember to a submit separate writeup for each problem, with your name at the top)
More informationLecture 6: Conic Optimization September 8
IE 598: Big Data Optimization Fall 2016 Lecture 6: Conic Optimization September 8 Lecturer: Niao He Scriber: Juan Xu Overview In this lecture, we finish up our previous discussion on optimality conditions
More informationLMI solvers. Some notes on standard. Laboratoire d Analyse et d Architecture des Systèmes. Denis Arzelier - Dimitri Peaucelle - Didier Henrion
Some notes on standard LMI solvers Denis Arzelier - Dimitri Peaucelle - Didier Henrion Laboratoire d Analyse et d Architecture des Systèmes LAAS-CNRS 5 L = h l 1 l i Motivation State-feedback stabilization:
More informationAcyclic Semidefinite Approximations of Quadratically Constrained Quadratic Programs
2015 American Control Conference Palmer House Hilton July 1-3, 2015. Chicago, IL, USA Acyclic Semidefinite Approximations of Quadratically Constrained Quadratic Programs Raphael Louca and Eilyan Bitar
More informationRobust and Optimal Control, Spring 2015
Robust and Optimal Control, Spring 2015 Instructor: Prof. Masayuki Fujita (S5-303B) G. Sum of Squares (SOS) G.1 SOS Program: SOS/PSD and SDP G.2 Duality, valid ineqalities and Cone G.3 Feasibility/Optimization
More informationOn Conic QPCCs, Conic QCQPs and Completely Positive Programs
Noname manuscript No. (will be inserted by the editor) On Conic QPCCs, Conic QCQPs and Completely Positive Programs Lijie Bai John E.Mitchell Jong-Shi Pang July 28, 2015 Received: date / Accepted: date
More informationDimension reduction for semidefinite programming
1 / 22 Dimension reduction for semidefinite programming Pablo A. Parrilo Laboratory for Information and Decision Systems Electrical Engineering and Computer Science Massachusetts Institute of Technology
More informationRelaxations and Randomized Methods for Nonconvex QCQPs
Relaxations and Randomized Methods for Nonconvex QCQPs Alexandre d Aspremont, Stephen Boyd EE392o, Stanford University Autumn, 2003 Introduction While some special classes of nonconvex problems can be
More informationAlvaro F. M. Azevedo A. Adão da Fonseca
SECOND-ORDER SHAPE OPTIMIZATION OF A STEEL BRIDGE Alvaro F. M. Azevedo A. Adão da Fonseca Faculty of Engineering University of Porto Portugal 16-18 March 1999 OPTI 99 Orlando - Florida - USA 1 PROBLEM
More informationConvexification of Mixed-Integer Quadratically Constrained Quadratic Programs
Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs Laura Galli 1 Adam N. Letchford 2 Lancaster, April 2011 1 DEIS, University of Bologna, Italy 2 Department of Management Science,
More informationFurther Relaxations of the SDP Approach to Sensor Network Localization
Further Relaxations of the SDP Approach to Sensor Network Localization Zizhuo Wang, Song Zheng, Stephen Boyd, and Yinyu Ye September 27, 2006; Revised May 2, 2007 Abstract Recently, a semi-definite programming
More informationNetwork Localization via Schatten Quasi-Norm Minimization
Network Localization via Schatten Quasi-Norm Minimization Anthony Man-Cho So Department of Systems Engineering & Engineering Management The Chinese University of Hong Kong (Joint Work with Senshan Ji Kam-Fung
More informationA direct formulation for sparse PCA using semidefinite programming
A direct formulation for sparse PCA using semidefinite programming A. d Aspremont, L. El Ghaoui, M. Jordan, G. Lanckriet ORFE, Princeton University & EECS, U.C. Berkeley A. d Aspremont, INFORMS, Denver,
More information