POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS


POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS
Sercan Yıldız (syildiz@samsi.info), in collaboration with Dávid Papp (NCSU)
OPT Transition Workshop, May 02, 2017

OUTLINE

- Polynomial optimization and sums of squares
- Sums-of-squares hierarchy
- Semidefinite representation of sums-of-squares constraints
- Interior-point methods and conic optimization
- Sums-of-squares optimization with interior-point methods
- Computational complexity
- Preliminary results

POLYNOMIAL OPTIMIZATION

Polynomial optimization problem:

    min_{z ∈ R^n} f(z)   s.t.   g_i(z) ≥ 0 for i = 1, ..., m,

where f, g_1, ..., g_m are n-variate polynomials.

Example:

    min_{z ∈ R^2} z_1^3 + 3 z_1^2 z_2 − 6 z_1 z_2^2 + 2 z_2^3   s.t.   z_1^2 + z_2^2 ≤ 1

Some applications: shape-constrained estimation, design of experiments, control theory, combinatorial optimization, computational geometry, optimal power flow.

[Slide figures omitted: thumbnails illustrating the applications, including a max-cut instance and the kissing-number problem; only image URLs survive in the transcript.]
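A quick numerical look at the example above (my addition, not from the slides): sampling the cubic objective over the unit disk in polar coordinates gives a cheap estimate of the minimum that the SOS machinery below will bound rigorously.

```python
import numpy as np

# Sample the unit disk in polar coordinates.
r = np.linspace(0.0, 1.0, 400)
theta = np.linspace(0.0, 2.0 * np.pi, 800)
R, T = np.meshgrid(r, theta)
z1, z2 = R * np.cos(T), R * np.sin(T)

# The example objective from the slide.
f = z1**3 + 3.0 * z1**2 * z2 - 6.0 * z1 * z2**2 + 2.0 * z2**3
print(f.min())   # a sampled upper estimate of the true minimum over the disk
```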

SUMS-OF-SQUARES RELAXATIONS

Let f be an n-variate degree-2d polynomial.

Unconstrained polynomial optimization: min_{z ∈ R^n} f(z). NP-hard already for d = 2!

Equivalent dual formulation:

    max_{y ∈ R} y   s.t.   f(z) − y ∈ P,   where P = {f : f(z) ≥ 0 for all z ∈ R^n}.

Sums-of-squares cone:

    SOS = {f : f = Σ_{j=1}^N f_j^2 for some degree-d polynomials f_j}.

SOS relaxation (valid because f ∈ SOS ⇒ f ∈ P):

    max_{y ∈ R} y   s.t.   f(z) − y ∈ SOS.

SUMS-OF-SQUARES RELAXATIONS

Let f, g_1, ..., g_m be n-variate polynomials.

Constrained polynomial optimization:

    min_{z ∈ R^n} f(z)   s.t.   g_i(z) ≥ 0 for i = 1, ..., m.

Feasible set: G = {z ∈ R^n : g_i(z) ≥ 0 for i = 1, ..., m}.

Dual formulation:

    max_{y ∈ R} y   s.t.   f(z) − y ∈ P_G,   where P_G = {f : f(z) ≥ 0 for all z ∈ G}.

Weighted SOS cone of order r:

    SOS_{G,r} = {f : f = Σ_{i=0}^m g_i s_i, s_i ∈ SOS, deg(g_i s_i) ≤ r},   where g_0 ≡ 1.

SOS relaxation of order r (valid because f ∈ SOS_{G,r} ⇒ f ∈ P_G):

    max_{y ∈ R} y   s.t.   f(z) − y ∈ SOS_{G,r}.

WHY DO WE LIKE SOS?

Polynomial optimization is NP-hard. Increasing r produces a hierarchy of SOS relaxations for polynomial optimization problems. Under mild assumptions:
- The lower bounds from SOS relaxations converge to the true optimal value as r → ∞ (follows from Putinar's Positivstellensatz).
- SOS relaxations can be represented as semidefinite programs of size O(m L^2), where L = (n + r/2 choose n) (follows from the results of Shor, Nesterov, Parrilo, and Lasserre).
- There are efficient and stable numerical methods for solving SDPs.

SOS CONSTRAINTS ARE SEMIDEFINITE REPRESENTABLE

Consider the cone of degree-2d SOS polynomials:

    SOS_{2d} = {f ∈ R[z]_{2d} : f = Σ_{j=1}^N f_j^2 for some f_j ∈ R[z]_d}.

Theorem (Nesterov, 2000). The univariate polynomial f(z) = Σ_{u=0}^{2d} f_u z^u is SOS iff there exists a (d+1) × (d+1) PSD matrix S such that

    f_u = Σ_{k+l=u} S_{kl},   u = 0, ..., 2d.
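To make the theorem concrete, here is a minimal feasibility sketch (my own illustration, not code from the talk) that tests univariate SOS membership with cvxpy; it assumes an SDP-capable solver such as SCS or Clarabel is installed.

```python
import cvxpy as cp

def is_sos(f_coeffs):
    """f_coeffs[u] = coefficient of z^u; length must be 2d + 1."""
    two_d = len(f_coeffs) - 1
    d = two_d // 2
    S = cp.Variable((d + 1, d + 1), PSD=True)
    # Coefficient matching: f_u equals the u-th anti-diagonal sum of S.
    constraints = [
        sum(S[k, u - k] for k in range(max(0, u - d), min(u, d) + 1)) == f_coeffs[u]
        for u in range(two_d + 1)
    ]
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve()
    return problem.status == cp.OPTIMAL

# (z^2 - 1)^2 = z^4 - 2 z^2 + 1 is a sum of squares:
print(is_sos([1.0, 0.0, -2.0, 0.0, 1.0]))   # expected: True
# z^2 - 1 takes negative values, so it cannot be SOS:
print(is_sos([-1.0, 0.0, 1.0]))             # expected: False
```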

SOS CONSTRAINTS ARE SEMIDEFINITE REPRESENTABLE

More generally, in the n-variate case: let L := dim(R[z]_d) = (n+d choose n) and U := dim(R[z]_{2d}) = (n+2d choose n), and fix bases {p_l}_{l=1}^L and {q_u}_{u=1}^U for the linear spaces R[z]_d and R[z]_{2d}.

Theorem (Nesterov, 2000). The polynomial f(z) = Σ_{u=1}^U f_u q_u(z) is SOS iff there exists an L × L PSD matrix S such that f = Λ*(S), where Λ : R^U → S^L is the linear map satisfying Λ([q_u]_{u=1}^U) = [p_l]_{l=1}^L ([p_l]_{l=1}^L)^T.

If univariate polynomials are represented in the monomial basis, Λ(x) is the Hankel matrix

    Λ(x) = [ x_0      x_1      ...  x_d
             x_1      x_2      ...  x_{d+1}
             ...      ...      ...  ...
             x_d      x_{d+1}  ...  x_{2d} ],

and its adjoint is  Λ*(S) = [ Σ_{k+l=u} S_{kl} ]_{u=0}^{2d}.

These results extend easily to the weighted case.
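The Hankel structure above is easy to verify numerically. A small numpy sketch (my illustration): build Λ(x) and Λ*(S) in the monomial basis and check the adjoint identity ⟨Λ(x), S⟩ = ⟨x, Λ*(S)⟩.

```python
import numpy as np

def Lam(x):
    """Univariate monomial basis: Lambda(x)[k, l] = x[k + l] (Hankel)."""
    d = (len(x) - 1) // 2
    return np.array([[x[k + l] for l in range(d + 1)] for k in range(d + 1)])

def Lam_star(S):
    """Adjoint map: (Lambda*(S))[u] = sum of entries with k + l = u."""
    d = S.shape[0] - 1
    return np.array([np.trace(np.fliplr(S), offset=d - u) for u in range(2 * d + 1)])

rng = np.random.default_rng(0)
d = 3
x = rng.standard_normal(2 * d + 1)
S = rng.standard_normal((d + 1, d + 1))
# Trace inner product on one side, Euclidean inner product on the other:
print(np.isclose(np.sum(Lam(x) * S), x @ Lam_star(S)))   # expected: True
```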

SOS CONSTRAINTS ARE SEMIDEFINITE REPRESENTABLE

To keep things simple, we focus on optimization over a single SOS cone.

SOS problem:

    max_{y ∈ R^k, S ∈ S^L}  b^T y   s.t.   A^T y + Λ*(S) = c,  S ⪰ 0,

equivalently,

    max_{y ∈ R^k}  b^T y   s.t.   A^T y + s = c,   s ∈ SOS := {s ∈ R^U : s = Λ*(S) for some S ⪰ 0}.

Moment problem:

    min_{x ∈ R^U}  c^T x   s.t.   Ax = b,  Λ(x) ⪰ 0,

i.e., x ∈ SOS* := {x ∈ R^U : Λ(x) ⪰ 0}.

Disadvantages of existing approaches:
- Problem: the SDP representation roughly squares the number of variables. Solution: we solve the SOS and moment problems in their original space.
- Problem: standard basis choices lead to ill-conditioning. Solution: we use orthogonal polynomials and interpolation bases.
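Combining the last two slides, here is a compact cvxpy sketch (again my illustration, not the talk's implementation; assumes an SDP-capable solver) of the unconstrained univariate SOS problem max y s.t. f − y ∈ SOS, written through the representation s = Λ*(S):

```python
import numpy as np
import cvxpy as cp

f = np.array([2.0, -4.0, 0.0, 0.0, 1.0])   # f(z) = z^4 - 4z + 2, min value -1 at z = 1
d = (len(f) - 1) // 2
y = cp.Variable()
S = cp.Variable((d + 1, d + 1), PSD=True)
# (f - y)_u = sum_{k+l=u} S_kl; subtracting y only shifts the constant term.
constraints = [
    sum(S[k, u - k] for k in range(max(0, u - d), min(u, d) + 1))
    == (f[u] - y if u == 0 else f[u])
    for u in range(2 * d + 1)
]
cp.Problem(cp.Maximize(y), constraints).solve()
print(y.value)   # expected: close to -1 (univariate nonnegativity equals SOS)
```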

INTERIOR-POINT METHODS FOR CONIC PROGRAMMING

Primal-dual pair of conic programs:

    min c^T x  s.t.  Ax = b,  x ∈ K        max b^T y  s.t.  A^T y + s = c,  s ∈ K*

K: closed, convex, pointed cone with nonempty interior. Examples: K = R^n_+, S^n_+, SOS, P.

Interior-point methods make use of self-concordant barriers (SCBs). Examples:
- K = R^n_+:  F(x) = −Σ_i log x_i
- K = S^n_+:  F(X) = −log det X
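For concreteness, a numpy sketch (mine) of these two example barriers and the derivatives an IPM actually evaluates at each iterate:

```python
import numpy as np

def barrier_orthant(x):
    """K = R^n_+ : F(x) = -sum log x_i; grad = -1/x; Hessian = diag(1/x^2)."""
    return -np.sum(np.log(x)), -1.0 / x, np.diag(1.0 / x**2)

def barrier_psd(X):
    """K = S^n_+ : F(X) = -log det X; grad = -X^{-1};
    the Hessian maps a direction H to X^{-1} H X^{-1}."""
    Xinv = np.linalg.inv(X)
    return -np.linalg.slogdet(X)[1], -Xinv, (lambda H: Xinv @ H @ Xinv)

value, grad, hess = barrier_orthant(np.array([1.0, 2.0, 4.0]))
print(value)   # -(log 1 + log 2 + log 4) = -log 8
```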

INTERIOR-POINT METHODS FOR CONIC PROGRAMMING

Given SCBs F and F* (for K and K*), IPMs converge to the optimal solution by solving a sequence of equality-constrained barrier problems as µ → 0:

    Primal:  min c^T x + µ F(x)   s.t.  Ax = b
    Dual:    max b^T y − µ F*(s)  s.t.  A^T y + s = c

Primal IPMs solve only the primal barrier problem and are not considered practical. Primal-dual IPMs solve both the primal and dual barrier problems simultaneously and are preferred in practice.
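As a sketch of what "solving the barrier problem" means in practice (the standard textbook construction, my assumption that it matches the talk's setting), one Newton step for the primal barrier problem linearizes at a feasible x and solves a KKT system:

```python
import numpy as np

def newton_step(x, A, b, c, mu, grad_F, hess_F):
    """One Newton step for  min c'x + mu*F(x)  s.t.  Ax = b, solving
       [ mu*H  A' ] [dx]   [ -(c + mu*g) ]
       [ A     0  ] [ w ] = [  b - A x   ]."""
    n, m = len(x), len(b)
    H, g = hess_F(x), grad_F(x)
    KKT = np.block([[mu * H, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-(c + mu * g), b - A @ x])
    return np.linalg.solve(KKT, rhs)[:n]   # a line search keeps x + t*dx in int(K)

# Example with K = R^2_+ and its log barrier:
A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
dx = newton_step(np.array([0.5, 0.5]), A, b, c, mu=0.1,
                 grad_F=lambda x: -1.0 / x, hess_F=lambda x: np.diag(1.0 / x**2))
print(dx)
```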

INTERIOR-POINT METHODS FOR CONIC PROGRAMMING

In principle, any closed convex cone admits an SCB (Nesterov and Nemirovski, 1994). However, the success of the IPM approach depends on the availability of an SCB whose gradient and Hessian can be computed efficiently.
- For the cone P, there are complexity-based reasons for suspecting that there is no computationally tractable SCB.
- For the cone SOS, there is evidence suggesting that there may not exist any tractable SCB.
- On the other hand, the cone SOS* inherits a tractable SCB from the PSD cone: F(x) = −log det Λ(x).

INTERIOR-POINT METHOD OF SKAJAA AND YE

Until recently:
- The practical success of primal-dual IPMs had been limited to optimization over symmetric cones: LP, SOCP, SDP.
- Existing primal-dual IPMs for non-symmetric conic programs required both the primal and dual SCBs (e.g., Nesterov, Todd, and Ye, 1999; Nesterov, 2012).

Skajaa and Ye (2015) proposed a primal-dual IPM for non-symmetric conic programs which
- requires an SCB for only the primal cone,
- achieves the best-known iteration complexity.

SOLVING SOS PROGRAMS WITH INTERIOR-POINT METHODS

Using Skajaa and Ye's IPM with the SCB for SOS*, the SOS and moment problems can be solved without recourse to SDP. For any ε ∈ (0, 1), the algorithm finds a primal-dual solution that has ε times the duality gap of an initial solution in O(√L log(1/ε)) iterations, where L = dim(R[z]_d).

Each iteration of the IPM requires
- the computation of the Hessian of the SCB for SOS*,
- the solution of a Newton system.

Solving the Newton system requires O(U^3) operations, where U = dim(R[z]_{2d}).
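In equation form, for the barrier F(x) = −log det Λ(x), the derivatives are grad_u = −trace(Λ(x)^{-1} Λ_u) and Hess_{uv} = trace(Λ(x)^{-1} Λ_u Λ(x)^{-1} Λ_v), with Λ_u := Λ(e_u). A generic-basis sketch I wrote (this is the O(L^2 U^2) computation discussed on the next slide):

```python
import numpy as np

def barrier_derivatives(x, Lam_mats):
    """Lam_mats[u] = Lambda(e_u), each L x L; Lambda(x) = sum_u x_u Lam_mats[u]."""
    U = len(Lam_mats)
    Lam_x = sum(x[u] * Lam_mats[u] for u in range(U))
    Linv = np.linalg.inv(Lam_x)
    W = [Linv @ Lam_mats[u] for u in range(U)]            # U products, L x L each
    grad = -np.array([np.trace(W[u]) for u in range(U)])
    # trace(W_u W_v) via an elementwise product: O(L^2) per entry, O(L^2 U^2) total.
    hess = np.array([[np.sum(W[u] * W[v].T) for v in range(U)] for u in range(U)])
    return grad, hess
```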

SOLVING SOS PROGRAMS WITH INTERIOR-POINT METHODS

The choice of bases {p_l}_{l=1}^L and {q_u}_{u=1}^U for R[z]_d and R[z]_{2d} has a significant effect on how efficiently the Newton system can be assembled.
- In general, computing the Hessian requires O(L^2 U^2) operations.
- If both bases are chosen to be monomial bases, the Hessian can be computed faster, but this requires specialized methods such as the FFT and the inversion of Hankel-like matrices.

Following Löfberg and Parrilo (2004), we choose
- {q_u}_{u=1}^U: Lagrange interpolating polynomials,
- {p_l}_{l=1}^L: orthogonal polynomials.

With this choice, the Hessian can be computed in O(L U^2) operations; see the sketch below.
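Here is the trick in code form (a univariate sketch under my assumptions, using Chebyshev points and the Chebyshev basis): with a Lagrange basis {q_u} at nodes t_1, ..., t_U, Λ(x) = P^T diag(x) P where P[u, l] = p_l(t_u), so each Λ_u = p(t_u) p(t_u)^T is rank one and the Hessian collapses to the elementwise square of M = P Λ(x)^{-1} P^T.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

d = 5
L, U = d + 1, 2 * d + 1
nodes = np.cos(np.pi * np.arange(U) / (U - 1))   # U Chebyshev-Lobatto points
P = C.chebvander(nodes, d)                       # P[u, l] = T_l(t_u), U x L

x = np.ones(U)                                   # interior point: Lambda(1) = P'P is PD
Lam_x = P.T @ (x[:, None] * P)                   # Lambda(x) = P' diag(x) P, L x L
M = P @ np.linalg.solve(Lam_x, P.T)              # one L x L solve, U right-hand sides
hess = M**2                                      # Hess_{uv} = (p(t_u)' Lam^{-1} p(t_v))^2
grad = -np.diag(M)                               # grad_u = -p(t_u)' Lam^{-1} p(t_u)
print(hess.shape)                                # (U, U), assembled in O(L U^2) time
```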

SOLVING SOS PROGRAMS WITH INTERIOR-POINT METHODS

Putting everything together:
- The algorithm runs in O(√L log(1/ε)) iterations.
- At each iteration, computing the Hessian requires O(L U^2) operations and solving the Newton system requires O(U^3) operations.
- Overall complexity: O(U^3 √L log(1/ε)) operations.

This matches the best-known complexity bounds for LP! In contrast, solving the SDP formulation with a primal-dual IPM requires O(L^6.5 log(1/ε)) operations. For fixed n: L^2/U = Θ(d^n).

SOLVING SOS PROGRAMS WITH INTERIOR-POINT METHODS

The conditioning of the moment problem is directly related to the conditioning of the interpolation problem with {p_k p_l}_{k,l=1}^L. Good interpolation nodes are well understood only in a few low-dimensional domains:
- Chebyshev points in [−1, 1],
- Padua points in [−1, 1]^2 (Caliari et al., 2005).

For problems in higher dimensions, we follow a heuristic approach to choose interpolation nodes. (A small sketch of both classical node families follows.)
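A node-generation sketch (my reconstruction; in particular, the explicit Padua grid below follows the common interlacing Chebyshev-Lobatto subgrid description and should be checked against Caliari et al., 2005, before serious use):

```python
import numpy as np

def chebyshev_points(k):
    """k + 1 Chebyshev-Lobatto points in [-1, 1]."""
    return np.cos(np.pi * np.arange(k + 1) / k)

def padua_points(n):
    """Padua points of degree n in [-1, 1]^2: the pairs
    (cos(j*pi/n), cos(k*pi/(n+1))) with 0 <= j <= n, 0 <= k <= n+1, j + k even."""
    pts = [(np.cos(j * np.pi / n), np.cos(k * np.pi / (n + 1)))
           for j in range(n + 1) for k in range(n + 2) if (j + k) % 2 == 0]
    return np.array(pts)

print(len(padua_points(4)))   # (n+1)(n+2)/2 = 15 points for n = 4
```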

A TOY EXAMPLE

Minimizing the Rosenbrock function on the square:

    min_{z ∈ R^2} (z_1 − 1)^2 + 100 (z_2 − z_1^2)^2   s.t.   z ∈ [−1, 1]^2.

We can obtain a lower bound on the optimal value of this problem from the moment relaxation of order r. For r = 60:
- The moment relaxation has 1891 variables.
- The SDP representation requires one 496 × 496 and two 494 × 494 matrix inequalities.
- SeDuMi quits with numerical errors after 948 seconds (primal infeasibility: 1.2 × 10^−4).
- Our implementation solves the moment relaxation in 482 seconds.
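As a sanity check I added (not part of the talk), brute-force sampling bounds the true optimal value from above; any valid moment-relaxation lower bound must lie below the value found here:

```python
import numpy as np

# Evaluate the Rosenbrock objective on a fine grid over [-1, 1]^2.
g = np.linspace(-1.0, 1.0, 2001)
Z1, Z2 = np.meshgrid(g, g)
F = (Z1 - 1.0)**2 + 100.0 * (Z2 - Z1**2)**2
i = np.unravel_index(np.argmin(F), F.shape)
print(F[i], (Z1[i], Z2[i]))   # approx 0.0, attained at (1.0, 1.0)
```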

FINAL REMARKS

We have shown that SOS programs can be solved using a primal-dual IPM in O(U^3 √L log(1/ε)) operations, where L = (n+d choose n) and U = (n+2d choose n). This improves upon the standard SDP-based approach, which requires O(L^6.5 log(1/ε)) operations.

In progress: numerical stability. For multivariate problems, the conditioning of the problem formulation becomes important. How do we choose interpolation nodes to obtain better-conditioned problems?

Thank you!

REFERENCES

M. Caliari, S. De Marchi, and M. Vianello. Bivariate polynomial interpolation on the square at new nodal sets. Applied Mathematics and Computation, 165:261-274, 2005.

J. Löfberg and P. A. Parrilo. From coefficients to samples: a new approach to SOS optimization. In 43rd IEEE Conference on Decision and Control, volume 3, pages 3154-3159. IEEE, 2004.

Yu. Nesterov. Squared functional systems and optimization problems. In H. Frenk, K. Roos, T. Terlaky, and S. Zhang, editors, High Performance Optimization, volume 33 of Applied Optimization, chapter 17, pages 405-440. Springer, Boston, MA, 2000.

Yu. Nesterov. Towards non-symmetric conic optimization. Optimization Methods & Software, 27(4-5):893-917, 2012.

REFERENCES

Yu. Nesterov and A. Nemirovski. Interior-Point Polynomial Algorithms in Convex Programming, volume 13 of SIAM Studies in Applied Mathematics. Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA, 1994.

Yu. Nesterov, M. J. Todd, and Y. Ye. Infeasible-start primal-dual methods and infeasibility detectors for nonlinear programming problems. Mathematical Programming, 84:227-267, 1999.

A. Skajaa and Y. Ye. A homogeneous interior-point algorithm for nonsymmetric convex conic optimization. Mathematical Programming Ser. A, 150:391-422, 2015.