Fast ADMM for Sum of Squares Programs Using Partial Orthogonality


Fast ADMM for Sum of Squares Programs Using Partial Orthogonality Antonis Papachristodoulou Department of Engineering Science University of Oxford www.eng.ox.ac.uk/control/sysos antonis@eng.ox.ac.uk with Yang Zheng (U Oxford) and Giovanni Fantuzzi (Imperial CL)

OUTLINE 1 Introduction 2 SOS programs, SDPs and partial orthogonality 3 A fast ADMM algorithm 4 Numerical examples 5 Conclusion


Polynomial Optimisation
Is p(x) = Σ_α p_α x^α ≥ 0?
Unconstrained polynomial optimisation:
  min_{x ∈ ℝⁿ} p(x)  ⇔  max γ  s.t.  p(x) − γ ≥ 0 for all x ∈ ℝⁿ
Other areas: graph partition, matrix copositivity test, etc.

Analysis in Dynamical Systems
Lyapunov: ẋ = f(x), f(0) = 0, f Lipschitz, x(t) ∈ D ⊆ ℝⁿ, 0 ∈ D.
If V : D → ℝ is continuously differentiable with V(0) = 0, V(x) > 0 in D∖{0}, and
  V̇(x) = (∂V/∂x) f(x) < 0 in D∖{0},
then x = 0 is asymptotically stable.
Checking whether p(x) ≥ 0 is NP-hard when deg(p) ≥ 4, p ∈ ℝ[x].

Positive Polynomials and Sums of Squares
Shor: if p(x) is a sum of squares (SOS), i.e. p(x) = Σ_{i=1}^m q_i(x)², then p(x) ≥ 0.
Checking p(x) ≥ 0 is NP-hard when deg(p) ≥ 4, but checking whether p(x) is SOS has worst-case polynomial time complexity: it can be solved using semidefinite programming (SDP) (Parrilo), which can be set up and solved using SOSTOOLS: www.eng.ox.ac.uk/control/sostools
Can also search for unknown coefficients of p(x) so that p(x) is SOS.
Papachristodoulou, Anderson, Valmorbida, Prajna, Seiler, Parrilo, 2013

SOSTOOLS
Formulates and solves the equivalent semidefinite programme (SDP):
SOS program → SOSTOOLS → SDP → SDP solver (SeDuMi, SDPT3, CSDP, SDPNAL, SDPA) → SDP solution → SOSTOOLS → SOSP solution
www.eng.ox.ac.uk/control/sostools

Van der Pol Oscillator
  ẋ₁ = −x₂
  ẋ₂ = x₁ − (1 − x₁²) x₂

Sums of Squares and Semidefinite Programming
  p(x) = Σ_{i=1}^m q_i(x)² = v_d(x)ᵀ X v_d(x)
Proposition: p is SOS ⇔ X ⪰ 0, where v_d(x) contains the monomials of degree ≤ d = deg(p)/2.
Example: p(x₁, x₂) = 5x₁⁴ + 2x₂⁴ + 2x₁³x₂ + 2x₁x₂³ − x₁²x₂²
  p(x₁, x₂) = [x₁²  x₂²  x₁x₂] [q₁₁ q₁₂ q₁₃; q₁₂ q₂₂ q₂₃; q₁₃ q₂₃ q₃₃] [x₁²; x₂²; x₁x₂]
Expand out and match coefficients:
  q₁₁ = 5,  q₂₂ = 2,  q₃₃ + 2q₁₂ = −1,  q₁₃ = 1,  q₂₃ = 1
Require that X ⪰ 0.
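The coefficient matching can be checked numerically. The sketch below assumes the example polynomial as reconstructed here, p(x₁,x₂) = 5x₁⁴ + 2x₂⁴ + 2x₁³x₂ + 2x₁x₂³ − x₁²x₂², together with one feasible (not unique) choice of the free Gram entries, q₁₂ = −2 and q₃₃ = 3, which satisfies q₃₃ + 2q₁₂ = −1:

```python
import numpy as np

# Candidate Gram matrix X for the basis v(x) = [x1^2, x2^2, x1*x2].
# The entries q12 = -2, q33 = 3 are one feasible choice (an assumption
# for illustration), not the unique solution.
X = np.array([[ 5.0, -2.0, 1.0],
              [-2.0,  2.0, 1.0],
              [ 1.0,  1.0, 3.0]])

def p(x1, x2):
    return 5*x1**4 + 2*x2**4 + 2*x1**3*x2 + 2*x1*x2**3 - x1**2*x2**2

def gram_form(x1, x2):
    v = np.array([x1**2, x2**2, x1*x2])
    return v @ X @ v

# X is positive semidefinite, so p is certified to be a sum of squares.
assert np.min(np.linalg.eigvalsh(X)) >= -1e-12

# v(x)' X v(x) reproduces p(x) at random sample points.
rng = np.random.default_rng(0)
for x1, x2 in rng.standard_normal((5, 2)):
    assert abs(p(x1, x2) - gram_form(x1, x2)) < 1e-9
```

Because the Gram representation is not unique, the SDP searches over all symmetric X satisfying the matching conditions for one that is positive semidefinite.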

What restricts SOS methods? Computation.
  p(x) = Σ_{i=1}^m q_i(x)² = v_d(x)ᵀ X v_d(x), x ∈ ℝⁿ
  v_d(x) = { x^α : |α| ≤ d } = [1, x₁, …, x_n, x₁², x₁x₂, …, x_nᵈ]ᵀ, of size (n+d choose d).
Remedies:
- Reduce the size of v_d(x): Newton polytope, diagonal inconsistency, symmetry properties, and facial reduction.
- Alternative formulations (DSOS/SDSOS), leading to linear programs (LPs) or second-order cone programs (SOCPs).
- Solve the SDPs by more scalable first-order methods (FOMs), at the cost of reduced accuracy.
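The combinatorial growth of the monomial basis is easy to quantify: v_d(x) has C(n+d, d) entries, so the Gram matrix X has C(n+d, d)² entries. A quick check for quartic certificates (d = 2):

```python
from math import comb

# Size of the monomial basis v_d(x): C(n + d, d) monomials of degree <= d
# in n variables; the Gram matrix X then has basis_size(n, d)^2 entries.
def basis_size(n, d):
    return comb(n + d, d)

# Growth for quartic SOS certificates (d = 2) as n increases.
for n in (10, 20, 50):
    print(n, basis_size(n, 2))  # 10 -> 66, 20 -> 231, 50 -> 1326
```

Already at n = 50 the Gram matrix has over 1.7 million entries, which is why interior-point solvers run out of memory on modest SOS programs.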

OUTLINE 1 Introduction 2 SOS programs, SDPs and partial orthogonality 3 A fast ADMM algorithm 4 Numerical examples 5 Conclusion

Formalism
  s(x) = g₀(x) − Σ_{i=1}^t u_i g_i(x) is SOS
  iff s(x) = v_d(x)ᵀ X v_d(x) = ⟨X, v_d(x) v_d(x)ᵀ⟩, X ⪰ 0.
Indicator matrices: (B_α)_{β,γ} = 1 if β + γ = α, and 0 otherwise.
Then v_d(x) v_d(x)ᵀ = Σ_α B_α x^α, and therefore
  Σ_α ( g₀,α − Σ_{i=1}^t u_i g_i,α ) x^α = Σ_α ⟨B_α, X⟩ x^α.
Coefficient matching conditions: g₀,α − Σ_{i=1}^t u_i g_i,α = ⟨B_α, X⟩ for all α.

Our strategy
  min_u wᵀu  s.t.  s(x) = g₀(x) − Σ_{i=1}^t u_i g_i(x),  s ∈ Σ[x]
⇔ min_{u,X} wᵀu  s.t.  ⟨B_α, X⟩ + Σ_{i=1}^t u_i g_i,α = g₀,α for all α,  X ⪰ 0.
Challenge: the size of the SDP. Key observation: partial orthogonality in the resulting semidefinite programme. Approach: a fast first-order (ADMM) algorithm that exploits the partial orthogonality and trades accuracy for scalability. Implemented in CDCS-sos, downloadable.

Standard SDP formulation
Notation:
  A₁ = [g_{1,1} ⋯ g_{t,1}; ⋮ ⋱ ⋮; g_{1,m} ⋯ g_{t,m}],  A₂ = [vec(B₁)ᵀ; ⋮; vec(B_m)ᵀ]
  A = [A₁ A₂] ∈ ℝ^{m×(t+N)},  b = [g_{0,1}, …, g_{0,m}]ᵀ ∈ ℝᵐ
  c = [wᵀ, 0, …, 0]ᵀ ∈ ℝ^{t+N},  ξ = [uᵀ, vec(X)ᵀ]ᵀ ∈ ℝ^{t+N},  K = ℝᵗ × S₊
Standard SDP formulation:
  min_ξ cᵀξ  s.t.  Aξ = b,  ξ ∈ K

Standard SDP formulation
  min_ξ cᵀξ  s.t.  Aξ = b,  ξ ∈ K,  with A = [A₁ A₂],
  A₁ = [g_{1,1} ⋯ g_{t,1}; ⋮; g_{1,m} ⋯ g_{t,m}],  A₂ = [vec(B₁)ᵀ; ⋮; vec(B_m)ᵀ].
The density of nonzero elements in A₂ is O(1/nᵈ).
A₂A₂ᵀ is diagonal.
The m × m matrix AAᵀ is of "diagonal plus low rank" form.

Partial Orthogonality
From the coefficient matching conditions g₀,α − Σ_{i=1}^t u_i g_i,α = ⟨B_α, X⟩:
  A₁ = [g_{1,1} ⋯ g_{t,1}; ⋮; g_{1,m} ⋯ g_{t,m}],  A₂ = [vec(B₁)ᵀ; ⋮; vec(B_m)ᵀ],
so AAᵀ = A₁A₁ᵀ + A₂A₂ᵀ, where A₁A₁ᵀ has rank at most t and A₂A₂ᵀ is diagonal.
1. D. Bertsimas, R. M. Freund, and X. A. Sun, "An accelerated first-order method for solving SOS relaxations of unconstrained polynomial optimization problems," Optimization Methods and Software, vol. 28, no. 3, pp. 424-441, 2013.
2. D. Henrion and J. Malick, "Projection methods in conic optimization," in Handbook on Semidefinite, Conic and Polynomial Optimization. Springer, 2012, pp. 565-600.
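The orthogonality of the rows of A₂ follows directly from the definition of the indicator matrices: each entry (β, γ) of v_d(x)v_d(x)ᵀ contributes to exactly one monomial x^{β+γ}, so distinct vec(B_α) never share a nonzero position. A small numpy check for n = 2, d = 1 (sizes chosen purely for illustration):

```python
import numpy as np
from itertools import product

# Monomial exponents of v_1(x) for n = 2, d = 1: {1, x1, x2}.
basis = [(0, 0), (1, 0), (0, 1)]

# All exponents alpha reachable as beta + gamma.
alphas = sorted({(b[0] + g[0], b[1] + g[1]) for b, g in product(basis, basis)})

# Indicator matrices (B_alpha)_{beta,gamma} = 1 if beta + gamma = alpha.
rows = []
for a in alphas:
    B = np.array([[1.0 if (b[0] + g[0], b[1] + g[1]) == a else 0.0
                   for g in basis] for b in basis])
    rows.append(B.flatten())
A2 = np.vstack(rows)

# Each (beta, gamma) position contributes to exactly one alpha, so the
# rows of A2 are mutually orthogonal and A2 @ A2.T is diagonal.
G = A2 @ A2.T
assert np.allclose(G, np.diag(np.diag(G)))
print(np.diag(G))  # diagonal entries = number of (beta, gamma) pairs per alpha
```

The same argument holds for any n and d, which is exactly the "partial" orthogonality of A = [A₁ A₂]: only the A₁ block breaks full row orthogonality.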

OUTLINE 1 Introduction 2 SOS programs, SDPs and partial orthogonality 3 A fast ADMM algorithm 4 Numerical examples 5 Conclusion

ADMM for the homogeneous self-dual embedding
Primal: min_ξ cᵀξ  s.t.  Aξ = b, ξ ∈ K
Dual: max_y bᵀy  s.t.  Aᵀy + z = c, z ∈ K*
KKT conditions:
  Primal feasibility: Aξ = b, ξ ∈ K
  Dual feasibility: Aᵀy + z = c, z ∈ K*
  Zero duality gap: cᵀξ − bᵀy = 0
Homogeneous self-dual embedding:
  [z]   [ 0    Aᵀ   c] [ξ]
  [s] = [−A    0    b] [y]
  [κ]   [−cᵀ  −bᵀ   0] [τ]

ADMM for the homogeneous self-dual embedding
τ and κ are two non-negative, complementary variables. For notational simplicity, let
  u = [ξ; y; τ],  v = [z; s; κ],  Q = [0 Aᵀ c; −A 0 b; −cᵀ −bᵀ 0],  C = K × ℝᵐ × ℝ₊.
Feasibility problem: find (u, v)  s.t.  v = Qu, (u, v) ∈ C × C*.

ADMM for the homogeneous self-dual embedding
find (u, v)  s.t.  v = Qu, (u, v) ∈ C × C*
ADMM steps (similar to the solver SCS [1]):
  û^{k+1} = (I + Q)⁻¹ (u^k + v^k)      (affine projection)
  u^{k+1} = P_C(û^{k+1} − v^k)          (conic projection)
  v^{k+1} = v^k − û^{k+1} + u^{k+1}
The affine projection takes advantage of the "diagonal plus low rank" structure for efficient computation.
[1] O'Donoghue, B., Chu, E., Parikh, N., and Boyd, S. (2016). Conic optimization via operator splitting and homogeneous self-dual embedding. Journal of Optimization Theory and Applications, 169(3), 1042-1068.
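The conic projection P_C acts blockwise: the free block is left unchanged, the non-negative blocks are clipped at zero, and each semidefinite block is projected by clipping negative eigenvalues. A minimal numpy sketch of the semidefinite part (the standard eigenvalue-clipping construction, not CDCS's actual implementation):

```python
import numpy as np

# Euclidean projection of a symmetric matrix onto the PSD cone:
# keep the eigenvectors, clip negative eigenvalues at zero.
def project_psd(S):
    S = (S + S.T) / 2                       # symmetrise first
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

S = np.array([[1.0,  2.0],
              [2.0, -3.0]])
P = project_psd(S)

# The result is (numerically) positive semidefinite and symmetric,
# and projecting again changes nothing.
assert np.min(np.linalg.eigvalsh(P)) >= -1e-9
assert np.allclose(project_psd(P), P)
```

One such eigendecomposition per PSD block per iteration is the dominant cost of the conic step, which is why shrinking the affine step via partial orthogonality pays off.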

ADMM for the Homogeneous self-dual embedding k+ 1 1 k k ˆ ( ) ( ) u = I + Q u + v k+ 1 k+ 1 k u = P( uˆ v ) C v = v uˆ + u k+ 1 k k+ 1 k+ 1 Affine projection i i i In affine projection, one needs to invert/factorize m m I + AA R Recall that AA is "diagonal plus low rank" ( I + AA ) = ( P + AA ) = P P A( I + A P A) A P 1 1 1 1 1 1 1 1 1 1 1 1 1 1 t t I + A P A R 1 1 Since in typical SOS programs t m, partial orthogonality can provide a speed-up in the affine projection.

OUTLINE 1 Introduction 2 SOS programs, SDPs and partial orthogonality 3 A fast ADMM algorithm 4 Numerical examples 5 Conclusion

CDCS: Cone Decomposition Conic Solver
An open-source MATLAB solver for partially decomposable conic programs. CDCS supports constraints on the following cones: free variables, the non-negative orthant, second-order cones, and the positive semidefinite cone.
Input/output format is aligned with SeDuMi; the SOS module can be invoked with the option 'sos'; works with the latest YALMIP release.
Syntax: [x,y,z,info] = cdcs(At,b,c,K,opts);
Download from https://github.com/oxfordcontrol/cdcs

Numerical Examples
Implemented in CDCS, invoked by the option 'sos'. Compared to SCS and SeDuMi, SDPA, SDPT3. Stopping condition for CDCS and SCS: ε = 10⁻⁴, maximum 2000 iterations.
  min_x Σ_{1≤i<j≤n} ( x_i x_j + x_i² x_j² − x_j³ x_i )
  s.t.  Σ_{i=1}^n x_i² = 1


Lyapunov function construction
Randomly generated polynomial dynamical systems of degree three with a locally asymptotically stable equilibrium at the origin.

Nuclear Receptor Signalling
Tested our algorithm on a model of nuclear receptor signalling. The model has 37 states and a cubic vector field. Tested local stability in the ball with radius 0.01 by constructing a quadratic Lyapunov function.
Dimensions of the resulting SDP: cone sizes K.s = [37 37 741], K.f = 1406; number of constraints: 10 676; number of variables: 5535.
Interior-point solvers (SeDuMi, SDPT3, SDPA, CSDP): all failed (out of memory).
First-order solvers: SCS 151 s; CDCS 57.3 s.
Numerical results on a PC with 2.8 GHz Intel Core i7 and 8 GB of RAM.

OUTLINE 1 Introduction 2 SOS programs, SDPs and partial orthogonality 3 A fast ADMM algorithm 4 Numerical examples 5 Conclusion

Conclusions
SOS programs have found a wide range of applications, e.g. polynomial optimization problems and nonlinear systems analysis questions. One fundamental challenge is the poor scalability of SOS programs to large instances. Fortunately, the resulting SDPs are highly sparse and structured (partial orthogonality in this talk). We have developed a fast ADMM algorithm to solve this class of SDPs, at the cost of reduced accuracy.
CDCS: Download from https://github.com/oxfordcontrol/cdcs

Thank you for your attention! Q & A
1. Zheng, Y., Fantuzzi, G., & Papachristodoulou, A. (2017). Fast ADMM for sum of squares programs using partial orthogonality. arXiv preprint arXiv:1708.04174.
2. Zheng, Y., Fantuzzi, G., & Papachristodoulou, A. (2017). Exploiting sparsity in the coefficient matching conditions in sum of squares programming using ADMM. IEEE Control Systems Letters.
Paper ThB03.3