HYPERBOLIC POLYNOMIALS, INTERLACERS AND SUMS OF SQUARES


DANIEL PLAUMANN (Universität Konstanz)
Joint work with Mario Kummer (U. Konstanz) and Cynthia Vinzant (U. of Michigan)
POLYNOMIAL OPTIMISATION, Isaac Newton Institute, Cambridge, 19 July 2013

[Title slide figure: a hyperbolic quartic surface sliced by two planes, rendered in Mathematica; plotting code omitted.]

Two flavours of real polynomials

Let f ∈ R[x]_d be homogeneous of degree d in x = (x_1, ..., x_n).

Positive polynomials. f ≥ 0 on R^n.
Ideal representation: sum of squares, f = f_1^2 + ... + f_r^2.
Existence only for n ≤ 2, d ≤ 2, or (n, d) = (3, 4) (Hilbert 1888).

Hyperbolic polynomials. f is hyperbolic with respect to e ∈ R^n if f(e) > 0 and, for all v ∈ R^n, the univariate polynomial f(v − te) ∈ R[t] has only real zeros in t.
Ideal representation: determinantal polynomial, f = det(M), where M(x) = x_1 M_1 + ... + x_n M_n with M_1, ..., M_n ∈ Sym_d(R) and M(e) positive definite.
Existence only for n ≤ 3 (Helton-Vinnikov 2004).
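
Below is a minimal numerical sketch of the definition (not from the slides, assuming numpy is available); the Lorentz quadratic f = x1^2 − x2^2 − x3^2 and the direction e = (1, 0, 0) are examples of our choosing.

```python
# Check, on random directions v, that f(v - t*e) has only real roots in t,
# which is the defining property of hyperbolicity with respect to e.
import numpy as np

e = np.array([1.0, 0.0, 0.0])
f = lambda x: x[0]**2 - x[1]**2 - x[2]**2   # example polynomial (our choice)

rng = np.random.default_rng(1)
for _ in range(1000):
    v = rng.normal(size=3)
    # f(v - t*e) = (v1 - t)^2 - v2^2 - v3^2 = t^2 - 2*v1*t + (v1^2 - v2^2 - v3^2)
    coeffs = [1.0, -2.0 * v[0], v[0]**2 - v[1]**2 - v[2]**2]
    assert np.max(np.abs(np.roots(coeffs).imag)) < 1e-9   # all roots are real
print(f(e) > 0)   # True: f(e) > 0, so f is hyperbolic with respect to e
```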

Hyperbolicity cones

f hyperbolic with respect to e: f(e) > 0 and f(v − te) ∈ R[t] has only real roots for all v ∈ R^n. Hyperbolic polynomials can be thought of as generalised characteristic polynomials.

[Figure: a hyperbolic quartic surface (n = 3, d = 4), plotted in Mathematica, with two slicing planes.]

C_e(f) = {v ∈ R^n : all roots of f(v − te) are non-negative}

is a closed convex cone, called the hyperbolicity cone.

Let f = det(M), M(x) = x_1 M_1 + ... + x_n M_n symmetric, M(e) ≻ 0. Then

C_e(f) = {v ∈ R^n : M(v) ⪰ 0}

is the spectrahedral cone defined by M_1, ..., M_n. For if M(e) = I_d, then f(v − te) is the characteristic polynomial of M(v).
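
The spectrahedral description can be checked numerically. The sketch below (not from the slides, assuming numpy and sympy) builds a random symmetric pencil with M(e) = I for e = (1, 0, 0) and compares the roots of f(v − te) with the eigenvalues of M(v).

```python
# For M(e) = I_d, f(v - t*e) = det(M(v) - t*I), so its roots are the eigenvalues
# of M(v); v lies in C_e(f) exactly when all of them are non-negative.
import numpy as np
import sympy as sp

rng = np.random.default_rng(2)
d = 3
B2, B3 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
M2, M3 = (B2 + B2.T) / 2, (B3 + B3.T) / 2               # random symmetric coefficients

x1, x2, x3, t = sp.symbols('x1 x2 x3 t', real=True)
M = x1 * sp.eye(d) + x2 * sp.Matrix(M2) + x3 * sp.Matrix(M3)   # M(e) = I for e = (1,0,0)
f = M.det()

v = rng.normal(size=3)
restriction = sp.Poly(f.subs({x1: float(v[0]) - t, x2: float(v[1]), x3: float(v[2])}), t)
roots = np.sort([complex(r).real for r in restriction.nroots()])
eigs = np.sort(np.linalg.eigvalsh(v[0] * np.eye(d) + v[1] * M2 + v[2] * M3))
print(np.allclose(roots, eigs))    # True: roots of f(v - t*e) = eigenvalues of M(v)
print(bool((eigs >= 0).all()))     # whether v lies in the spectrahedral cone C_e(f)
```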

Optimisation

Semidefinite programming relaxations:

  Is f ≥ 0 on R^n?  →  Is f a sum of squares in R[x]?  →  SDP

Let f be a hyperbolic polynomial:

  Hyperbolic programme  →  SDP?
  Optimise over C_e(f)  →  SDP formulation/relaxation of C_e(f)?

Hyperbolic programming. Studied by Güler, Lewis, Renegar, Tunçel and others. One can use −log f as a barrier function in interior point methods.
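
As a small illustration of the barrier property (our own, not from the talk, assuming numpy and sympy): for the Lorentz quadratic f = x1^2 − x2^2 − x3^2, the Hessian of −log f is positive semidefinite at sampled interior points of the cone, which is the convexity an interior-point barrier needs.

```python
# Numerically check that -log f is convex on the interior of the hyperbolicity
# cone {x1 >= sqrt(x2^2 + x3^2)} of the example polynomial f.
import numpy as np
import sympy as sp

x = sp.symbols('x1 x2 x3', real=True)
f = x[0]**2 - x[1]**2 - x[2]**2                  # example polynomial (our choice)
H = sp.lambdify(x, sp.hessian(-sp.log(f), x), 'numpy')

rng = np.random.default_rng(3)
for _ in range(200):
    p = rng.normal(size=3)
    p[0] = np.hypot(p[1], p[2]) + rng.uniform(0.1, 2.0)   # force p into the open cone
    assert np.linalg.eigvalsh(np.array(H(*p), dtype=float)).min() > -1e-8
print("Hessian of -log f is PSD at all sampled interior points")
```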

Interlacers

Let f be hyperbolic w.r.t. e, deg(f) = d, irreducible, and let g be hyperbolic w.r.t. e, deg(g) = d − 1.

Call g an interlacer of f if the roots of g(v − te) lie between those of f(v − te) for all v ∈ R^n: writing α_1 ≤ ... ≤ α_d for the roots of f(v − te) and β_1 ≤ ... ≤ β_{d−1} for the roots of g(v − te),

  α_1 ≤ β_1 ≤ α_2 ≤ β_2 ≤ ... ≤ β_{d−1} ≤ α_d.

[Figure: a plane quartic with an interlacing cubic.]

Directional derivatives g = D_v f with v ∈ C_e(f) are interlacers. If f = det(M) with M(e) ≻ 0, then any symmetric minor of size d − 1 of M is an interlacer.
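
A hedged numerical check (not from the slides, assuming numpy and sympy) that directional derivatives taken from inside the cone interlace; the cubic f below is a random determinantal example of our choosing with M(e) = I for e = (1, 0, 0).

```python
# Compare the roots of f(w - t*e) and of g(w - t*e), where g = D_v f with v in
# C_e(f): they should interlace for every w.
import numpy as np
import sympy as sp

rng = np.random.default_rng(0)
d = 3
B2 = rng.normal(size=(d, d)); A2 = (B2 + B2.T) / 2
B3 = rng.normal(size=(d, d)); A3 = (B3 + B3.T) / 2

x1, x2, x3, t = sp.symbols('x1 x2 x3 t', real=True)
xs = (x1, x2, x3)
f = sp.expand((x1 * sp.eye(d) + x2 * sp.Matrix(A2) + x3 * sp.Matrix(A3)).det())

v = np.array([3.0, 0.1, -0.2])                   # M(v) positive definite, so v is in C_e(f)
assert np.linalg.eigvalsh(v[0] * np.eye(d) + v[1] * A2 + v[2] * A3).min() > 0
g = sum(float(vi) * sp.diff(f, xi) for vi, xi in zip(v, xs))    # g = D_v f

def roots_along_line(p, w):
    """Real roots in t of p(w - t*e) for e = (1, 0, 0)."""
    q = sp.Poly(p.subs({x1: float(w[0]) - t, x2: float(w[1]), x3: float(w[2])}), t)
    return np.sort([complex(r).real for r in q.nroots()])

for _ in range(5):
    w = rng.normal(size=3)
    a, b = roots_along_line(f, w), roots_along_line(g, w)
    interlaced = all(a[i] - 1e-8 <= b[i] <= a[i + 1] + 1e-8 for i in range(len(b)))
    print(np.round(a, 3), np.round(b, 3), interlaced)   # interlaced is True each time
```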

Interlacers

[Figure: a plane quartic curve with two interlacing cubics.]

The product of two interlacers of f is non-negative on V_R(f) = {v ∈ R^n : f(v) = 0}.

Proposition. The set Int_e(f) of interlacers is a convex cone, namely

  Int_e(f) = {g ∈ R[x]_{d−1} : D_e f · g − f · D_e g ≥ 0 on R^n}.

Reason. If f, g ∈ R[t] have only real roots and deg(g) = deg(f) − 1, then g interlaces f if and only if the Wronskian f′g − fg′ is everywhere positive or everywhere negative (Bézout).
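
The univariate Wronskian criterion is easy to see numerically; a tiny sketch (our own, assuming numpy), with a quartic f, an interlacing cubic g and a non-interlacing cubic h of our choosing:

```python
# g interlaces f  <=>  the Wronskian f'g - f g' does not change sign.
import numpy as np

t = np.linspace(-5, 5, 2001)
f = np.poly1d([1, 0, -5, 0, 4])        # (t^2 - 1)(t^2 - 4), roots -2, -1, 1, 2
g = np.poly1d([1, 0, -2.25, 0])        # roots -1.5, 0, 1.5: interlaces f
h = np.poly1d([1, 0, -9, 0])           # roots -3, 0, 3: does not interlace f

w_g = np.poly1d(np.polysub(np.polymul(f.deriv(), g), np.polymul(f, g.deriv())))
w_h = np.poly1d(np.polysub(np.polymul(f.deriv(), h), np.polymul(f, h.deriv())))
print(w_g(t).min() > 0)                  # True: f'g - f g' stays positive on the grid
print(w_h(t).min() < 0 < w_h(t).max())   # True: the Wronskian of h changes sign
```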

Representing the hyperbolicity cone

The convex cone of interlacers is

  Int_e(f) = {g ∈ R[x]_{d−1} : D_e f · g − f · D_e g ≥ 0 on R^n}.

A directional derivative D_v f is an interlacer if and only if the point v lies in the hyperbolicity cone C_e(f).

Corollary. C_e(f) = {v ∈ R^n : D_e f · D_v f − f · D_e D_v f ≥ 0 on R^n}.

Write W_{e,v}(f) = D_e f · D_v f − f · D_e D_v f for the family of Wronskians associated with f. This realises the hyperbolicity cone as a linear slice of the cone of non-negative polynomials.
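
For a concrete picture (our own example, assuming sympy): for the Lorentz quadratic f = x1^2 − x2^2 − x3^2 with e = (1, 0, 0), the Wronskian W_{e,v}(f) is a quadratic form in x whose Gram matrix is positive semidefinite exactly when v1 ≥ sqrt(v2^2 + v3^2), i.e. exactly on the hyperbolicity cone.

```python
# Compute W_{e,v}(f) = D_e f * D_v f - f * D_e D_v f symbolically and read off
# its Gram matrix.
import sympy as sp

x1, x2, x3, v1, v2, v3 = sp.symbols('x1 x2 x3 v1 v2 v3', real=True)
xs = (x1, x2, x3)
f = x1**2 - x2**2 - x3**2

def D(p, direction):
    return sum(c * sp.diff(p, xi) for c, xi in zip(direction, xs))

e, v = (1, 0, 0), (v1, v2, v3)
W = sp.expand(D(f, e) * D(f, v) - f * D(D(f, v), e))

G = sp.Matrix(3, 3, lambda i, j: sp.Rational(1, 2) * sp.diff(W, xs[i], xs[j]))
print(G)                    # Matrix([[2*v1, -2*v2, -2*v3], [-2*v2, 2*v1, 0], [-2*v3, 0, 2*v1]])
print(sp.factor(G.det()))   # 8*v1*(v1**2 - v2**2 - v3**2); G is PSD exactly on C_e(f)
```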

A sums-of-squares relaxation

We obtain an increasing sequence of inner approximations

  C_e^(N)(f) = {v ∈ R^n : (x_1^2 + ... + x_n^2)^N · W_{e,v}(f) is a sum of squares} ⊆ C_e(f),   N = 0, 1, 2, ...

If f is strictly hyperbolic (V_R(f) smooth), then C_e^(N)(f) = C_e(f) for some N ≥ 0, by a result of Reznick on sums of squares with denominators.

Theorem (Netzer-Sanyal 2012). If f is strictly hyperbolic, then C_e(f) is a projected spectrahedral cone.

Theorem (Parrilo-Saunderson 2012). Derivative cones for determinantal polynomials are projected spectrahedral cones.
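
A hedged sketch (not from the talk) of the N = 0 membership test as a small Gram-matrix SDP, assuming sympy and cvxpy with the SCS solver are available. The cubic f = x1(x1^2 − x2^2 − x3^2), which equals the determinant of a pencil with M(e) = I for e = (1, 0, 0), and the direction v = (1, 1/2, 0) ∈ C_e(f) are examples of our choosing; by the determinantal-representation theorem on the next slide the Wronskian is a sum of squares, so the SDP should be feasible.

```python
# Decide whether W_{e,v}(f) is a sum of squares by searching for a PSD Gram
# matrix G with z(x)^T G z(x) = W, where z(x) is the vector of degree-2 monomials.
import itertools
import cvxpy as cp
import sympy as sp

x1, x2, x3 = xs = sp.symbols('x1 x2 x3', real=True)
f = x1 * (x1**2 - x2**2 - x3**2)              # example determinantal cubic (our choice)
e, v = (1, 0, 0), (1, sp.Rational(1, 2), 0)   # v lies in C_e(f)

def D(p, w):
    return sum(wi * sp.diff(p, xi) for wi, xi in zip(w, xs))

W = sp.Poly(sp.expand(D(f, e) * D(f, v) - f * D(D(f, v), e)), *xs)   # homogeneous, degree 4

half = W.total_degree() // 2
monos = [m for m in itertools.product(range(half + 1), repeat=3) if sum(m) == half]
G = cp.Variable((len(monos), len(monos)), symmetric=True)

lhs = {}                                      # coefficients of z(x)^T G z(x)
for i, mi in enumerate(monos):
    for j, mj in enumerate(monos):
        key = tuple(a + b for a, b in zip(mi, mj))
        lhs[key] = lhs.get(key, 0) + G[i, j]

target = {k: float(c) for k, c in W.as_dict().items()}
constraints = [G >> 0] + [lhs[k] == target.get(k, 0.0) for k in lhs]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)
print(problem.status)   # expect 'optimal': this Wronskian is a sum of squares (N = 0 suffices)
```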

Results on determinantal representations

Effective version of the Helton-Vinnikov theorem for Hermitian determinantal representations.

Theorem. If f ∈ R[x, y, z] is hyperbolic with respect to e, we can construct a representation f = det(M) with M(e) ≻ 0 and M Hermitian, starting from any interlacer of f.

[Figure: a plane quartic curve with an interlacer.]

Results on determinantal representations

Theorem. If f^k = det(M) is a determinantal representation with M(e) ≻ 0 for some k ≥ 1, then all the Wronskians W_{e,v}(f) with v ∈ C_e(f) are sums of squares.

Example. The Vámos polynomial

  h(x_1, ..., x_8) = Σ_{I ∈ ([8] choose 4) \ C} Π_{i ∈ I} x_i,

where C = {{1,2,3,4}, {1,2,5,6}, {1,2,7,8}, {3,4,5,6}, {3,4,7,8}}, is the bases-generating polynomial of the matroid with circuits in C. The polynomial h is hyperbolic (Wagner-Wei 2009), but no power of h has a definite determinantal representation (Brändén 2011). We show that W_{e_7,e_8}(h) is not a sum of squares, yielding a new proof of Brändén's result.
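
A short sympy sketch (our own) that just constructs h and the Wronskian W_{e_7,e_8}(h) referred to above; deciding that this degree-6 form is not a sum of squares needs an SOS/SDP computation that is not attempted here.

```python
# Build the Vámos basis-generating polynomial and its Wronskian in directions e7, e8.
import itertools
import sympy as sp

x = sp.symbols('x1:9', real=True)
C = [{1, 2, 3, 4}, {1, 2, 5, 6}, {1, 2, 7, 8}, {3, 4, 5, 6}, {3, 4, 7, 8}]
bases = [I for I in itertools.combinations(range(1, 9), 4) if set(I) not in C]
h = sum(sp.Mul(*(x[i - 1] for i in I)) for I in bases)   # 65 of the 70 four-element subsets

# W_{e7,e8}(h) = D_{e7}h * D_{e8}h - h * D_{e7}D_{e8}h
W = sp.expand(sp.diff(h, x[6]) * sp.diff(h, x[7]) - h * sp.diff(h, x[6], x[7]))
print(len(bases), sp.Poly(W, *x).total_degree())         # 65 6
```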

Results on determinantal representations

Theorem. If f is multi-affine and hyperbolic with respect to e, then f has a definite determinantal representation f = det(M) with M(e) ≻ 0 if and only if the Wronskians W_{e_i,e_j}(f) are squares in R[x] for all i, j.

Example. We can use this to reprove a classical result: the d-th elementary symmetric polynomial in n variables, which is multi-affine and hyperbolic with respect to (1, ..., 1), has a definite determinantal representation if and only if d ∈ {1, n − 1, n}.
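
A quick sympy check (our own) of the criterion for elementary symmetric polynomials: for n = 3, d = 2 the Wronskian W_{e_1,e_2} is a perfect square (here d = n − 1), while for n = 4, d = 2 it is not, so no definite determinantal representation exists in that case.

```python
# Test whether the Wronskian W_{e_i,e_j} of the d-th elementary symmetric
# polynomial in n variables is a perfect square.
import itertools
import sympy as sp

def elementary_symmetric(xs, d):
    return sum(sp.Mul(*c) for c in itertools.combinations(xs, d))

def wronskian_and_square_test(n, d, i=0, j=1):
    xs = sp.symbols(f'x1:{n + 1}', real=True)
    f = elementary_symmetric(xs, d)
    W = sp.expand(sp.diff(f, xs[i]) * sp.diff(f, xs[j]) - f * sp.diff(f, xs[i], xs[j]))
    _, factors = sp.factor_list(W)
    return W, all(mult % 2 == 0 for _, mult in factors)   # True iff W is a perfect square

print(wronskian_and_square_test(3, 2))   # (x3**2, True): consistent with d = n - 1
print(wronskian_and_square_test(4, 2))   # (x3**2 + x3*x4 + x4**2, False): d not in {1, n-1, n}
```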

The interlacer cone

Example. Let f = det(X), where X is a d × d symmetric matrix of variables. Then

  Int_{I_d}(det(X)) = {tr(M · X^adj) : M ∈ Sym_d^+(R)} ≅ Sym_d^+(R).

Theorem. Let f ∈ R[x]_d be hyperbolic with respect to e ∈ R^n and assume that the projective variety V_C(f) is smooth. The algebraic boundary of the cone Int_e(f) is the irreducible hypersurface in C[x]_{d−1} consisting of all g ∈ C[x]_{d−1} for which there exists p ∈ P^{n−1} with f(p) = g(p) = 0 such that the 2 × n matrix with rows ∇f(p) and ∇g(p) has rank at most 1.
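
A small sympy verification (our own) of the identity behind the example: the directional derivative of det at X in direction M is tr(adj(X)·M) (Jacobi's formula), so the interlacers tr(M·X^adj) with M positive semidefinite are exactly the directional derivatives D_M det(X) taken from the PSD cone, which is the hyperbolicity cone of det(X) at I_d.

```python
# Verify d/dt det(X + t*M) |_{t=0} = tr(adj(X) * M) for symmetric symbolic X, M.
import sympy as sp

d = 3
sym = lambda name, i, j: sp.Symbol(f'{name}{min(i, j)}{max(i, j)}', real=True)
X = sp.Matrix(d, d, lambda i, j: sym('x', i, j))   # symmetric matrix of variables
M = sp.Matrix(d, d, lambda i, j: sym('m', i, j))   # symmetric direction
t = sp.Symbol('t')

lhs = sp.diff((X + t * M).det(), t).subs(t, 0)     # D_M det at X
rhs = (X.adjugate() * M).trace()                   # tr(adj(X) * M)
print(sp.expand(lhs - rhs))                        # 0
```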

References

M. Kummer, D. Plaumann, C. Vinzant. Hyperbolic polynomials, interlacers and sums of squares. Preprint (2013). arXiv:1212.6696
P. Brändén. Obstructions to determinantal representability. Adv. Math. 226 (2011), no. 2, 1202–1212. arXiv:1004.1382
M. Kummer. A note on the hyperbolicity cone of the specialized Vámos polynomial. Preprint (2013). arXiv:1306.4483
T. Netzer, R. Sanyal. Smooth hyperbolicity cones are spectrahedral shadows. Preprint (2012). arXiv:1208.0441
P. Parrilo, J. Saunderson. Polynomial-sized semidefinite representations of derivative relaxations of spectrahedral cones. Preprint (2012). arXiv:1208.1443
D. Plaumann, B. Sturmfels, C. Vinzant. Computing determinantal representations of Helton-Vinnikov curves. In: Operator Theory: Advances and Applications, Vol. 222 (2010). arXiv:1011.6057
D. Plaumann, C. Vinzant. Determinantal representations of hyperbolic plane curves: an elementary approach. To appear in J. Symbolic Computation. arXiv:1207.7047

Surveys

G. Blekherman, P. Parrilo, R. Thomas (editors). Semidefinite Optimization and Convex Algebraic Geometry. MOS-SIAM Series on Optimization. To appear (2012).
D. Plaumann. Geometry of Linear Matrix Inequalities. Lecture Notes, University of Konstanz (2013).
V. Vinnikov. LMI representations of convex semialgebraic sets: past, present, and future. In: Operator Theory: Advances and Applications, Vol. 222 (2011). arXiv:1205.2286