The moment-LP and moment-SOS approaches in optimization

1 The moment-LP and moment-SOS approaches in optimization. LAAS-CNRS and Institute of Mathematics, Toulouse, France. Workshop "Linear Matrix Inequalities, Semidefinite Programming and Quantum Information Theory 2016", Toulouse, January 2016

5 See in particular the Chapter by M. Laurent

6 M.F. Anjos, Polytechnique Montréal, QC, Canada; J.B. Lasserre, LAAS, Toulouse Cedex 4, France (Eds.): Handbook on Semidefinite, Conic and Polynomial Optimization. International Series in Operations Research & Management Science, Springer, 2012, XI, 957 p., 57 illus.

Semidefinite and conic optimization is a major and thriving research area within the optimization community. Although semidefinite optimization has been studied (under different names) since at least the 1940s, its importance grew immensely during the 1990s after polynomial-time interior-point methods for linear optimization were extended to solve semidefinite optimization problems. Since the beginning of the 21st century, not only has research into semidefinite and conic optimization continued unabated, but a fruitful interaction has also developed with algebraic geometry, through the close connections between semidefinite matrices and polynomial optimization. This has brought about important new results and led to an even higher level of research activity.

The Handbook on Semidefinite, Conic and Polynomial Optimization provides the reader with a snapshot of the state of the art in the growing and mutually enriching areas of semidefinite optimization, conic optimization, and polynomial optimization. It contains a compendium of the recent research activity in these areas and will appeal to doctoral students, young graduates, and experienced researchers alike. The Handbook's thirty-one chapters are organized into four parts: theory, covering significant theoretical developments as well as the interactions between conic optimization and polynomial optimization; algorithms, documenting the directions of current algorithmic development; software, providing an overview of the state of the art; and applications, dealing with the application areas where semidefinite and conic optimization has made a significant impact in recent years.

7 Why polynomial optimization? LP- and SDP-CERTIFICATES of POSITIVITY. The moment-LP and moment-SOS approaches

10 Consider the polynomial optimization problem:
P : f* = min { f(x) : g_j(x) ≥ 0, j = 1,...,m }
for some polynomials f, g_j ∈ R[x]. Why Polynomial Optimization? After all... P is just a particular case of Non Linear Programming (NLP)!

12 True!... if one is interested in a LOCAL optimum only!! When searching for a local minimum, optimality conditions and descent algorithms use basic tools from REAL and CONVEX analysis and linear algebra. The focus is on how to improve f by looking at a NEIGHBORHOOD of a nominal point x ∈ K, i.e., LOCALLY AROUND x ∈ K, and in general no GLOBAL property of x ∈ K can be inferred. The fact that f and the g_j are POLYNOMIALS does not help much!

16 BUT for GLOBAL Optimization... the picture is different! Remember that for the GLOBAL minimum f*:
f* = sup { λ : f(x) − λ ≥ 0 ∀x ∈ K }.
(Not true for a local minimum!)
... and so to compute f* one needs to handle EFFICIENTLY the difficult constraint f(x) − λ ≥ 0 ∀x ∈ K, i.e. one needs TRACTABLE CERTIFICATES of POSITIVITY on K for the polynomial x ↦ f(x) − λ!

19 REAL ALGEBRAIC GEOMETRY helps!!!! Indeed, POWERFUL CERTIFICATES OF POSITIVITY EXIST! Moreover... and importantly, such certificates are amenable to PRACTICAL COMPUTATION! (Stronger Positivstellensätze exist for analytic functions but are useless from a computational viewpoint.)

22 SOS-based certificate. K = { x : g_j(x) ≥ 0, j = 1,...,m }.
Theorem (Putinar's Positivstellensatz). If K is compact (+ a technical Archimedean assumption) and f > 0 on K, then:
f(x) = σ_0(x) + Σ_{j=1}^m σ_j(x) g_j(x), ∀x ∈ R^n,
for some SOS polynomials (σ_j) ⊂ R[x].

23 However... In Putinar's theorem nothing is said about the DEGREE of the SOS polynomials (σ_j)! BUT... GOOD news..!! Testing whether the above representation holds for some SOS polynomials (σ_j) ⊂ R[x] with a degree bound is SOLVING an SDP!

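To make the claim "testing SOS membership with a degree bound is an SDP" concrete, here is a minimal pure-Python sketch on a hypothetical toy instance (not from the slides): a univariate polynomial f of degree 2 is SOS exactly when f(x) = v(x)ᵀ Q v(x) for some positive semidefinite Gram matrix Q in the monomial basis v(x) = (1, x); searching for such a Q under the linear coefficient-matching constraints is the SDP.

```python
# Toy SOS certificate for f(x) = x^2 - 2x + 2 (a hypothetical example):
# in the basis v(x) = (1, x), the candidate Gram matrix below matches the
# coefficients of f and is PSD, which certifies f >= 0 on all of R.

Q = [[2.0, -1.0],
     [-1.0, 1.0]]   # f = 2*1 + 2*(-1)*x + 1*x^2

def psd_2x2(A):
    # Sylvester's criterion for a symmetric 2x2 matrix
    return A[0][0] >= 0 and A[1][1] >= 0 and \
           A[0][0]*A[1][1] - A[0][1]*A[1][0] >= 0

def f(x):
    return x**2 - 2*x + 2

def gram_form(Q, x):
    # evaluate v(x)^T Q v(x)
    v = (1.0, x)
    return sum(Q[i][j]*v[i]*v[j] for i in range(2) for j in range(2))

assert psd_2x2(Q)                       # Q is PSD ...
for x in (-1.0, 0.0, 0.5, 3.0):
    assert abs(f(x) - gram_form(Q, x)) < 1e-12   # ... and reproduces f
```

An SDP solver automates exactly this search for Q over the PSD cone; here the Gram matrix is supplied by hand only to keep the sketch dependency-free.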
25 Semidefinite Programming. The CONVEX optimization problem:
P : min_{x ∈ R^n} { c′x : Σ_{i=1}^n A_i x_i ⪰ b },
where c ∈ R^n and b, A_i ∈ S_m (m × m symmetric matrices), is called a semidefinite program. The notation "A ⪰ 0" means that the real symmetric matrix A is positive semidefinite, i.e., all its (real) EIGENVALUES are nonnegative.

26 Example
P : min_x { x_1 + x_2 : s.t. [ 3 + 2x_1 + x_2 , 5 − x_1 ; 5 − x_1 , 2x_2 ] ⪰ 0 },
or, equivalently,
P : min_x { x_1 + x_2 : s.t. [ 3 , 5 ; 5 , 0 ] + x_1 [ 2 , −1 ; −1 , 0 ] + x_2 [ 1 , 0 ; 0 , 2 ] ⪰ 0 }
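A quick pure-Python sanity check of feasibility for an LMI of this shape. Caveat: the entries of A0, A1, A2 below follow a best-effort reconstruction of the garbled example above and should be treated as illustrative; the point of the sketch is only that checking feasibility of a candidate x means checking positive semidefiniteness of one symmetric matrix.

```python
# LMI feasibility check: x is feasible iff A0 + x1*A1 + x2*A2 is PSD.
# (Matrix entries are a reconstruction of the slide's example.)

A0 = [[3.0, 5.0], [5.0, 0.0]]
A1 = [[2.0, -1.0], [-1.0, 0.0]]
A2 = [[1.0, 0.0], [0.0, 2.0]]

def lmi(x1, x2):
    # assemble A0 + x1*A1 + x2*A2
    return [[A0[i][j] + x1*A1[i][j] + x2*A2[i][j] for j in range(2)]
            for i in range(2)]

def psd_2x2(M):
    # Sylvester's criterion for a symmetric 2x2 matrix
    return M[0][0] >= 0 and M[1][1] >= 0 and \
           M[0][0]*M[1][1] - M[0][1]*M[1][0] >= 0

assert psd_2x2(lmi(5.0, 10.0))     # x = (5, 10) is feasible: diag(23, 20)
assert not psd_2x2(lmi(0.0, 0.0))  # x = (0, 0) is not: det(A0) = -25 < 0
```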

27 P and its dual P* are convex problems that are solvable in polynomial time to arbitrary precision ɛ > 0. ⇒ generalization to the convex cone S_m^+ (X ⪰ 0) of Linear Programming on the convex polyhedral cone R_+^m (x ≥ 0). Indeed, with DIAGONAL matrices, Semidefinite Programming = Linear Programming! Several academic SDP software packages exist (e.g. the MATLAB LMI toolbox, SeDuMi, SDPT3,...). However, so far, size limitation is more severe than for LP software packages. Pioneering contributions by A. Nemirovski, Y. Nesterov, N.Z. Shor, D.B. Yudin,...

30 Dual side of Putinar's theorem: the K-moment problem. Given a real sequence y = (y_α), α ∈ N^n, does there exist a Borel measure µ on K such that
y_α = ∫_K x_1^{α_1} ··· x_n^{α_n} dµ, ∀α ∈ N^n ?
Introduce the so-called Riesz linear functional L_y : R[x] → R:
f ( = Σ_α f_α x^α ) ↦ L_y(f) = Σ_{α ∈ N^n} f_α y_α.

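The Riesz functional is just a pairing of coefficient vectors with moment vectors; a minimal sketch (hypothetical helper names, with polynomials and moment sequences stored as dicts keyed by the exponent):

```python
from fractions import Fraction

# L_y(f) = sum_alpha f_alpha * y_alpha, univariate for simplicity.

def riesz(poly, y):
    # poly: {exponent k: coefficient f_k}, y: {k: moment y_k}
    return sum(c * y[k] for k, c in poly.items())

# Moments of Lebesgue measure on [0,1]: y_k = int_0^1 x^k dx = 1/(k+1).
y = {k: Fraction(1, k + 1) for k in range(10)}

f = {2: 1, 1: 1}                       # f(x) = x^2 + x
assert riesz(f, y) == Fraction(5, 6)   # 1/3 + 1/2, i.e. int_0^1 f dx
```

When y really comes from a measure µ, L_y(f) coincides with ∫ f dµ, which is exactly what the moment problem asks to characterize.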
32 Theorem. If K = {x : g_j(x) ≥ 0, j = 1,...,m} is compact and satisfies an Archimedean assumption, then y has a representing measure µ on K if and only if, for every h ∈ R[x]:
(⋆) L_y(h²) ≥ 0; L_y(h² g_j) ≥ 0, j = 1,...,m.
The condition (⋆) is equivalent to positive semidefiniteness of m + 1 moment and localizing matrices, i.e.,
M(y) ⪰ 0; M(g_j y) ⪰ 0, j = 1,...,m,
whose rows & columns are indexed by N^n, and whose entries are LINEAR in the y_α's.

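A tiny concrete instance of these necessary conditions (a sketch, assuming the univariate case and exact rational arithmetic): for y the Lebesgue moments on [0,1] and g(x) = x(1 − x) ≥ 0 on K = [0,1], both the truncated moment matrix M_1(y) and the truncated localizing matrix M_1(g y) must be PSD.

```python
from fractions import Fraction

# Moments of Lebesgue measure on [0,1]: y_k = 1/(k+1).
y = {k: Fraction(1, k + 1) for k in range(6)}

# M_1(y): entries y_{i+j}; M_1(g*y) for g(x) = x - x^2:
# entries L_y(g * x^{i+j}) = y_{i+j+1} - y_{i+j+2}.
M  = [[y[i + j] for j in range(2)] for i in range(2)]
Mg = [[y[i + j + 1] - y[i + j + 2] for j in range(2)] for i in range(2)]

def psd_2x2(A):
    # Sylvester's criterion for a symmetric 2x2 matrix
    return A[0][0] >= 0 and A[1][1] >= 0 and \
           A[0][0]*A[1][1] - A[0][1]*A[1][0] >= 0

assert psd_2x2(M)    # det = 1/3 - 1/4 = 1/12 > 0
assert psd_2x2(Mg)   # det = (1/6)(1/20) - (1/12)^2 = 1/720 > 0
```

Entries of both matrices are linear in the y_α's, which is why, with y as unknowns, "M(y) ⪰ 0, M(g_j y) ⪰ 0" are SDP constraints.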
34 LP-based certificate. K = {x : g_j(x) ≥ 0; 1 − g_j(x) ≥ 0, j = 1,...,m}.
Theorem (Krivine-Vasilescu-Handelman's Positivstellensatz). Let K be compact and let the family {1, g_j} generate R[x]. If f > 0 on K then:
(⋆) f(x) = Σ_{α,β} c_{αβ} Π_{j=1}^m g_j(x)^{α_j} (1 − g_j(x))^{β_j}, ∀x ∈ R^n,
for some NONNEGATIVE scalars (c_{αβ}).

35 However... Again, in Krivine's theorem, nothing is said in (⋆) about how many nonnegative scalars c_{αβ} are needed! BUT... GOOD news... again!! Testing whether (⋆) holds for some nonnegative scalars (c_{αβ}) is SOLVING an LP!

37 Dual side of Krivine's theorem: the K-moment problem.
Theorem. If K = {x : g_j(x) ≥ 0, j = 1,...,m} is compact, 0 ≤ g_j ≤ 1 on K, and {1, g_j} generates R[x], then y has a representing measure on K if and only if
(⋆) L_y( Π_{j=1}^m g_j^{α_j} (1 − g_j)^{β_j} ) ≥ 0, ∀α, β ∈ N^m.
The condition (⋆) is equivalent to countably many LINEAR INEQUALITIES on the y_α's.

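These linear inequalities can be checked symbolically in a toy case (a sketch under the assumption K = [0,1] with the single generator g(x) = x, so 1 − g = 1 − x, and y the Lebesgue moments): L_y(g^a (1−g)^b) is then the Beta integral ∫₀¹ xᵃ(1−x)ᵇ dx = a! b! / (a+b+1)!, which is indeed positive for all a, b.

```python
from fractions import Fraction
from math import comb, factorial

# Lebesgue moments on [0,1]: y_k = 1/(k+1).
y = {k: Fraction(1, k + 1) for k in range(30)}

def L_handelman(a, b):
    # Expand x^a (1-x)^b binomially and apply the Riesz functional:
    # L_y = sum_i (-1)^i C(b, i) y_{a+i}.
    return sum((-1)**i * comb(b, i) * y[a + i] for i in range(b + 1))

for a in range(5):
    for b in range(5):
        val = L_handelman(a, b)
        # matches the Beta integral a! b! / (a+b+1)!  and is > 0
        assert val == Fraction(factorial(a) * factorial(b),
                               factorial(a + b + 1))
        assert val > 0
```

Each such inequality is LINEAR in the y_k's, so truncating (α, β) to a bounded degree yields the finite LP of the theorem's primal side.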
39 HENCE... SUCH POSITIVITY CERTIFICATES make it possible to infer GLOBAL properties of FEASIBILITY and OPTIMALITY... the analogue of (well-known) previous ones valid in the CONVEX CASE ONLY!

41 In addition, polynomials NONNEGATIVE ON A SET K ⊂ R^n are ubiquitous. They also appear in many important applications (outside optimization), modeled as particular instances of the so-called Generalized Moment Problem, among which: Probability, Optimal and Robust Control, Game theory, Signal processing, multivariate integration, etc.
(GMP) : inf { Σ_{i=1}^s ∫_{K_i} f_i dµ_i : µ_i ∈ M(K_i), Σ_{i=1}^s ∫_{K_i} h_{ij} dµ_i = b_j, j ∈ J }
with M(K_i) the space of Borel measures on K_i ⊂ R^{n_i}, i = 1,...,s.

43 The DUAL of the GMP is the linear program
GMP* : sup_{λ_j} { Σ_{j ∈ J} λ_j b_j : f_i − Σ_{j ∈ J} λ_j h_{ij} ≥ 0 on K_i, i = 1,...,s }.
And one can see that the constraints of GMP* state that the functions f_i − Σ_{j ∈ J} λ_j h_{ij} must be nonnegative on a certain set K_i, i = 1,...,s.

44 A couple of examples. I: Global OPTIM. f* = inf_x { f(x) : x ∈ K } is the SIMPLEST example of the GMP because
f* = inf { ∫_K f dµ : µ ∈ M(K), ∫_K 1 dµ = 1 }.
Indeed, if f(x) ≥ f* for all x ∈ K and µ is a probability measure on K, then ∫_K f dµ ≥ ∫_K f* dµ = f*. On the other hand, for every x ∈ K the probability measure µ := δ_x is such that ∫ f dµ = f(x).

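The two-line argument above can be replayed on a discretized toy instance (a sketch; the grid and the function are hypothetical): on a finite grid, probability measures are just nonnegative vectors summing to 1, every such vector gives an objective at least min f, and the Dirac at a grid minimizer attains it.

```python
# Global optimization as a linear program over probability measures,
# discretized: f* = min over the grid, attained by a Dirac measure.

def f(x):
    return (x - 0.3)**2          # toy objective on K = [0, 1]

grid = [i / 100 for i in range(101)]
fvals = [f(x) for x in grid]
fstar = min(fvals)

# Any probability vector p gives sum_i p_i f(x_i) >= f* ...
p = [1 / len(grid)] * len(grid)  # e.g. the uniform measure
assert sum(pi * fi for pi, fi in zip(p, fvals)) >= fstar

# ... and the Dirac measure at the argmin attains f* exactly.
k = fvals.index(fstar)
dirac = [1.0 if i == k else 0.0 for i in range(len(grid))]
assert sum(pi * fi for pi, fi in zip(dirac, fvals)) == fstar
```

The infinite-dimensional GMP replaces the grid by all of K; the moment-LP/SOS hierarchies then relax "µ is a measure on K" into the moment conditions of the previous slides.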
48 II. BOUNDS on measures with moment conditions. Let K ⊂ R^n and S ⊂ K be given, and let Γ ⊂ N^n also be given:
max_{µ ∈ M(K)} { ⟨1_S, µ⟩ : ∫_K x^α dµ = m_α, α ∈ Γ }
to compute an upper bound on µ(S) over all distributions µ ∈ M(K) with a certain fixed number of moments m_α. If Γ = N^n then one may use this to compute the Lebesgue volume of a compact basic semi-algebraic set S ⊂ K := [−1, 1]^n. Take
m_α := ∫_{[−1,1]^n} x^α dx, α ∈ N^n.

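The box moments m_α used above have a closed form, since the integral factors over coordinates: ∫_{−1}^{1} x^k dx equals 2/(k+1) for even k and 0 for odd k. A small sketch (hypothetical helper name):

```python
from fractions import Fraction

# Moments of Lebesgue measure on the box [-1, 1]^n:
# m_alpha = prod_i int_{-1}^{1} x^{alpha_i} dx.

def box_moment(alpha):
    m = Fraction(1)
    for k in alpha:
        m *= Fraction(2, k + 1) if k % 2 == 0 else Fraction(0)
    return m

assert box_moment((0, 0)) == 4             # volume of [-1, 1]^2
assert box_moment((2, 0)) == Fraction(4, 3)
assert box_moment((1, 2)) == 0             # odd power kills the integral
```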
50 III. For instance, one may also want:
To approximate sets defined with QUANTIFIERS, e.g.,
R_f := {x ∈ B : f(x, y) ≤ 0 for all y such that (x, y) ∈ K}
D_f := {x ∈ B : f(x, y) ≤ 0 for some y such that (x, y) ∈ K}
where f ∈ R[x, y] and B is a simple set (box, ellipsoid).
To compute convex polynomial underestimators p ≤ f of a polynomial f on a box B ⊂ R^n. (Very useful in MINLP.)

52 The moment-LP and moment-SOS approaches consist of using a certain type of positivity certificate (Krivine-Vasilescu-Handelman's or Putinar's certificate) in potentially any application where such a characterization is needed. (Global optimization is only one example.) In many situations this amounts to solving a HIERARCHY of LINEAR PROGRAMS or SEMIDEFINITE PROGRAMS... of increasing size!

56 LP- and SDP-hierarchies for optimization. Replace f* = sup_λ { λ : f(x) − λ ≥ 0 ∀x ∈ K } with:
The SDP-hierarchy indexed by d ∈ N:
f*_d = sup { λ : f − λ = σ_0 + Σ_{j=1}^m σ_j g_j ; σ_0, σ_j SOS, deg(σ_0), deg(σ_j g_j) ≤ 2d },
or the LP-hierarchy indexed by d ∈ N:
θ_d = sup { λ : f − λ = Σ_{α,β} c_{αβ} Π_{j=1}^m g_j^{α_j} (1 − g_j)^{β_j} ; c_{αβ} ≥ 0, |α + β| ≤ 2d }.

58 Theorem. Both sequences (f*_d) and (θ_d), d ∈ N, are MONOTONE NONDECREASING, and when K is compact (and satisfies a technical Archimedean assumption):
f* = lim_{d→∞} f*_d = lim_{d→∞} θ_d.

59 What makes this approach exciting is that it is at the crossroads of several disciplines/applications: Commutative, Non-commutative, and Non-linear ALGEBRA Real algebraic geometry, and Functional Analysis Optimization, Convex Analysis Computational Complexity in Computer Science, which BENEFIT from interactions! As mentioned... potential applications are ENDLESS!

61 Has already proved useful and successful in applications of modest problem size, notably in optimization, control, robust control, optimal control, estimation, computer vision, etc. (If sparsity is present, then problems of larger size can be addressed.) HAS initiated and stimulated new research issues: in Convex Algebraic Geometry (e.g. semidefinite representation of convex sets, algebraic degree of semidefinite programming and polynomial optimization); in Computational Algebra (e.g. for solving polynomial equations via SDP and border bases); in Computational Complexity, where LP- and SDP-HIERARCHIES have become an important tool to analyze Hardness of Approximation for 0/1 combinatorial problems (→ links with quantum computing).

63 Recall that both LP- and SDP-hierarchies are GENERAL PURPOSE METHODS... NOT TAILORED to solving specific hard problems!!

65 A remarkable property of the SOS hierarchy: I. When solving the optimization problem
P : f* = min { f(x) : g_j(x) ≥ 0, j = 1,...,m }
one does NOT distinguish between CONVEX, CONTINUOUS NON-CONVEX, and 0/1 (and DISCRETE) problems! A boolean variable x_i is modelled via the equality constraint "x_i² − x_i = 0". In Non Linear Programming (NLP), modeling a 0/1 variable with the polynomial equality constraint "x_i² − x_i = 0" and applying a standard descent algorithm would be considered "stupid"! Each class of problems has its own ad hoc tailored algorithms.

68 Even though the moment-SOS approach DOES NOT SPECIALIZE to each class of problems: It recognizes the class of (easy) SOS-convex problems, as FINITE CONVERGENCE occurs at the FIRST relaxation in the hierarchy. Finite convergence also occurs for general convex problems and generically for non-convex problems (NOT true for the LP-hierarchy). The SOS-hierarchy dominates other lift-and-project hierarchies (i.e. provides the best lower bounds) for hard 0/1 combinatorial optimization problems! The Computer Science community talks about a META-Algorithm.

73 A remarkable property: II FINITE CONVERGENCE of the SOS-hierarchy is GENERIC!... and provides a GLOBAL OPTIMALITY CERTIFICATE, the analogue for the NON CONVEX CASE of the KKT-OPTIMALITY conditions in the CONVEX CASE!

74 Theorem (Marshall, Nie). Let x* ∈ K be a global minimizer of
P : f* = min { f(x) : g_j(x) ≥ 0, j = 1,...,m },
and assume that:
(i) the gradients {∇g_j(x*)} are linearly independent,
(ii) strict complementarity holds (λ*_j > 0 whenever g_j(x*) = 0),
(iii) second-order sufficiency conditions hold at (x*, λ*) ∈ K × R_+^m.
Then f(x) − f* = σ*_0(x) + Σ_{j=1}^m σ*_j(x) g_j(x), ∀x ∈ R^n, for some SOS polynomials {σ*_j}.
Moreover, the conditions (i)-(ii)-(iii) HOLD GENERICALLY!

76 Certificates of positivity already exist in convex optimization:
f* = f(x*) = min { f(x) : g_j(x) ≥ 0, j = 1,...,m }
when f and −g_j are CONVEX. Indeed, if Slater's condition holds, there exist nonnegative KKT-multipliers λ* ∈ R_+^m such that:
∇f(x*) − Σ_{j=1}^m λ*_j ∇g_j(x*) = 0; λ*_j g_j(x*) = 0, j = 1,...,m.
... and so the Lagrangian L_λ(x) := f(x) − f* − Σ_j λ*_j g_j(x), which is convex, satisfies ∇L_λ(x*) = 0 and L_λ(x*) = 0, hence L_λ(x) ≥ 0 for all x. Therefore:
L_λ(x) ≥ 0 ⟹ f(x) − f* ≥ Σ_j λ*_j g_j(x) ≥ 0 ∀x ∈ K!

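This convex certificate can be checked numerically on a one-line toy instance (hypothetical example, not from the slides): f(x) = x², g(x) = x − 1 ≥ 0, so K = [1, ∞), x* = 1, f* = 1, and the KKT multiplier λ* = 2 makes the Lagrangian a perfect square.

```python
# Toy KKT positivity certificate for min x^2 s.t. x - 1 >= 0:
# L(x) = f(x) - f* - lambda*g(x) = x^2 - 1 - 2(x - 1) = (x - 1)^2 >= 0,
# which certifies f(x) - f* >= 0 on K.

f = lambda x: x**2
g = lambda x: x - 1
fstar, xstar, lam = 1.0, 1.0, 2.0

grad_f = lambda x: 2*x
grad_g = lambda x: 1.0

assert grad_f(xstar) - lam * grad_g(xstar) == 0   # stationarity
assert lam * g(xstar) == 0                        # complementarity
for x in (-2.0, 0.0, 1.0, 3.5):
    L = f(x) - fstar - lam * g(x)
    assert L >= 0                                 # L(x) = (x - 1)^2
```

Putinar's certificate on the next slide plays exactly this role in the non-convex case, with the scalar multipliers λ*_j replaced by SOS polynomial multipliers σ*_j.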
78 In summary:
KKT-OPTIMALITY when f and −g_j are CONVEX:
∇f(x*) − Σ_{j=1}^m λ*_j ∇g_j(x*) = 0 and f(x) − f* − Σ_{j=1}^m λ*_j g_j(x) ≥ 0 for all x ∈ R^n.
PUTINAR's CERTIFICATE in the non-CONVEX case:
∇f(x*) − Σ_{j=1}^m σ*_j(x*) ∇g_j(x*) = 0 and f(x) − f* − Σ_{j=1}^m σ*_j(x) g_j(x) ( = σ*_0(x) ) ≥ 0 for all x ∈ R^n,
for some SOS polynomials {σ*_j}, with σ*_j(x*) = λ*_j.

80 The no-free-lunch rule... The size of the SDP-relaxations grows rapidly with the original problem size. In particular:
O(n^{2d}) variables for the d-th SDP-relaxation in the hierarchy;
O(n^d) matrix size for the LMIs.
In view of the present status of SDP-solvers, only small to medium size problems can be solved by "standard" SDP-relaxations. How to handle larger size problems?

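The growth can be made precise by counting monomials: the d-th relaxation involves one variable y_α per monomial of degree at most 2d, i.e. C(n + 2d, n) variables, and a moment matrix of side C(n + d, n). A small sketch (helper names are illustrative):

```python
from math import comb

# Number of moments y_alpha with |alpha| <= 2d, and the side length of
# the corresponding moment matrix M_d(y).

def n_moments(n, d):
    return comb(n + 2*d, n)

def matrix_side(n, d):
    return comb(n + d, n)

assert n_moments(2, 1) == 6       # 1, x1, x2, x1^2, x1*x2, x2^2
assert matrix_side(2, 1) == 3     # monomial basis (1, x1, x2)
# growth is rapid: already > 10^4 moments for n = 20, d = 2
assert n_moments(20, 2) > 10_000
```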
85 Develop more efficient general purpose SDP-solvers (limited impact)... or perhaps dedicated solvers?
Exploit symmetries when present. Recent promising works by De Klerk, Gatermann, Gvozdenović, Laurent, Pasechnik, Parrilo, Schrijver..., in particular for combinatorial optimization problems. Algebraic techniques make it possible to define an equivalent SDP of much smaller size.
Exploit sparsity in the data. In general, each constraint involves only a small number of variables κ, and the cost criterion is a sum of polynomials each also involving a small number of variables. Recent works by Kim, Kojima, Lasserre, Muramatsu and Waki.

88 In recent works, Kojima's group has developed a systematic procedure to discover a sparsity pattern

I = {1, 2, ..., n} = ∪_{j=1}^p I_j,    J = {1, 2, ..., m} = ∪_{k=1}^p J_k,

for the problem

P: f* = min { f(x) : g_j(x) ≥ 0, j = 1, ..., m }.

Essentially, one looks for the maximal cliques {I_j} of some chordal extension of a graph associated with the problem.
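A toy sketch of this idea (hypothetical data, not Kojima's actual procedure): build the correlative sparsity graph — one node per variable, an edge whenever two variables appear together in some constraint — then read cliques off a greedy elimination ordering, whose fill-in edges amount to a chordal extension of that graph:

```python
# Toy sketch: clique decomposition of a chordal extension of the
# correlative sparsity graph, via greedy minimum-degree elimination.
def sparsity_cliques(n, constraint_vars):
    # adjacency: variables co-occurring in a constraint are connected
    adj = {i: set() for i in range(n)}
    for vars_ in constraint_vars:
        for u in vars_:
            adj[u] |= set(vars_) - {u}
    cliques = []
    remaining = set(range(n))
    while remaining:
        # eliminate a vertex of minimum remaining degree
        v = min(remaining, key=lambda u: len(adj[u] & remaining))
        nbrs = adj[v] & remaining
        clique = {v} | nbrs
        # crude maximality filter (only checks earlier cliques -- a sketch)
        if not any(clique <= c for c in cliques):
            cliques.append(clique)
        # fill-in: remaining neighbors of v become pairwise adjacent
        for u in nbrs:
            adj[u] |= nbrs - {u}
        remaining.remove(v)
    return cliques

# chain-structured constraints: g_j involves variables {j, j+1}
print(sparsity_cliques(5, [(0, 1), (1, 2), (2, 3), (3, 4)]))
```

On this chain the cliques are the pairs {j, j+1}, so each block of the sparse relaxation stays tiny even as n grows — exactly the situation the slide targets.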

89 Assumption I (sparsity pattern): f = Σ_{k=1}^p f_k with f_k ∈ R[X_i ; i ∈ I_k], and for every j ∈ J_k, g_j ∈ R[X_i ; i ∈ I_k].

Assumption II (the running intersection property): for every k = 2, ..., p,

I_k ∩ (I_1 ∪ ... ∪ I_{k-1}) ⊆ I_s    for some s ≤ k − 1.
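The running intersection property is easy to check mechanically. A minimal sketch (with made-up clique data):

```python
def has_rip(cliques):
    """Check the running intersection property for an ordered list of
    index sets I_1, ..., I_p: for every k >= 2, the intersection of I_k
    with I_1 ∪ ... ∪ I_{k-1} must be contained in some earlier I_s."""
    seen = set()
    for k, I_k in enumerate(cliques):
        if k > 0:
            inter = set(I_k) & seen
            if not any(inter <= set(I_s) for I_s in cliques[:k]):
                return False
        seen |= set(I_k)
    return True

print(has_rip([{1, 2}, {2, 3}, {3, 4}]))   # chain of cliques: True
print(has_rip([{1, 2}, {3, 4}, {1, 3}]))   # {1,3} straddles two earlier sets: False
```

Cliques of a chordal graph always admit an ordering with this property, which is why the chordal extension on the previous slide matters.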

91 Theorem (Lasserre; sparse version of Putinar's Positivstellensatz). Let Assumptions I and II hold, and suppose an Archimedean assumption holds for K. If f > 0 on K, then

f = Σ_{k=1}^p ( σ_{0k} + Σ_{j ∈ J_k} σ_{jk} g_j )

for some s.o.s. polynomials σ_{jk} ∈ R[X_i : i ∈ I_k].

Examples with n large (say n = 1000) and small clique size κ (e.g., κ = 3, 4, ..., 7) are solved relatively easily with SparsePOP, the software of Kojima's group.

93 There have also been recent attempts to use other types of algebraic certificates of positivity that avoid the size explosion caused by the semidefinite matrices associated with the SOS weights in Putinar's positivity certificate. See the recent work by Ahmadi et al. and by Lasserre, Toh and Zhang.
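One such cheaper certificate (the "DSOS" idea of Ahmadi and coauthors) replaces the positive-semidefiniteness condition on the Gram matrix by diagonal dominance, which implies PSD by Gershgorin's circle theorem and can be imposed with linear programming alone. A minimal sketch of the diagonal-dominance check, with a hypothetical 2×2 example:

```python
def is_diagonally_dominant(Q):
    """True if symmetric Q has each diagonal entry at least the sum of
    absolute off-diagonal entries in its row; by Gershgorin's circle
    theorem such a matrix is positive semidefinite."""
    n = len(Q)
    return all(
        Q[i][i] >= sum(abs(Q[i][j]) for j in range(n) if j != i)
        for i in range(n)
    )

# Gram matrix of (x - y)^2 = x^2 - 2xy + y^2 in the monomial basis (x, y):
print(is_diagonally_dominant([[1, -1], [-1, 1]]))   # True: a dsos certificate
print(is_diagonally_dominant([[1, -2], [-2, 1]]))   # False: not diagonally dominant
```

The trade-off is a restriction: every dsos polynomial is SOS, but not conversely, so the cheaper LP certificate can fail where the full SDP would succeed.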

The moment-lp and moment-sos approaches

The moment-lp and moment-sos approaches The moment-lp and moment-sos approaches LAAS-CNRS and Institute of Mathematics, Toulouse, France CIRM, November 2013 Semidefinite Programming Why polynomial optimization? LP- and SDP- CERTIFICATES of POSITIVITY

More information

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations LAAS-CNRS and Institute of Mathematics, Toulouse, France Tutorial, IMS, Singapore 2012 LP-relaxations LP- VERSUS SDP-relaxations

More information

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations LAAS-CNRS and Institute of Mathematics, Toulouse, France EECI Course: February 2016 LP-relaxations LP- VERSUS SDP-relaxations

More information

Optimization over Polynomials with Sums of Squares and Moment Matrices

Optimization over Polynomials with Sums of Squares and Moment Matrices Optimization over Polynomials with Sums of Squares and Moment Matrices Monique Laurent Centrum Wiskunde & Informatica (CWI), Amsterdam and University of Tilburg Positivity, Valuations and Quadratic Forms

More information

A new look at nonnegativity on closed sets

A new look at nonnegativity on closed sets A new look at nonnegativity on closed sets LAAS-CNRS and Institute of Mathematics, Toulouse, France IPAM, UCLA September 2010 Positivstellensatze for semi-algebraic sets K R n from the knowledge of defining

More information

Convex Optimization & Parsimony of L p-balls representation

Convex Optimization & Parsimony of L p-balls representation Convex Optimization & Parsimony of L p -balls representation LAAS-CNRS and Institute of Mathematics, Toulouse, France IMA, January 2016 Motivation Unit balls associated with nonnegative homogeneous polynomials

More information

6-1 The Positivstellensatz P. Parrilo and S. Lall, ECC

6-1 The Positivstellensatz P. Parrilo and S. Lall, ECC 6-1 The Positivstellensatz P. Parrilo and S. Lall, ECC 2003 2003.09.02.10 6. The Positivstellensatz Basic semialgebraic sets Semialgebraic sets Tarski-Seidenberg and quantifier elimination Feasibility

More information

CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION

CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION JEAN B. LASSERRE Abstract. We review several (and provide new) results on the theory of moments, sums of squares and basic semi-algebraic

More information

Towards Solving Bilevel Optimization Problems in Quantum Information Theory

Towards Solving Bilevel Optimization Problems in Quantum Information Theory Towards Solving Bilevel Optimization Problems in Quantum Information Theory ICFO-The Institute of Photonic Sciences and University of Borås 22 January 2016 Workshop on Linear Matrix Inequalities, Semidefinite

More information

On Polynomial Optimization over Non-compact Semi-algebraic Sets

On Polynomial Optimization over Non-compact Semi-algebraic Sets On Polynomial Optimization over Non-compact Semi-algebraic Sets V. Jeyakumar, J.B. Lasserre and G. Li Revised Version: April 3, 2014 Communicated by Lionel Thibault Abstract The optimal value of a polynomial

More information

4. Algebra and Duality

4. Algebra and Duality 4-1 Algebra and Duality P. Parrilo and S. Lall, CDC 2003 2003.12.07.01 4. Algebra and Duality Example: non-convex polynomial optimization Weak duality and duality gap The dual is not intrinsic The cone

More information

Croatian Operational Research Review (CRORR), Vol. 3, 2012

Croatian Operational Research Review (CRORR), Vol. 3, 2012 126 127 128 129 130 131 132 133 REFERENCES [BM03] S. Burer and R.D.C. Monteiro (2003), A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization, Mathematical Programming

More information

arxiv: v1 [math.oc] 26 Sep 2015

arxiv: v1 [math.oc] 26 Sep 2015 arxiv:1509.08021v1 [math.oc] 26 Sep 2015 Degeneracy in Maximal Clique Decomposition for Semidefinite Programs Arvind U. Raghunathan and Andrew V. Knyazev Mitsubishi Electric Research Laboratories 201 Broadway,

More information

Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets

Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets V. Jeyakumar, S. Kim, G. M. Lee and G. Li June 6, 2014 Abstract We propose a hierarchy of semidefinite

More information

Global Optimization with Polynomials

Global Optimization with Polynomials Global Optimization with Polynomials Geoffrey Schiebinger, Stephen Kemmerling Math 301, 2010/2011 March 16, 2011 Geoffrey Schiebinger, Stephen Kemmerling (Math Global 301, 2010/2011) Optimization with

More information

Strong duality in Lasserre s hierarchy for polynomial optimization

Strong duality in Lasserre s hierarchy for polynomial optimization Strong duality in Lasserre s hierarchy for polynomial optimization arxiv:1405.7334v1 [math.oc] 28 May 2014 Cédric Josz 1,2, Didier Henrion 3,4,5 Draft of January 24, 2018 Abstract A polynomial optimization

More information

POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS

POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS POLYNOMIAL OPTIMIZATION WITH SUMS-OF-SQUARES INTERPOLANTS Sercan Yıldız syildiz@samsi.info in collaboration with Dávid Papp (NCSU) OPT Transition Workshop May 02, 2017 OUTLINE Polynomial optimization and

More information

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Exact SDP Relaxations with Truncated Moment Matrix for Binary Polynomial Optimization Problems

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Exact SDP Relaxations with Truncated Moment Matrix for Binary Polynomial Optimization Problems MATHEMATICAL ENGINEERING TECHNICAL REPORTS Exact SDP Relaxations with Truncated Moment Matrix for Binary Polynomial Optimization Problems Shinsaku SAKAUE, Akiko TAKEDA, Sunyoung KIM, and Naoki ITO METR

More information

Degeneracy in Maximal Clique Decomposition for Semidefinite Programs

Degeneracy in Maximal Clique Decomposition for Semidefinite Programs MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Degeneracy in Maximal Clique Decomposition for Semidefinite Programs Raghunathan, A.U.; Knyazev, A. TR2016-040 July 2016 Abstract Exploiting

More information

Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets

Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets Milan Korda 1, Didier Henrion,3,4 Draft of December 1, 016 Abstract Moment-sum-of-squares hierarchies

More information

Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry

Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry Rekha R. Thomas University of Washington, Seattle References Monique Laurent, Sums of squares, moment matrices and optimization

More information

Research Reports on Mathematical and Computing Sciences

Research Reports on Mathematical and Computing Sciences ISSN 1342-284 Research Reports on Mathematical and Computing Sciences Exploiting Sparsity in Linear and Nonlinear Matrix Inequalities via Positive Semidefinite Matrix Completion Sunyoung Kim, Masakazu

More information

An explicit construction of distinguished representations of polynomials nonnegative over finite sets

An explicit construction of distinguished representations of polynomials nonnegative over finite sets An explicit construction of distinguished representations of polynomials nonnegative over finite sets Pablo A. Parrilo Automatic Control Laboratory Swiss Federal Institute of Technology Physikstrasse 3

More information

Minimum Ellipsoid Bounds for Solutions of Polynomial Systems via Sum of Squares

Minimum Ellipsoid Bounds for Solutions of Polynomial Systems via Sum of Squares Journal of Global Optimization (2005) 33: 511 525 Springer 2005 DOI 10.1007/s10898-005-2099-2 Minimum Ellipsoid Bounds for Solutions of Polynomial Systems via Sum of Squares JIAWANG NIE 1 and JAMES W.

More information

Convex Optimization. (EE227A: UC Berkeley) Lecture 28. Suvrit Sra. (Algebra + Optimization) 02 May, 2013

Convex Optimization. (EE227A: UC Berkeley) Lecture 28. Suvrit Sra. (Algebra + Optimization) 02 May, 2013 Convex Optimization (EE227A: UC Berkeley) Lecture 28 (Algebra + Optimization) 02 May, 2013 Suvrit Sra Admin Poster presentation on 10th May mandatory HW, Midterm, Quiz to be reweighted Project final report

More information

Introduction to Semidefinite Programming I: Basic properties a

Introduction to Semidefinite Programming I: Basic properties a Introduction to Semidefinite Programming I: Basic properties and variations on the Goemans-Williamson approximation algorithm for max-cut MFO seminar on Semidefinite Programming May 30, 2010 Semidefinite

More information

I.3. LMI DUALITY. Didier HENRION EECI Graduate School on Control Supélec - Spring 2010

I.3. LMI DUALITY. Didier HENRION EECI Graduate School on Control Supélec - Spring 2010 I.3. LMI DUALITY Didier HENRION henrion@laas.fr EECI Graduate School on Control Supélec - Spring 2010 Primal and dual For primal problem p = inf x g 0 (x) s.t. g i (x) 0 define Lagrangian L(x, z) = g 0

More information

Lecture 5. Theorems of Alternatives and Self-Dual Embedding

Lecture 5. Theorems of Alternatives and Self-Dual Embedding IE 8534 1 Lecture 5. Theorems of Alternatives and Self-Dual Embedding IE 8534 2 A system of linear equations may not have a solution. It is well known that either Ax = c has a solution, or A T y = 0, c

More information

LMI MODELLING 4. CONVEX LMI MODELLING. Didier HENRION. LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ. Universidad de Valladolid, SP March 2009

LMI MODELLING 4. CONVEX LMI MODELLING. Didier HENRION. LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ. Universidad de Valladolid, SP March 2009 LMI MODELLING 4. CONVEX LMI MODELLING Didier HENRION LAAS-CNRS Toulouse, FR Czech Tech Univ Prague, CZ Universidad de Valladolid, SP March 2009 Minors A minor of a matrix F is the determinant of a submatrix

More information

Optimization over Nonnegative Polynomials: Algorithms and Applications. Amir Ali Ahmadi Princeton, ORFE

Optimization over Nonnegative Polynomials: Algorithms and Applications. Amir Ali Ahmadi Princeton, ORFE Optimization over Nonnegative Polynomials: Algorithms and Applications Amir Ali Ahmadi Princeton, ORFE INFORMS Optimization Society Conference (Tutorial Talk) Princeton University March 17, 2016 1 Optimization

More information

SEMIDEFINITE PROGRAMMING VS. LP RELAXATIONS FOR POLYNOMIAL PROGRAMMING

SEMIDEFINITE PROGRAMMING VS. LP RELAXATIONS FOR POLYNOMIAL PROGRAMMING MATHEMATICS OF OPERATIONS RESEARCH Vol. 27, No. 2, May 2002, pp. 347 360 Printed in U.S.A. SEMIDEFINITE PROGRAMMING VS. LP RELAXATIONS FOR POLYNOMIAL PROGRAMMING JEAN B. LASSERRE We consider the global

More information

A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS

A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS Kartik K. Sivaramakrishnan Department of Mathematics North Carolina State University kksivara@ncsu.edu http://www4.ncsu.edu/

More information

Semidefinite Programming, Combinatorial Optimization and Real Algebraic Geometry

Semidefinite Programming, Combinatorial Optimization and Real Algebraic Geometry Semidefinite Programming, Combinatorial Optimization and Real Algebraic Geometry assoc. prof., Ph.D. 1 1 UNM - Faculty of information studies Edinburgh, 16. September 2014 Outline Introduction Definition

More information

Robust and Optimal Control, Spring 2015

Robust and Optimal Control, Spring 2015 Robust and Optimal Control, Spring 2015 Instructor: Prof. Masayuki Fujita (S5-303B) G. Sum of Squares (SOS) G.1 SOS Program: SOS/PSD and SDP G.2 Duality, valid ineqalities and Cone G.3 Feasibility/Optimization

More information

arxiv: v1 [math.oc] 31 Jan 2017

arxiv: v1 [math.oc] 31 Jan 2017 CONVEX CONSTRAINED SEMIALGEBRAIC VOLUME OPTIMIZATION: APPLICATION IN SYSTEMS AND CONTROL 1 Ashkan Jasour, Constantino Lagoa School of Electrical Engineering and Computer Science, Pennsylvania State University

More information

Representations of Positive Polynomials: Theory, Practice, and

Representations of Positive Polynomials: Theory, Practice, and Representations of Positive Polynomials: Theory, Practice, and Applications Dept. of Mathematics and Computer Science Emory University, Atlanta, GA Currently: National Science Foundation Temple University

More information

How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization

How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization CS-11-01 How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization Hayato Waki Department of Computer Science, The University of Electro-Communications

More information

Example: feasibility. Interpretation as formal proof. Example: linear inequalities and Farkas lemma

Example: feasibility. Interpretation as formal proof. Example: linear inequalities and Farkas lemma 4-1 Algebra and Duality P. Parrilo and S. Lall 2006.06.07.01 4. Algebra and Duality Example: non-convex polynomial optimization Weak duality and duality gap The dual is not intrinsic The cone of valid

More information

A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS IN POLYNOMIAL OPTIMIZATION

A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS IN POLYNOMIAL OPTIMIZATION A PARALLEL INTERIOR POINT DECOMPOSITION ALGORITHM FOR BLOCK-ANGULAR SEMIDEFINITE PROGRAMS IN POLYNOMIAL OPTIMIZATION Kartik K. Sivaramakrishnan Department of Mathematics North Carolina State University

More information

A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones

A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones Research Reports on Mathematical and Computing Sciences Series B : Operations Research Department of Mathematical and Computing Sciences Tokyo Institute of Technology 2-12-1 Oh-Okayama, Meguro-ku, Tokyo

More information

Detecting global optimality and extracting solutions in GloptiPoly

Detecting global optimality and extracting solutions in GloptiPoly Detecting global optimality and extracting solutions in GloptiPoly Didier HENRION 1,2 Jean-Bernard LASSERRE 1 1 LAAS-CNRS Toulouse 2 ÚTIA-AVČR Prague Part 1 Description of GloptiPoly Brief description

More information

The maximal stable set problem : Copositive programming and Semidefinite Relaxations

The maximal stable set problem : Copositive programming and Semidefinite Relaxations The maximal stable set problem : Copositive programming and Semidefinite Relaxations Kartik Krishnan Department of Mathematical Sciences Rensselaer Polytechnic Institute Troy, NY 12180 USA kartis@rpi.edu

More information

Semidefinite Programming Basics and Applications

Semidefinite Programming Basics and Applications Semidefinite Programming Basics and Applications Ray Pörn, principal lecturer Åbo Akademi University Novia University of Applied Sciences Content What is semidefinite programming (SDP)? How to represent

More information

BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints N. Ito, S. Kim, M. Kojima, A. Takeda, and K.-C. Toh March, 2018 Abstract.

More information

A CONIC DANTZIG-WOLFE DECOMPOSITION APPROACH FOR LARGE SCALE SEMIDEFINITE PROGRAMMING

A CONIC DANTZIG-WOLFE DECOMPOSITION APPROACH FOR LARGE SCALE SEMIDEFINITE PROGRAMMING A CONIC DANTZIG-WOLFE DECOMPOSITION APPROACH FOR LARGE SCALE SEMIDEFINITE PROGRAMMING Kartik Krishnan Advanced Optimization Laboratory McMaster University Joint work with Gema Plaza Martinez and Tamás

More information

Global Optimization with Polynomials

Global Optimization with Polynomials Global Optimization with Polynomials Deren Han Singapore-MIT Alliance National University of Singapore Singapore 119260 Email: smahdr@nus.edu.sg Abstract The class of POP (Polynomial Optimization Problems)

More information

III. Applications in convex optimization

III. Applications in convex optimization III. Applications in convex optimization nonsymmetric interior-point methods partial separability and decomposition partial separability first order methods interior-point methods Conic linear optimization

More information

Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs

Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs Laura Galli 1 Adam N. Letchford 2 Lancaster, April 2011 1 DEIS, University of Bologna, Italy 2 Department of Management Science,

More information

approximation algorithms I

approximation algorithms I SUM-OF-SQUARES method and approximation algorithms I David Steurer Cornell Cargese Workshop, 201 meta-task encoded as low-degree polynomial in R x example: f(x) = i,j n w ij x i x j 2 given: functions

More information

Lecture 6: Conic Optimization September 8

Lecture 6: Conic Optimization September 8 IE 598: Big Data Optimization Fall 2016 Lecture 6: Conic Optimization September 8 Lecturer: Niao He Scriber: Juan Xu Overview In this lecture, we finish up our previous discussion on optimality conditions

More information

Semidefinite programming and convex algebraic geometry

Semidefinite programming and convex algebraic geometry FoCM 2008 - SDP and convex AG p. 1/40 Semidefinite programming and convex algebraic geometry Pablo A. Parrilo www.mit.edu/~parrilo Laboratory for Information and Decision Systems Electrical Engineering

More information

COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS. Didier HENRION henrion

COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS. Didier HENRION   henrion COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS Didier HENRION www.laas.fr/ henrion October 2006 Geometry of LMI sets Given symmetric matrices F i we want to characterize the shape in R n of the LMI set F

More information

that a broad class of conic convex polynomial optimization problems, called

that a broad class of conic convex polynomial optimization problems, called JOTA manuscript No. (will be inserted by the editor) Exact Conic Programming Relaxations for a Class of Convex Polynomial Cone-Programs Vaithilingam Jeyakumar Guoyin Li Communicated by Levent Tunçel Abstract

More information

A new approximation hierarchy for polynomial conic optimization

A new approximation hierarchy for polynomial conic optimization A new approximation hierarchy for polynomial conic optimization Peter J.C. Dickinson Janez Povh July 11, 2018 Abstract In this paper we consider polynomial conic optimization problems, where the feasible

More information

Semidefinite programming lifts and sparse sums-of-squares

Semidefinite programming lifts and sparse sums-of-squares 1/15 Semidefinite programming lifts and sparse sums-of-squares Hamza Fawzi (MIT, LIDS) Joint work with James Saunderson (UW) and Pablo Parrilo (MIT) Cornell ORIE, October 2015 Linear optimization 2/15

More information

Copositive Programming and Combinatorial Optimization

Copositive Programming and Combinatorial Optimization Copositive Programming and Combinatorial Optimization Franz Rendl http://www.math.uni-klu.ac.at Alpen-Adria-Universität Klagenfurt Austria joint work with I.M. Bomze (Wien) and F. Jarre (Düsseldorf) IMA

More information

Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 5

Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 5 Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 5 Instructor: Farid Alizadeh Scribe: Anton Riabov 10/08/2001 1 Overview We continue studying the maximum eigenvalue SDP, and generalize

More information

Semialgebraic Relaxations using Moment-SOS Hierarchies

Semialgebraic Relaxations using Moment-SOS Hierarchies Semialgebraic Relaxations using Moment-SOS Hierarchies Victor Magron, Postdoc LAAS-CNRS 17 September 2014 SIERRA Seminar Laboratoire d Informatique de l Ecole Normale Superieure y b sin( par + b) b 1 1

More information

Modeling with semidefinite and copositive matrices

Modeling with semidefinite and copositive matrices Modeling with semidefinite and copositive matrices Franz Rendl http://www.math.uni-klu.ac.at Alpen-Adria-Universität Klagenfurt Austria F. Rendl, Singapore workshop 2006 p.1/24 Overview Node and Edge relaxations

More information

Analysis and synthesis: a complexity perspective

Analysis and synthesis: a complexity perspective Analysis and synthesis: a complexity perspective Pablo A. Parrilo ETH ZürichZ control.ee.ethz.ch/~parrilo Outline System analysis/design Formal and informal methods SOS/SDP techniques and applications

More information

DSOS/SDOS Programming: New Tools for Optimization over Nonnegative Polynomials

DSOS/SDOS Programming: New Tools for Optimization over Nonnegative Polynomials DSOS/SDOS Programming: New Tools for Optimization over Nonnegative Polynomials Amir Ali Ahmadi Princeton University Dept. of Operations Research and Financial Engineering (ORFE) Joint work with: Anirudha

More information

Sparsity in Sums of Squares of Polynomials

Sparsity in Sums of Squares of Polynomials Research Reports on Mathematical and Computing Sciences Series B : Operations Research Department of Mathematical and Computing Sciences Tokyo Institute of Technology -1-1 Oh-Okayama, Meguro-ku, Tokyo

More information

Relations between Semidefinite, Copositive, Semi-infinite and Integer Programming

Relations between Semidefinite, Copositive, Semi-infinite and Integer Programming Relations between Semidefinite, Copositive, Semi-infinite and Integer Programming Author: Faizan Ahmed Supervisor: Dr. Georg Still Master Thesis University of Twente the Netherlands May 2010 Relations

More information

Semidefinite Programming

Semidefinite Programming Semidefinite Programming Notes by Bernd Sturmfels for the lecture on June 26, 208, in the IMPRS Ringvorlesung Introduction to Nonlinear Algebra The transition from linear algebra to nonlinear algebra has

More information

c 2000 Society for Industrial and Applied Mathematics

c 2000 Society for Industrial and Applied Mathematics SIAM J. OPIM. Vol. 10, No. 3, pp. 750 778 c 2000 Society for Industrial and Applied Mathematics CONES OF MARICES AND SUCCESSIVE CONVEX RELAXAIONS OF NONCONVEX SES MASAKAZU KOJIMA AND LEVEN UNÇEL Abstract.

More information

m i=1 c ix i i=1 F ix i F 0, X O.

m i=1 c ix i i=1 F ix i F 0, X O. What is SDP? for a beginner of SDP Copyright C 2005 SDPA Project 1 Introduction This note is a short course for SemiDefinite Programming s SDP beginners. SDP has various applications in, for example, control

More information

Lecture 3: Semidefinite Programming

Lecture 3: Semidefinite Programming Lecture 3: Semidefinite Programming Lecture Outline Part I: Semidefinite programming, examples, canonical form, and duality Part II: Strong Duality Failure Examples Part III: Conditions for strong duality

More information

Strange Behaviors of Interior-point Methods. for Solving Semidefinite Programming Problems. in Polynomial Optimization

Strange Behaviors of Interior-point Methods. for Solving Semidefinite Programming Problems. in Polynomial Optimization CS-08-02 Strange Behaviors of Interior-point Methods for Solving Semidefinite Programming Problems in Polynomial Optimization Hayato Waki, Maho Nakata, and Masakazu Muramatsu Department of Computer Science,

More information

Dimension reduction for semidefinite programming

Dimension reduction for semidefinite programming 1 / 22 Dimension reduction for semidefinite programming Pablo A. Parrilo Laboratory for Information and Decision Systems Electrical Engineering and Computer Science Massachusetts Institute of Technology

More information

Semidefinite approximations of projections and polynomial images of semialgebraic sets

Semidefinite approximations of projections and polynomial images of semialgebraic sets Semidefinite approximations of projections and polynomial images of semialgebraic sets Victor Magron 1 Didier Henrion 2,3,4 Jean-Bernard Lasserre 2,3 October 17, 2014 Abstract Given a compact semialgebraic

More information

Approximate Optimal Designs for Multivariate Polynomial Regression

Approximate Optimal Designs for Multivariate Polynomial Regression Approximate Optimal Designs for Multivariate Polynomial Regression Fabrice Gamboa Collaboration with: Yohan de Castro, Didier Henrion, Roxana Hess, Jean-Bernard Lasserre Universität Potsdam 16th of February

More information

Lift-and-Project Techniques and SDP Hierarchies

Lift-and-Project Techniques and SDP Hierarchies MFO seminar on Semidefinite Programming May 30, 2010 Typical combinatorial optimization problem: max c T x s.t. Ax b, x {0, 1} n P := {x R n Ax b} P I := conv(k {0, 1} n ) LP relaxation Integral polytope

More information

EE 227A: Convex Optimization and Applications October 14, 2008

EE 227A: Convex Optimization and Applications October 14, 2008 EE 227A: Convex Optimization and Applications October 14, 2008 Lecture 13: SDP Duality Lecturer: Laurent El Ghaoui Reading assignment: Chapter 5 of BV. 13.1 Direct approach 13.1.1 Primal problem Consider

More information

Optimality Conditions for Constrained Optimization

Optimality Conditions for Constrained Optimization 72 CHAPTER 7 Optimality Conditions for Constrained Optimization 1. First Order Conditions In this section we consider first order optimality conditions for the constrained problem P : minimize f 0 (x)

More information

MIT Algebraic techniques and semidefinite optimization February 14, Lecture 3

MIT Algebraic techniques and semidefinite optimization February 14, Lecture 3 MI 6.97 Algebraic techniques and semidefinite optimization February 4, 6 Lecture 3 Lecturer: Pablo A. Parrilo Scribe: Pablo A. Parrilo In this lecture, we will discuss one of the most important applications

More information

Research Reports on Mathematical and Computing Sciences

Research Reports on Mathematical and Computing Sciences ISSN 1342-2804 Research Reports on Mathematical and Computing Sciences Doubly Nonnegative Relaxations for Quadratic and Polynomial Optimization Problems with Binary and Box Constraints Sunyoung Kim, Masakazu

More information

Global Optimization of Polynomials

Global Optimization of Polynomials Semidefinite Programming Lecture 9 OR 637 Spring 2008 April 9, 2008 Scribe: Dennis Leventhal Global Optimization of Polynomials Recall we were considering the problem min z R n p(z) where p(z) is a degree

More information

Research Reports on Mathematical and Computing Sciences

Research Reports on Mathematical and Computing Sciences ISSN 1342-2804 Research Reports on Mathematical and Computing Sciences Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity Hayato Waki,

More information

Non-commutative polynomial optimization

Non-commutative polynomial optimization Non-commutative polynomial optimization S. Pironio 1, M. Navascuès 2, A. Acín 3 1 Laboratoire d Information Quantique (Brussels) 2 Department of Mathematics (Bristol) 3 ICFO-The Institute of Photonic Sciences

More information

Sparse Matrix Theory and Semidefinite Optimization

Sparse Matrix Theory and Semidefinite Optimization Sparse Matrix Theory and Semidefinite Optimization Lieven Vandenberghe Department of Electrical Engineering University of California, Los Angeles Joint work with Martin S. Andersen and Yifan Sun Third

More information

An Introduction to Polynomial and Semi-Algebraic Optimization

An Introduction to Polynomial and Semi-Algebraic Optimization An Introduction to Polynomial and Semi-Algebraic Optimization This is the first comprehensive introduction to the powerful moment approach for solving global optimization problems (and some related problems)

15. Conic optimization

L. Vandenberghe, EE236C (Spring 2016). Conic linear program, examples, modeling, duality. Generalized (conic) inequalities. Conic inequality: a constraint x ∈ K, where K is a convex cone...
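As a small illustration of a conic inequality x ∈ K, the following sketch tests membership in the second-order cone, a standard example of a convex cone K (the test values are made up):

```python
import math

def in_second_order_cone(x):
    """Check x = (t, u_1, ..., u_n) ∈ K = {(t, u) : ||u||_2 <= t},
    the second-order (Lorentz) cone, a standard convex cone K for
    the conic inequality x ∈ K."""
    t, u = x[0], x[1:]
    return math.sqrt(sum(ui * ui for ui in u)) <= t

print(in_second_order_cone([5.0, 3.0, 4.0]))  # True: ||(3, 4)|| = 5 <= 5
print(in_second_order_cone([4.0, 3.0, 4.0]))  # False: ||(3, 4)|| = 5 > 4
```

Taking K to be the nonnegative orthant recovers ordinary linear inequalities; taking K to be the positive semidefinite cone gives semidefinite programming.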

Optimality, Duality, Complementarity for Constrained Optimization

Stephen Wright, University of Wisconsin-Madison, May 2014. Linear...

SDP Relaxations for MAXCUT

From Random Hyperplanes to Sum-of-Squares Certificates. Ahmed Abdelkader, CATS @ UMD, March 3, 2017. Overview: 1. MAXCUT, Hardness and UGC; 2. LP...
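The "random hyperplanes" in the title refers to Goemans-Williamson rounding of the MAXCUT SDP. A minimal sketch of the rounding step only, assuming unit vectors already obtained from an SDP solution (the SDP solve itself is omitted):

```python
import random

def hyperplane_round(vectors):
    """Goemans-Williamson style rounding: given unit vectors v_i
    (one per vertex, e.g. from the MAXCUT SDP solution), draw a
    random hyperplane through the origin with Gaussian normal r,
    and assign each vertex the side sign(<v_i, r>)."""
    d = len(vectors[0])
    r = [random.gauss(0.0, 1.0) for _ in range(d)]
    return [1 if sum(vi * ri for vi, ri in zip(v, r)) >= 0 else -1
            for v in vectors]

random.seed(0)
# Two antipodal vectors (an edge the SDP "wants" to cut) land on
# opposite sides of almost every hyperplane.
signs = hyperplane_round([(1.0, 0.0), (-1.0, 0.0)])
print(signs[0] != signs[1])  # True
```

In the full algorithm this rounding cuts each edge with probability proportional to the angle between its endpoints' vectors, which yields the well-known 0.878 approximation guarantee.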

Real solving polynomial equations with semidefinite programming

Real solving polynomial equations with semidefinite programming .5cm. Real solving polynomial equations with semidefinite programming Jean Bernard Lasserre - Monique Laurent - Philipp Rostalski LAAS, Toulouse - CWI, Amsterdam - ETH, Zürich LAW 2008 Real solving polynomial

A PARALLEL TWO-STAGE INTERIOR POINT DECOMPOSITION ALGORITHM FOR LARGE SCALE STRUCTURED SEMIDEFINITE PROGRAMS

Kartik K. Sivaramakrishnan, Research Division, Axioma, Inc., Atlanta. kksivara@axiomainc.com, http://www4.ncsu.edu/

The convex algebraic geometry of rank minimization

Pablo A. Parrilo, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology. International Symposium on Mathematical Programming...

Semidefinite and Second Order Cone Programming Seminar, Fall 2012. Project: Robust Optimization and its Application to Robust Portfolio Optimization

Instructor: Farid Alizadeh. Author: Ai Kagawa. 12/12/2012.

Lagrangian-Conic Relaxations, Part I: A Unified Framework and Its Applications to Quadratic Optimization Problems

Naohiko Arima, Sunyoung Kim, Masakazu Kojima, and Kim-Chuan Toh. Abstract: In Part I of...

A ten page introduction to conic optimization

CHAPTER 1. This background chapter gives an introduction to conic optimization. We do not give proofs, but focus on the tools and concepts that are important for this thesis.

Convex Optimization in Classification Problems

New Trends in Optimization and Computational Algorithms, December 9-13, 2001. Laurent El Ghaoui, Department of EECS, UC Berkeley. elghaoui@eecs.berkeley.edu

Copositive Programming and Combinatorial Optimization

Franz Rendl (http://www.math.uni-klu.ac.at), Alpen-Adria-Universität Klagenfurt, Austria. Joint work with M. Bomze (Wien) and F. Jarre (Düsseldorf) and...

Certified Roundoff Error Bounds using Semidefinite Programming

Victor Magron, CNRS VERIMAG. Joint work with G. Constantinides and A. Donaldson. INRIA Mescal Team Seminar, 19 November 2015.

A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint

Iranian Journal of Operations Research, Vol. 2, No. 2, pp. 29-34. M. Salahi. Semidefinite...

A notion of Total Dual Integrality for Convex, Semidefinite and Extended Formulations

A notion of Total Dual Integrality for Convex, Semidefinite and Extended Formulations A notion of for Convex, Semidefinite and Extended Formulations Marcel de Carli Silva Levent Tunçel April 26, 2018 A vector in R n is integral if each of its components is an integer, A vector in R n is

Lower bounds on the size of semidefinite relaxations. David Steurer Cornell

David Steurer (Cornell), James R. Lee (Washington), Prasad Raghavendra (Berkeley). Institute for Advanced Study, November 2015. Overview of results: unconditional...

SPECTRAHEDRA. Bernd Sturmfels UC Berkeley

Bernd Sturmfels, UC Berkeley. GAeL Lecture I on Convex Algebraic Geometry, Coimbra, Portugal, Monday, June 7, 2010. Positive Semidefinite Matrices: for a real symmetric n × n matrix A the following...

The Geometry of Semidefinite Programming. Bernd Sturmfels UC Berkeley

Positive Semidefinite Matrices: for a real symmetric n × n matrix A, the following are equivalent: all n eigenvalues of A are positive real...
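The eigenvalue condition above (all n eigenvalues positive) is equivalent to the success of a Cholesky factorization A = L L^T, which gives a practical test. A minimal pure-Python sketch with made-up test matrices:

```python
def is_positive_definite(A, tol=1e-12):
    """Attempt a Cholesky factorization A = L L^T; it succeeds
    exactly when the real symmetric matrix A is positive definite,
    i.e. when all of its eigenvalues are positive."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s  # squared diagonal pivot
                if d <= tol:     # non-positive pivot: not positive definite
                    return False
                L[i][i] = d ** 0.5
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return True

print(is_positive_definite([[2.0, -1.0], [-1.0, 2.0]]))  # True (eigenvalues 1 and 3)
print(is_positive_definite([[1.0, 2.0], [2.0, 1.0]]))    # False (eigenvalues 3 and -1)
```

Semidefinite programming optimizes over the closure of this set, the cone of positive semidefinite matrices, whose boundary is where a diagonal pivot hits zero.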

Advanced SDPs Lecture 6: March 16, 2017

Lecturers: Nikhil Bansal and Daniel Dadush. Scribe: Daniel Dadush. 6.1 Notation: let N = {0, 1, ...} denote the set of non-negative integers. For α ∈ N^n, define the...

Detecting global optimality and extracting solutions in GloptiPoly

Didier Henrion, Jean-Bernard Lasserre. September 5, 2005. Abstract: GloptiPoly is a Matlab/SeDuMi add-on to build and solve convex linear...
