The moment-lp and moment-sos approaches
1 The moment-lp and moment-sos approaches. LAAS-CNRS and Institute of Mathematics, Toulouse, France. CIRM, November 2013.
2 Semidefinite Programming. Why polynomial optimization? LP- and SDP-CERTIFICATES of POSITIVITY. The moment-lp and moment-sos approaches. Two examples outside optimization: approximating sets defined with quantifiers, and convex polynomial underestimators.
8 Semidefinite Programming. The CONVEX optimization problem

P : min_{x ∈ R^n} { c^T x : Σ_{i=1}^n A_i x_i ⪰ b },

where c ∈ R^n and b, A_i ∈ S^m (m × m symmetric matrices), is called a semidefinite program. The notation "M ⪰ 0" means that the real symmetric matrix M is positive semidefinite, i.e., all its (real) EIGENVALUES are nonnegative.
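The eigenvalue characterization of "⪰ 0" above can be checked directly. A minimal sketch in numpy; the two test matrices are arbitrary illustrations, not taken from the talk:

```python
import numpy as np

def is_psd(M, tol=1e-9):
    """A real symmetric matrix is PSD iff all its eigenvalues are nonnegative."""
    M = np.asarray(M, dtype=float)
    assert np.allclose(M, M.T), "matrix must be symmetric"
    return bool(np.min(np.linalg.eigvalsh(M)) >= -tol)

# Illustration on arbitrary symmetric matrices (not from the talk):
print(is_psd([[2.0, 1.0], [1.0, 2.0]]))   # eigenvalues 1 and 3 -> True
print(is_psd([[1.0, 2.0], [2.0, 1.0]]))   # eigenvalues -1 and 3 -> False
```

An SDP solver does essentially this test implicitly, at every candidate point of the feasible set.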
9 Example. P : min { x_1 + x_2 : s.t. [ 3 + 2x_1 + x_2 , x_1 + 5 ; x_1 + 5 , x_1 + 2x_2 ] ⪰ 0 }, or, equivalently, P : min { x_1 + x_2 : s.t. [ 3 5 ; 5 0 ] + x_1 [ 2 1 ; 1 1 ] + x_2 [ 1 0 ; 0 2 ] ⪰ 0 }.
10 P and its dual P* are convex problems that are solvable in polynomial time to arbitrary precision ε > 0. SDP is the generalization to the convex cone S^m_+ (X ⪰ 0) of Linear Programming on the convex polyhedral cone R^m_+ (x ≥ 0). Indeed, with DIAGONAL matrices, Semidefinite Programming = Linear Programming! Several academic SDP software packages exist (e.g., the MATLAB LMI toolbox, SeDuMi, SDPT3, ...). However, so far, the size limitation is more severe than for LP software packages. Pioneering contributions by A. Nemirovski, Y. Nesterov, N.Z. Shor, D.B. Yudin, ...
13 Why Polynomial Optimization? After all... the polynomial optimization problem f* = min { f(x) : g_j(x) ≥ 0, j = 1, ..., m } is just a particular case of Non Linear Programming (NLP)! True!... if one is interested in a LOCAL optimum only!!
15 When searching for a local minimum... Optimality conditions and descent algorithms use basic tools from REAL and CONVEX analysis and linear algebra. The focus is on how to improve f by looking at a NEIGHBORHOOD of a nominal point x ∈ K, i.e., LOCALLY AROUND x; in general, no GLOBAL property of x can be inferred. The fact that f and the g_j are POLYNOMIALS does not help much!
18 BUT for GLOBAL Optimization... the picture is different! Remember that for the GLOBAL minimum f*: f* = sup { λ : f(x) − λ ≥ 0 for all x ∈ K }. ... and so to compute f* one needs TRACTABLE CERTIFICATES of POSITIVITY on K!
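The identity f* = sup { λ : f − λ ≥ 0 on K } can be illustrated numerically on a one-dimensional instance. Both the polynomial f(x) = x⁴ − x² and the set K = [−1, 1] below are our choices for illustration, not from the talk; the supremum of the feasible λ is exactly the global minimum −1/4:

```python
import numpy as np

# Illustration (our choice): f(x) = x**4 - x**2 on K = [-1, 1].
# The largest lambda with f - lambda >= 0 on K is the global minimum f*.
f = lambda x: x**4 - x**2
xs = np.linspace(-1.0, 1.0, 100001)   # dense grid standing in for K
f_star = f(xs).min()                  # attained near x = +-1/sqrt(2)
assert (f(xs) >= f_star).all()        # lambda = f_star is feasible on the grid
print(round(float(f_star), 6))        # prints -0.25
```

Of course a grid is not a certificate: the whole point of the talk is to replace this brute-force check by an algebraic positivity certificate.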
21 REAL ALGEBRAIC GEOMETRY helps!!!! Indeed, POWERFUL CERTIFICATES OF POSITIVITY EXIST! Moreover... and importantly, such certificates are amenable to PRACTICAL COMPUTATION! (Stronger Positivstellensätze exist for analytic functions but are useless from a computational viewpoint.)
24 SOS-based certificate. K = { x : g_j(x) ≥ 0, j = 1, ..., m }. Theorem (Putinar's Positivstellensatz). If K is compact (+ a technical Archimedean assumption) and f > 0 on K, then

f(x) = σ_0(x) + Σ_{j=1}^m σ_j(x) g_j(x), for all x ∈ R^n,   (*)

for some SOS polynomials (σ_j) ⊂ R[x]. Testing whether (*) holds for some SOS (σ_j) ⊂ R[x] with a degree bound is SOLVING an SDP!
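A Putinar certificate is just a polynomial identity, so once found it can be verified by evaluation. The toy instance below (K = [−1, 1] with g_1 = 1 − x², f = x + 2, and the particular SOS weights) is our own illustration, not an example from the talk:

```python
import numpy as np

# Our toy instance: K = {x : g1(x) = 1 - x**2 >= 0} = [-1, 1], f(x) = x + 2 > 0 on K.
# Claimed Putinar certificate: f = sigma0 + sigma1 * g1 with
#   sigma0(x) = (x + 1)**2 / 2 + 1   (SOS: ((x+1)/sqrt(2))**2 + 1**2)
#   sigma1(x) = 1/2                  (SOS: a nonnegative constant)
f      = lambda x: x + 2.0
g1     = lambda x: 1.0 - x**2
sigma0 = lambda x: (x + 1.0)**2 / 2.0 + 1.0
sigma1 = lambda x: 0.5

xs = np.linspace(-5.0, 5.0, 1001)            # the identity must hold on ALL of R
assert np.allclose(f(xs), sigma0(xs) + sigma1(xs) * g1(xs))
assert (sigma0(xs) >= 0).all()               # SOS polynomials are nonnegative
print("certificate verified")
```

Finding the σ_j (rather than verifying a given guess) is the task the SDP solves: their Gram matrices are the semidefinite variables.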
26 LP-based certificate. K = { x : g_j(x) ≥ 0; 1 − g_j(x) ≥ 0, j = 1, ..., m }. Theorem (Krivine-Vasilescu-Handelman's Positivstellensatz). Let K be compact and let the family { g_j, (1 − g_j) } generate R[x]. If f > 0 on K, then

f(x) = Σ_{α,β} c_{αβ} Π_{j=1}^m g_j(x)^{α_j} (1 − g_j(x))^{β_j}, for all x ∈ R^n,   (**)

for some NONNEGATIVE scalars (c_{αβ}). Testing whether (**) holds for some NONNEGATIVE (c_{αβ}) with |α| + |β| ≤ M is SOLVING an LP!
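At a fixed degree bound, the search reduces to matching polynomial coefficients with nonnegative multipliers, which is a plain LP. The sketch below sets up the degree-2 version with scipy's linprog on an instance of our choosing (K = [0, 1] with g_1(x) = x, and f(x) = x − x², whose minimum on K is 0); it is an illustration of the certificate, not code from the talk:

```python
import numpy as np
from scipy.optimize import linprog

# Our instance: K = [0,1], g1(x) = x, 1 - g1(x) = 1 - x, f(x) = x - x**2.
# Find the largest lambda with
#   f - lambda = sum_{a+b<=2} c_ab * x**a * (1-x)**b,  c_ab >= 0.
# Coefficients of x**a * (1-x)**b in the monomial basis (1, x, x**2):
prods = {(0, 0): [1, 0, 0], (1, 0): [0, 1, 0], (0, 1): [1, -1, 0],
         (2, 0): [0, 0, 1], (1, 1): [0, 1, -1], (0, 2): [1, -2, 1]}
keys = list(prods)

# Variables: [lambda, c_ab...]. Equality rows match the coefficients of
# f - lambda monomial by monomial; lambda only enters the constant term.
A_eq = np.zeros((3, 1 + len(keys)))
A_eq[0, 0] = 1.0
for j, k in enumerate(keys):
    A_eq[:, 1 + j] = prods[k]
b_eq = np.array([0.0, 1.0, -1.0])             # coefficients of f(x) = x - x**2

res = linprog(c=[-1.0] + [0.0] * len(keys),   # linprog minimizes, so maximize lambda
              A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] + [(0, None)] * len(keys))
print(res.status, round(float(res.x[0]), 6))  # optimal lambda is the LP lower bound
```

Here the bound is already tight (λ = 0 = min f on K), since c_{(1,1)} = 1 reproduces f exactly; in general the LP hierarchy only converges as the degree bound grows.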
28 SUCH POSITIVITY CERTIFICATES make it possible to infer GLOBAL properties of FEASIBILITY and OPTIMALITY... the analogues of (well-known) earlier ones valid in the CONVEX CASE ONLY: Farkas Lemma ↔ Krivine-Stengle; KKT-optimality conditions ↔ Schmüdgen-Putinar.
32 In addition, polynomials NONNEGATIVE ON A SET K ⊂ R^n are ubiquitous. They appear in many important applications, and not only in global optimization! For instance, one may also want: To approximate sets defined with QUANTIFIERS, like, e.g., R_f := { x ∈ B : f(x, y) ≤ 0 for all y such that (x, y) ∈ K } and D_f := { x ∈ B : f(x, y) ≤ 0 for some y such that (x, y) ∈ K }, where f ∈ R[x, y] and B is a simple set (box, ellipsoid). To compute convex polynomial underestimators p ≤ f of a polynomial f on a box B ⊂ R^n. (Very useful in MINLP.)
34 The moment-lp and moment-sos approaches consist of using a certain type of positivity certificate (Krivine-Stengle's or Putinar's certificate) in potentially any application where such a characterization is needed. (Global optimization is only one example.) In many situations this amounts to solving a HIERARCHY of LINEAR PROGRAMS or SEMIDEFINITE PROGRAMS... of increasing size!
38 LP- and SDP-hierarchies for optimization. Replace f* = sup_λ { λ : f(x) − λ ≥ 0 for all x ∈ K } with: The SDP-hierarchy indexed by d ∈ N:

f*_d = sup_{λ, σ_j SOS} { λ : f − λ = σ_0 + Σ_{j=1}^m σ_j g_j ; deg(σ_0), deg(σ_j g_j) ≤ 2d },

or the LP-hierarchy indexed by d ∈ N:

θ_d = sup_{λ, c_{αβ} ≥ 0} { λ : f − λ = Σ_{α,β} c_{αβ} Π_{j=1}^m g_j^{α_j} (1 − g_j)^{β_j} ; |α| + |β| ≤ 2d }.
40 Theorem. Both sequences (f*_d) and (θ_d), d ∈ N, are MONOTONE NONDECREASING, and when K is compact (and satisfies a technical Archimedean assumption): f* = lim_{d→∞} f*_d = lim_{d→∞} θ_d.
41 What makes this approach exciting is that it is at the crossroads of several disciplines/applications: commutative, non-commutative, and non-linear ALGEBRA; real algebraic geometry and functional analysis; optimization and convex analysis; computational complexity in computer science; all of which BENEFIT from the interactions! As mentioned... potential applications are ENDLESS!
43 The approach has already proved useful and successful in applications of modest problem size, notably in optimization, control, robust control, optimal control, estimation, computer vision, etc. It HAS initiated and stimulated new research issues: in Convex Algebraic Geometry (e.g., semidefinite representation of convex sets, algebraic degree of semidefinite programming and polynomial optimization); in Computational Algebra (e.g., for solving polynomial equations via SDP and border bases); in Computational Complexity, where LP- and SDP-HIERARCHIES have become an important tool to analyze Hardness of Approximation for 0/1 combinatorial problems (with links to quantum computing).
44 A remarkable property of the SOS hierarchy: I. When solving the optimization problem P : f* = min { f(x) : g_j(x) ≥ 0, j = 1, ..., m }, one does NOT distinguish between CONVEX, CONTINUOUS NON-CONVEX, and 0/1 (and DISCRETE) problems! A boolean variable x_i is modelled via the equality constraint "x_i² − x_i = 0". In Non Linear Programming (NLP), modeling a 0/1 variable with the polynomial equality constraint "x_i² − x_i = 0" and applying a standard descent algorithm would be considered "stupid"! Each class of problems has its own ad hoc tailored algorithms.
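The modeling trick itself is elementary: the equation x² − x = 0 has exactly the solutions {0, 1}, so adding it to a continuous model makes the variable boolean. A one-line numerical sanity check (of the modeling trick only, not of the hierarchy):

```python
import numpy as np

# x**2 - x = 0 has exactly the two roots 0 and 1, so the constraint
# forces a continuous variable to be boolean.
roots = np.roots([1.0, -1.0, 0.0])            # roots of x**2 - x
print(sorted(float(r) for r in roots.real))   # -> [0.0, 1.0]
```

What is remarkable is not the trick but that the SOS hierarchy treats such a constraint exactly like any other polynomial equality.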
47 Even though the moment-sos approach DOES NOT SPECIALIZE to each class of problems: It recognizes the class of (easy) SOS-convex problems, as FINITE CONVERGENCE occurs at the FIRST relaxation in the hierarchy. (Finite convergence also occurs for general convex problems.) (This is NOT true for the LP-hierarchy.) The SOS-hierarchy dominates other lift-and-project hierarchies (i.e., it provides the best lower bounds) for hard 0/1 combinatorial optimization problems!
50 A remarkable property: II. FINITE CONVERGENCE of the SOS-hierarchy is GENERIC!... and it provides a GLOBAL OPTIMALITY CERTIFICATE, the analogue for the NON-CONVEX CASE of the KKT-OPTIMALITY conditions in the CONVEX CASE!
51 Theorem (Marshall, Nie). Let x* ∈ K be a global minimizer of P : f* = min { f(x) : g_j(x) ≥ 0, j = 1, ..., m }, and assume that: (i) the gradients { ∇g_j(x*) } are linearly independent; (ii) strict complementarity holds (λ*_j g_j(x*) = 0 for all j); (iii) second-order sufficiency conditions hold at (x*, λ*) ∈ K × R^m_+. Then

f(x) − f* = σ*_0(x) + Σ_{j=1}^m σ*_j(x) g_j(x), for all x ∈ R^n,

for some SOS polynomials { σ*_j }. Moreover, the conditions (i)-(ii)-(iii) HOLD GENERICALLY!
53 In summary: KKT-OPTIMALITY when f and the g_j are CONVEX:

∇f(x*) − Σ_{j=1}^m λ*_j ∇g_j(x*) = 0,  and  f(x) − f* − Σ_{j=1}^m λ*_j g_j(x) ≥ 0 for all x ∈ R^n.

PUTINAR's CERTIFICATE in the NON-CONVEX case:

∇f(x*) − Σ_{j=1}^m σ*_j(x*) ∇g_j(x*) = 0,  and  f(x) − f* − Σ_{j=1}^m σ*_j(x) g_j(x) (= σ*_0(x)) ≥ 0 for all x ∈ R^n,

for some SOS { σ*_j }, and σ*_j(x*) = λ*_j.
55 II. Approximation of sets with quantifiers. Let f ∈ R[x, y] and let K ⊂ R^n × R^p be the semi-algebraic set K := { (x, y) : g_j(x, y) ≥ 0, j = 1, ..., m }, and let B ⊂ R^n be the unit ball or the box [−1, 1]^n. Suppose that one wants to approximate the set R_f := { x ∈ B : f(x, y) ≤ 0 for all y such that (x, y) ∈ K } as closely as desired by a sequence of sets of the form Θ_k := { x ∈ B : J_k(x) ≤ 0 } for some polynomials J_k.
56 With g_0 = 1, K ⊂ R^n × R^p and k ∈ N, let

Q_k(g) := { Σ_{j=0}^m σ_j(x, y) g_j(x, y) : σ_j ∈ Σ[x, y], deg(σ_j g_j) ≤ 2k }.

Let F(x) := max { f(x, y) : (x, y) ∈ K }, and for every integer k consider the optimization problem

ρ_k = min_{J ∈ R[x]_k} { ∫_B (J − F) dx : J(x) − f(x, y) ∈ Q_k(g) }.
57 1. The criterion ∫_B (J − F) dx = − ∫_B F dx (unknown but constant) + Σ_α J_α ∫_B x^α dx (easy to compute) is LINEAR in the coefficients J_α of the unknown polynomial J ∈ R[x]_k! 2. The constraint J(x) − f(x, y) = Σ_{j=0}^m σ_j(x, y) g_j(x, y) is just LINEAR CONSTRAINTS + LMIs!
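The reason the criterion is linear is that the moments ∫_B x^α dx can be tabulated once and for all. A minimal sketch for the box B = [−1, 1]^n (our choice for illustration), where ∫ x^k dx over [−1, 1] is 0 for odd k and 2/(k+1) for even k:

```python
import numpy as np
from itertools import product

# Moments of the box B = [-1, 1]^n (our illustrative choice of B):
# integral of x1**a1 * ... * xn**an factorizes coordinate by coordinate.
def box_moment(alpha):
    return float(np.prod([0.0 if a % 2 else 2.0 / (a + 1) for a in alpha]))

n, deg = 2, 2
alphas = [a for a in product(range(deg + 1), repeat=n) if sum(a) <= deg]
moments = {a: box_moment(a) for a in alphas}

# The criterion for J(x) = sum_alpha J_alpha x**alpha is then a dot product
# with the moment vector, hence LINEAR in the J_alpha:
J = {(0, 0): 1.0, (2, 0): 3.0}                 # hypothetical J(x) = 1 + 3*x1**2
integral_J = sum(c * moments[a] for a, c in J.items())
print(integral_J)   # 1*4 + 3*(4/3) = 8.0
```

The unknown constant −∫_B F dx shifts the objective but does not affect the minimizer, which is why F never needs to be computed.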
59 Hence the optimization problem ρ_k = min_{J ∈ R[x]_k} { ∫_B (J − F) dx : J(x) − f(x, y) ∈ Q_k(g) } IS AN SDP! Moreover, it has an optimal solution J*_k ∈ R[x]_k! Alternatively, if one uses LP-based positivity certificates for J(x) − f(x, y), one ends up solving an LP! From the definition of J*_k, the sublevel sets Θ_k := { x ∈ B : J*_k(x) ≤ 0 } ⊂ R_f, k ∈ N, provide a nested sequence of INNER approximations of R_f.
62 Theorem (Lass). (Strong) convergence in the L_1(B)-norm takes place, that is: lim_{k→∞} ∫_B |J*_k − F| dx = 0, and if in addition the set { x ∈ B : F(x) = 0 } has Lebesgue measure zero, then lim_{k→∞} VOL(R_f \ Θ_k) = 0.
64 Ex: Polynomial Matrix Inequalities (with D. Henrion). Let x ↦ A(x) ∈ R^{p×p} be the matrix-polynomial A(x) = Σ_{α ∈ N^n} A_α x^α = Σ_{α ∈ N^n} A_α x_1^{α_1} ··· x_n^{α_n}, for finitely many real symmetric matrices (A_α), α ∈ N^n. ... and suppose one wants to approximate the set R_A := { x ∈ B : A(x) ⪰ 0 } = { x ∈ B : λ_min(A(x)) ≥ 0 }. Then: R_A = { x ∈ B : y^T A(x) y ≥ 0 for all y such that ||y||² = 1 }, a set of the form R_f with f(x, y) := −y^T A(x) y.
66 Illustrative example (continued). Let B be the unit disk { x : ||x|| ≤ 1 } and let

R_A := { x ∈ B : A(x) = [ 1 − 16 x_1 x_2 , x_1 ; x_1 , 1 − x_1² − x_2² ] ⪰ 0 }.

Then, by solving relatively simple semidefinite programs, one may approximate R_A with sublevel sets of the form Θ_k := { x ∈ B : J_k(x) ≤ 0 }, for some polynomial J_k of degree k = 2, 4, ..., and with VOL(R_A \ Θ_k) → 0 as k → ∞.
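Pointwise membership in R_A for this 2×2 example reduces to a λ_min test, which makes it easy to visualize the set the SDP hierarchy is approximating. A sketch (the grid test is our illustration; the hierarchy itself computes the polynomials J_k instead):

```python
import numpy as np

# The 2x2 matrix-polynomial of the illustrative example.
def A(x1, x2):
    return np.array([[1.0 - 16.0 * x1 * x2, x1],
                     [x1, 1.0 - x1**2 - x2**2]])

def in_R_A(x1, x2, tol=1e-9):
    """x is in R_A iff it lies in the unit disk B and lambda_min(A(x)) >= 0."""
    if x1**2 + x2**2 > 1.0:                      # outside B
        return False
    return bool(np.linalg.eigvalsh(A(x1, x2)).min() >= -tol)

print(in_R_A(0.0, 0.0))   # A(0,0) is the identity -> True
print(in_R_A(0.5, 0.5))   # diagonal entry 1 - 16*0.25 = -3 -> False
```

The inner approximations Θ_k replace this pointwise eigenvalue test by a single polynomial inequality J_k(x) ≤ 0.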
68 [Figure] Θ_2 (left) and Θ_4 (right) inner approximations (light gray) of R_A (dark gray) embedded in the unit disk B (dashed).
69 [Figure] Θ_6 (left) and Θ_8 (right) inner approximations (light gray) of R_A (dark gray) embedded in the unit disk B (dashed).
70 Similarly, suppose that one wants to approximate the set D_f := { x ∈ B : f(x, y) ≤ 0 for some y such that (x, y) ∈ K } as closely as desired by a sequence of sets of the form Θ_k := { x ∈ B : J_k(x) ≤ 0 } for some polynomials J_k.
71 Let F(x) := min { f(x, y) : (x, y) ∈ K }, and for every integer k the optimization problem

ρ_k = min_{J ∈ R[x]_k} { ∫_B (F − J) dx : f(x, y) − J(x) ∈ Q_k(g) }

IS AN SDP with an optimal solution J*_k ∈ R[x]_k. From the definition of J*_k, the sublevel sets Θ_k := { x ∈ B : J*_k(x) ≤ 0 } ⊃ D_f, k ∈ N, provide a nested sequence of OUTER approximations of D_f.
72 Theorem (Lass). (Strong) convergence in the L_1(B)-norm takes place, that is: lim_{k→∞} ∫_B |F − J*_k| dx = 0, and if in addition the set { x ∈ B : F(x) = 0 } has Lebesgue measure zero, then lim_{k→∞} VOL(Θ_k \ D_f) = 0.
74 III. Convex underestimators of polynomials. In large-scale Mixed Integer Nonlinear Programming (MINLP), a popular method is to use Branch & Bound, where LOWER BOUNDS at each node of the search tree must be computed EFFICIENTLY! In such a case... one needs CONVEX UNDERESTIMATORS of the objective function, say on a BOX B ⊂ R^n! Message: "Good" CONVEX POLYNOMIAL UNDERESTIMATORS can be computed efficiently!
76 Solving

inf_{p ∈ R[x]_d} { ∫_B (f(x) − p(x)) dx : f − p ≥ 0 on B and p convex on B }

will provide a degree-d POLYNOMIAL CONVEX UNDERESTIMATOR p of f on B that minimizes the L_1(B)-norm ||f − p||_1! Notice that: ∫_B (f(x) − p(x)) dx is LINEAR in the coefficients of p, and p convex on B ⇔ y^T ∇²p(x) y (∈ R[x, y]_d) ≥ 0 on B × { y : ||y||² = 1 }.
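The equivalence "p convex on B ⇔ y^T ∇²p(x) y ≥ 0 for unit y" is what turns convexity into a polynomial positivity constraint. A numerical sketch of this criterion for a hypothetical candidate p(x) = x_1² + x_2² − x_1 x_2 on the box [−1, 1]² (both p and the box are our illustration, not from the talk):

```python
import numpy as np

# Hypothetical candidate underestimator p(x) = x1**2 + x2**2 - x1*x2.
# Its Hessian is constant: [[2, -1], [-1, 2]], with eigenvalues 1 and 3.
def hessian_p(x1, x2):
    return np.array([[2.0, -1.0], [-1.0, 2.0]])

# Probe "y^T Hess(p) y >= 0 for all unit y" via lambda_min on a grid of B:
grid = np.linspace(-1.0, 1.0, 21)
convex = all(np.linalg.eigvalsh(hessian_p(a, b)).min() >= 0
             for a in grid for b in grid)
print(convex)   # -> True
```

In the moment-sos approach this grid probe is replaced by an exact SOS certificate in the joint variables (x, y), as on the next slide.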
77 Hence replace the positivity and convexity constraints f − p ≥ 0 on B and p convex on B with the positivity certificates

f(x) − p(x) = Σ_{j=0}^m σ_j(x) g_j(x), with each σ_j SOS,

y^T ∇²p(x) y = Σ_{j=0}^m ψ_j(x, y) g_j(x) + ψ_{m+1}(x, y) (1 − ||y||²), with each ψ_j SOS.
79 ... and apply the moment-sos approach to obtain a sequence of polynomials p_k ∈ R[x]_d, k ∈ N, of degree d, which converges to the BEST convex polynomial underestimator of degree d.
80 Conclusion. The moment-sos hierarchy is a powerful general methodology. It works for problems of modest size (or larger-size problems with sparsity and/or symmetries). An alternative for larger-size problems? The mixed LP-SOS positivity certificate

f(x) = Σ_{α,β} c_{αβ} Π_j g_j(x)^{α_j} (1 − g_j(x))^{β_j} + σ_0(x),

with c_{αβ} ≥ 0 and σ_0 an SOS of degree at most k, where k IS FIXED!
82 THANK YOU!!
A brief history of noncommutative Positivstellensätze Jaka Cimprič, University of Ljubljana, Slovenia Hilbert s Nullstellensatz: If f, g 1,..., g m R := C[X 1,..., X n ] then the following are equivalent:
More informationAn explicit construction of distinguished representations of polynomials nonnegative over finite sets
An explicit construction of distinguished representations of polynomials nonnegative over finite sets Pablo A. Parrilo Automatic Control Laboratory Swiss Federal Institute of Technology Physikstrasse 3
More informationHow to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization
CS-11-01 How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization Hayato Waki Department of Computer Science, The University of Electro-Communications
More informationSemialgebraic Relaxations using Moment-SOS Hierarchies
Semialgebraic Relaxations using Moment-SOS Hierarchies Victor Magron, Postdoc LAAS-CNRS 17 September 2014 SIERRA Seminar Laboratoire d Informatique de l Ecole Normale Superieure y b sin( par + b) b 1 1
More informationRobust and Optimal Control, Spring 2015
Robust and Optimal Control, Spring 2015 Instructor: Prof. Masayuki Fujita (S5-303B) G. Sum of Squares (SOS) G.1 SOS Program: SOS/PSD and SDP G.2 Duality, valid ineqalities and Cone G.3 Feasibility/Optimization
More informationLecture 6: Conic Optimization September 8
IE 598: Big Data Optimization Fall 2016 Lecture 6: Conic Optimization September 8 Lecturer: Niao He Scriber: Juan Xu Overview In this lecture, we finish up our previous discussion on optimality conditions
More informationOptimality, Duality, Complementarity for Constrained Optimization
Optimality, Duality, Complementarity for Constrained Optimization Stephen Wright University of Wisconsin-Madison May 2014 Wright (UW-Madison) Optimality, Duality, Complementarity May 2014 1 / 41 Linear
More informationc 2000 Society for Industrial and Applied Mathematics
SIAM J. OPIM. Vol. 10, No. 3, pp. 750 778 c 2000 Society for Industrial and Applied Mathematics CONES OF MARICES AND SUCCESSIVE CONVEX RELAXAIONS OF NONCONVEX SES MASAKAZU KOJIMA AND LEVEN UNÇEL Abstract.
More informationSemidefinite approximations of projections and polynomial images of semialgebraic sets
Semidefinite approximations of projections and polynomial images of semialgebraic sets Victor Magron 1 Didier Henrion 2,3,4 Jean-Bernard Lasserre 2,3 October 17, 2014 Abstract Given a compact semialgebraic
More information15. Conic optimization
L. Vandenberghe EE236C (Spring 216) 15. Conic optimization conic linear program examples modeling duality 15-1 Generalized (conic) inequalities Conic inequality: a constraint x K where K is a convex cone
More informationWhat can be expressed via Conic Quadratic and Semidefinite Programming?
What can be expressed via Conic Quadratic and Semidefinite Programming? A. Nemirovski Faculty of Industrial Engineering and Management Technion Israel Institute of Technology Abstract Tremendous recent
More informationConstrained Optimization Theory
Constrained Optimization Theory Stephen J. Wright 1 2 Computer Sciences Department, University of Wisconsin-Madison. IMA, August 2016 Stephen Wright (UW-Madison) Constrained Optimization Theory IMA, August
More informationSemidefinite programming and convex algebraic geometry
FoCM 2008 - SDP and convex AG p. 1/40 Semidefinite programming and convex algebraic geometry Pablo A. Parrilo www.mit.edu/~parrilo Laboratory for Information and Decision Systems Electrical Engineering
More informationSEMIDEFINITE PROGRAMMING VS. LP RELAXATIONS FOR POLYNOMIAL PROGRAMMING
MATHEMATICS OF OPERATIONS RESEARCH Vol. 27, No. 2, May 2002, pp. 347 360 Printed in U.S.A. SEMIDEFINITE PROGRAMMING VS. LP RELAXATIONS FOR POLYNOMIAL PROGRAMMING JEAN B. LASSERRE We consider the global
More informationRepresentations of Positive Polynomials: Theory, Practice, and
Representations of Positive Polynomials: Theory, Practice, and Applications Dept. of Mathematics and Computer Science Emory University, Atlanta, GA Currently: National Science Foundation Temple University
More informationSemidefinite programming relaxations for semialgebraic problems
Mathematical Programming manuscript No. (will be inserted by the editor) Pablo A. Parrilo Semidefinite programming relaxations for semialgebraic problems Abstract. A hierarchy of convex relaxations for
More informationAssignment 1: From the Definition of Convexity to Helley Theorem
Assignment 1: From the Definition of Convexity to Helley Theorem Exercise 1 Mark in the following list the sets which are convex: 1. {x R 2 : x 1 + i 2 x 2 1, i = 1,..., 10} 2. {x R 2 : x 2 1 + 2ix 1x
More informationarxiv: v1 [math.oc] 9 Sep 2015
CONTAINMENT PROBLEMS FOR PROJECTIONS OF POLYHEDRA AND SPECTRAHEDRA arxiv:1509.02735v1 [math.oc] 9 Sep 2015 KAI KELLNER Abstract. Spectrahedra are affine sections of the cone of positive semidefinite matrices
More informationA JOINT+MARGINAL APPROACH TO PARAMETRIC POLYNOMIAL OPTIMIZATION
A JOINT+MARGINAL APPROACH TO PARAMETRIC POLNOMIAL OPTIMIZATION JEAN B. LASSERRE Abstract. Given a compact parameter set R p, we consider polynomial optimization problems (P y) on R n whose description
More informationthat a broad class of conic convex polynomial optimization problems, called
JOTA manuscript No. (will be inserted by the editor) Exact Conic Programming Relaxations for a Class of Convex Polynomial Cone-Programs Vaithilingam Jeyakumar Guoyin Li Communicated by Levent Tunçel Abstract
More informationPolyhedral Results for A Class of Cardinality Constrained Submodular Minimization Problems
Polyhedral Results for A Class of Cardinality Constrained Submodular Minimization Problems Shabbir Ahmed and Jiajin Yu Georgia Institute of Technology A Motivating Problem [n]: Set of candidate investment
More informationOptimization over Nonnegative Polynomials: Algorithms and Applications. Amir Ali Ahmadi Princeton, ORFE
Optimization over Nonnegative Polynomials: Algorithms and Applications Amir Ali Ahmadi Princeton, ORFE INFORMS Optimization Society Conference (Tutorial Talk) Princeton University March 17, 2016 1 Optimization
More informationm i=1 c ix i i=1 F ix i F 0, X O.
What is SDP? for a beginner of SDP Copyright C 2005 SDPA Project 1 Introduction This note is a short course for SemiDefinite Programming s SDP beginners. SDP has various applications in, for example, control
More informationOptimization based robust control
Optimization based robust control Didier Henrion 1,2 Draft of March 27, 2014 Prepared for possible inclusion into The Encyclopedia of Systems and Control edited by John Baillieul and Tariq Samad and published
More informationGlobal Optimization with Polynomials
Global Optimization with Polynomials Geoffrey Schiebinger, Stephen Kemmerling Math 301, 2010/2011 March 16, 2011 Geoffrey Schiebinger, Stephen Kemmerling (Math Global 301, 2010/2011) Optimization with
More informationAn Exact Jacobian SDP Relaxation for Polynomial Optimization
An Exact Jacobian SDP Relaxation for Polynomial Optimization Jiawang Nie May 26, 2011 Abstract Given polynomials f(x), g i (x), h j (x), we study how to imize f(x) on the set S = {x R n : h 1 (x) = = h
More informationExample: feasibility. Interpretation as formal proof. Example: linear inequalities and Farkas lemma
4-1 Algebra and Duality P. Parrilo and S. Lall 2006.06.07.01 4. Algebra and Duality Example: non-convex polynomial optimization Weak duality and duality gap The dual is not intrinsic The cone of valid
More informationCourse Outline. FRTN10 Multivariable Control, Lecture 13. General idea for Lectures Lecture 13 Outline. Example 1 (Doyle Stein, 1979)
Course Outline FRTN Multivariable Control, Lecture Automatic Control LTH, 6 L-L Specifications, models and loop-shaping by hand L6-L8 Limitations on achievable performance L9-L Controller optimization:
More informationConvex Optimization in Classification Problems
New Trends in Optimization and Computational Algorithms December 9 13, 2001 Convex Optimization in Classification Problems Laurent El Ghaoui Department of EECS, UC Berkeley elghaoui@eecs.berkeley.edu 1
More informationMean squared error minimization for inverse moment problems
Mean squared error minimization for inverse moment problems Didier Henrion 1,2,3, Jean B. Lasserre 1,2,4, Martin Mevissen 5 August 28, 2012 Abstract We consider the problem of approximating the unknown
More informationLECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE
LECTURE 25: REVIEW/EPILOGUE LECTURE OUTLINE CONVEX ANALYSIS AND DUALITY Basic concepts of convex analysis Basic concepts of convex optimization Geometric duality framework - MC/MC Constrained optimization
More informationNumerical Optimization
Constrained Optimization Computer Science and Automation Indian Institute of Science Bangalore 560 012, India. NPTEL Course on Constrained Optimization Constrained Optimization Problem: min h j (x) 0,
More informationRobust linear optimization under general norms
Operations Research Letters 3 (004) 50 56 Operations Research Letters www.elsevier.com/locate/dsw Robust linear optimization under general norms Dimitris Bertsimas a; ;, Dessislava Pachamanova b, Melvyn
More informationConic Linear Programming. Yinyu Ye
Conic Linear Programming Yinyu Ye December 2004, revised January 2015 i ii Preface This monograph is developed for MS&E 314, Conic Linear Programming, which I am teaching at Stanford. Information, lecture
More informationMinimum volume semialgebraic sets for robust estimation
Minimum volume semialgebraic sets for robust estimation Fabrizio Dabbene 1, Didier Henrion 2,3,4 October 31, 2018 arxiv:1210.3183v1 [math.oc] 11 Oct 2012 Abstract Motivated by problems of uncertainty propagation
More informationLecture 5. The Dual Cone and Dual Problem
IE 8534 1 Lecture 5. The Dual Cone and Dual Problem IE 8534 2 For a convex cone K, its dual cone is defined as K = {y x, y 0, x K}. The inner-product can be replaced by x T y if the coordinates of the
More informationThe Trust Region Subproblem with Non-Intersecting Linear Constraints
The Trust Region Subproblem with Non-Intersecting Linear Constraints Samuel Burer Boshi Yang February 21, 2013 Abstract This paper studies an extended trust region subproblem (etrs in which the trust region
More informationConvex computation of the region of attraction of polynomial control systems
Convex computation of the region of attraction of polynomial control systems Didier Henrion 1,2,3, Milan Korda 4 Draft of July 15, 213 Abstract We address the long-standing problem of computing the region
More informationFormal Proofs, Program Analysis and Moment-SOS Relaxations
Formal Proofs, Program Analysis and Moment-SOS Relaxations Victor Magron, Postdoc LAAS-CNRS 15 July 2014 Imperial College Department of Electrical and Electronic Eng. y b sin( par + b) b 1 1 b1 b2 par
More informationSemidefinite Programming
Chapter 2 Semidefinite Programming 2.0.1 Semi-definite programming (SDP) Given C M n, A i M n, i = 1, 2,..., m, and b R m, the semi-definite programming problem is to find a matrix X M n for the optimization
More informationAdvanced SDPs Lecture 6: March 16, 2017
Advanced SDPs Lecture 6: March 16, 2017 Lecturers: Nikhil Bansal and Daniel Dadush Scribe: Daniel Dadush 6.1 Notation Let N = {0, 1,... } denote the set of non-negative integers. For α N n, define the
More informationLecture 18: Optimization Programming
Fall, 2016 Outline Unconstrained Optimization 1 Unconstrained Optimization 2 Equality-constrained Optimization Inequality-constrained Optimization Mixture-constrained Optimization 3 Quadratic Programming
More informationSemidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 2
Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 2 Instructor: Farid Alizadeh Scribe: Xuan Li 9/17/2001 1 Overview We survey the basic notions of cones and cone-lp and give several
More informationInner approximations of the region of attraction for polynomial dynamical systems
Inner approimations of the region of attraction for polynomial dynamical systems Milan Korda, Didier Henrion 2,3,4, Colin N. Jones October, 22 Abstract hal-74798, version - Oct 22 In a previous work we
More informationRelations between Semidefinite, Copositive, Semi-infinite and Integer Programming
Relations between Semidefinite, Copositive, Semi-infinite and Integer Programming Author: Faizan Ahmed Supervisor: Dr. Georg Still Master Thesis University of Twente the Netherlands May 2010 Relations
More informationOptimality Conditions for Constrained Optimization
72 CHAPTER 7 Optimality Conditions for Constrained Optimization 1. First Order Conditions In this section we consider first order optimality conditions for the constrained problem P : minimize f 0 (x)
More informationSemidefinite Programming
Semidefinite Programming Basics and SOS Fernando Mário de Oliveira Filho Campos do Jordão, 2 November 23 Available at: www.ime.usp.br/~fmario under talks Conic programming V is a real vector space h, i
More informationPrimal-Dual Geometry of Level Sets and their Explanatory Value of the Practical Performance of Interior-Point Methods for Conic Optimization
Primal-Dual Geometry of Level Sets and their Explanatory Value of the Practical Performance of Interior-Point Methods for Conic Optimization Robert M. Freund M.I.T. June, 2010 from papers in SIOPT, Mathematics
More informationHW1 solutions. 1. α Ef(x) β, where Ef(x) is the expected value of f(x), i.e., Ef(x) = n. i=1 p if(a i ). (The function f : R R is given.
HW1 solutions Exercise 1 (Some sets of probability distributions.) Let x be a real-valued random variable with Prob(x = a i ) = p i, i = 1,..., n, where a 1 < a 2 < < a n. Of course p R n lies in the standard
More informationSum of Squares Relaxations for Polynomial Semi-definite Programming
Sum of Squares Relaxations for Polynomial Semi-definite Programming C.W.J. Hol, C.W. Scherer Delft University of Technology, Delft Center of Systems and Control (DCSC) Mekelweg 2, 2628CD Delft, The Netherlands
More informationTowards Solving Bilevel Optimization Problems in Quantum Information Theory
Towards Solving Bilevel Optimization Problems in Quantum Information Theory ICFO-The Institute of Photonic Sciences and University of Borås 22 January 2016 Workshop on Linear Matrix Inequalities, Semidefinite
More informationSelected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A.
. Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A. Nemirovski Arkadi.Nemirovski@isye.gatech.edu Linear Optimization Problem,
More informationDSOS/SDOS Programming: New Tools for Optimization over Nonnegative Polynomials
DSOS/SDOS Programming: New Tools for Optimization over Nonnegative Polynomials Amir Ali Ahmadi Princeton University Dept. of Operations Research and Financial Engineering (ORFE) Joint work with: Anirudha
More informationCopositive Programming and Combinatorial Optimization
Copositive Programming and Combinatorial Optimization Franz Rendl http://www.math.uni-klu.ac.at Alpen-Adria-Universität Klagenfurt Austria joint work with I.M. Bomze (Wien) and F. Jarre (Düsseldorf) IMA
More informationRobust conic quadratic programming with ellipsoidal uncertainties
Robust conic quadratic programming with ellipsoidal uncertainties Roland Hildebrand (LJK Grenoble 1 / CNRS, Grenoble) KTH, Stockholm; November 13, 2008 1 Uncertain conic programs min x c, x : Ax + b K
More information12. Interior-point methods
12. Interior-point methods Convex Optimization Boyd & Vandenberghe inequality constrained minimization logarithmic barrier function and central path barrier method feasibility and phase I methods complexity
More informationModule 04 Optimization Problems KKT Conditions & Solvers
Module 04 Optimization Problems KKT Conditions & Solvers Ahmad F. Taha EE 5243: Introduction to Cyber-Physical Systems Email: ahmad.taha@utsa.edu Webpage: http://engineering.utsa.edu/ taha/index.html September
More informationConic Linear Programming. Yinyu Ye
Conic Linear Programming Yinyu Ye December 2004, revised October 2017 i ii Preface This monograph is developed for MS&E 314, Conic Linear Programming, which I am teaching at Stanford. Information, lecture
More informationA notion of Total Dual Integrality for Convex, Semidefinite and Extended Formulations
A notion of for Convex, Semidefinite and Extended Formulations Marcel de Carli Silva Levent Tunçel April 26, 2018 A vector in R n is integral if each of its components is an integer, A vector in R n is
More informationNetwork Utility Maximization With Nonconcave Utilities Using Sum-of-Squares Method
Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference 2005 Seville, Spain, December 2-5, 2005 MoC4.6 Network Utility Maximization With Nonconcave Utilities
More informationNon-commutative polynomial optimization
Non-commutative polynomial optimization S. Pironio 1, M. Navascuès 2, A. Acín 3 1 Laboratoire d Information Quantique (Brussels) 2 Department of Mathematics (Bristol) 3 ICFO-The Institute of Photonic Sciences
More informationSemidefinite and Second Order Cone Programming Seminar Fall 2012 Project: Robust Optimization and its Application of Robust Portfolio Optimization
Semidefinite and Second Order Cone Programming Seminar Fall 2012 Project: Robust Optimization and its Application of Robust Portfolio Optimization Instructor: Farid Alizadeh Author: Ai Kagawa 12/12/2012
More informationLinear conic optimization for nonlinear optimal control
Linear conic optimization for nonlinear optimal control Didier Henrion 1,2,3, Edouard Pauwels 1,2 Draft of July 15, 2014 Abstract Infinite-dimensional linear conic formulations are described for nonlinear
More informationA new approximation hierarchy for polynomial conic optimization
A new approximation hierarchy for polynomial conic optimization Peter J.C. Dickinson Janez Povh July 11, 2018 Abstract In this paper we consider polynomial conic optimization problems, where the feasible
More informationAnalysis and synthesis: a complexity perspective
Analysis and synthesis: a complexity perspective Pablo A. Parrilo ETH ZürichZ control.ee.ethz.ch/~parrilo Outline System analysis/design Formal and informal methods SOS/SDP techniques and applications
More informationLecture Note 5: Semidefinite Programming for Stability Analysis
ECE7850: Hybrid Systems:Theory and Applications Lecture Note 5: Semidefinite Programming for Stability Analysis Wei Zhang Assistant Professor Department of Electrical and Computer Engineering Ohio State
More informationGeometric problems. Chapter Projection on a set. The distance of a point x 0 R n to a closed set C R n, in the norm, is defined as
Chapter 8 Geometric problems 8.1 Projection on a set The distance of a point x 0 R n to a closed set C R n, in the norm, is defined as dist(x 0,C) = inf{ x 0 x x C}. The infimum here is always achieved.
More informationminimize x x2 2 x 1x 2 x 1 subject to x 1 +2x 2 u 1 x 1 4x 2 u 2, 5x 1 +76x 2 1,
4 Duality 4.1 Numerical perturbation analysis example. Consider the quadratic program with variables x 1, x 2, and parameters u 1, u 2. minimize x 2 1 +2x2 2 x 1x 2 x 1 subject to x 1 +2x 2 u 1 x 1 4x
More informationLecture 5. Theorems of Alternatives and Self-Dual Embedding
IE 8534 1 Lecture 5. Theorems of Alternatives and Self-Dual Embedding IE 8534 2 A system of linear equations may not have a solution. It is well known that either Ax = c has a solution, or A T y = 0, c
More informationon Semialgebraic Sets for Nonnnegative Polynomials Finding Representations
Finding Representations for Nonnnegative Polynomials on Semialgebraic Sets Ruchira Datta Department of Mathematics University of California Berkeley, California April 2nd, 2002 1 Introducing Polynomial
More informationNon-Convex Optimization via Real Algebraic Geometry
Non-Convex Optimization via Real Algebraic Geometry Constantine Caramanis Massachusetts Institute of Technology November 29, 2001 The following paper represents the material from a collection of different
More informationConvex computation of the region of attraction of polynomial control systems
Convex computation of the region of attraction of polynomial control systems Didier Henrion 1,2,3, Milan Korda 4 ariv:128.1751v1 [math.oc] 8 Aug 212 Draft of August 9, 212 Abstract We address the long-standing
More informationTilburg University. Hidden Convexity in Partially Separable Optimization Ben-Tal, A.; den Hertog, Dick; Laurent, Monique. Publication date: 2011
Tilburg University Hidden Convexity in Partially Separable Optimization Ben-Tal, A.; den Hertog, Dick; Laurent, Monique Publication date: 2011 Link to publication Citation for published version (APA):
More informationModern Optimal Control
Modern Optimal Control Matthew M. Peet Arizona State University Lecture 19: Stabilization via LMIs Optimization Optimization can be posed in functional form: min x F objective function : inequality constraints
More informationA General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones
Research Reports on Mathematical and Computing Sciences Series B : Operations Research Department of Mathematical and Computing Sciences Tokyo Institute of Technology 2-12-1 Oh-Okayama, Meguro-ku, Tokyo
More informationMean squared error minimization for inverse moment problems
Mean squared error minimization for inverse moment problems Didier Henrion 1,2,3, Jean B. Lasserre 1,2,4, Martin Mevissen 5 June 19, 2013 Abstract We consider the problem of approximating the unknown density
More information