Sum of Squares Relaxations for Polynomial Semi-definite Programming


C.W.J. Hol, C.W. Scherer
Delft University of Technology, Delft Center for Systems and Control (DCSC), Mekelweg 2, 2628 CD Delft, The Netherlands

Abstract

We present an extension of scalar polynomial optimization by sum-of-squares decompositions [5] to optimization problems with a scalar polynomial objective and polynomial semi-definite constraints. We show that the values of these relaxations converge to the optimal value under a constraint qualification. Although this convergence property is well known for polynomial problems with scalar constraints [5], to the best of our knowledge this result is new for matrix-valued inequalities. (We are aware of the parallel independent work of Kojima [6], which presents the same result with a different proof.) The result allows for a systematic improvement of LMI relaxations of non-convex polynomial semi-definite programming problems with guaranteed reduction of the relaxation gap to zero. This can be applied to various very hard control problems that can be written as polynomial semi-definite programs (SDPs), such as static or fixed-order controller synthesis. We present a direct and compact description of the resulting linear SDPs with full flexibility in the choice of the underlying monomial basis. The method is illustrated with a non-convex quadratic SDP problem and with H2-optimal static controller synthesis.

1 Introduction

Recent improvements of semi-definite programming solvers and developments in polynomial optimization have resulted in a large increase in research activity on applications of the so-called sum-of-squares (SOS) techniques in control. In this approach non-convex polynomial optimization problems are approximated by a family of convex problems that are relaxations of the original problem [1, 5]. These relaxations are based on decompositions of certain polynomials into a sum of squares.
Using a theorem of Putinar [9] it can be shown (under suitable constraint qualifications) that the optimal values of these relaxed problems converge to the optimal value of the original problem. These relaxation schemes have recently been applied to various non-convex problems in control, such as Lyapunov stability of nonlinear dynamic systems [7, 2] and robust stability analysis [4]. Many problems in control, including some very hard non-convex problems, can be formulated as semi-definite polynomial problems. An example is the static or fixed-order H2-synthesis problem, which can be written as a non-convex semi-definite polynomial optimization problem.

The research of this author is sponsored by Philips CFT. The research of this author is supported by the Technology Foundation STW, applied science division of NWO, and the technology programme of the Ministry of Economic Affairs.

In this paper we present an extension of scalar polynomial optimization by SOS decompositions [5] to optimization problems with a scalar polynomial objective and nonlinear semi-definite constraints. We show that the values of these relaxations converge to the optimal value under a constraint qualification. Although this convergence property is well known for polynomial problems with scalar constraints [5], to the best of our knowledge this result is, except for the independent work of Kojima [6], new for matrix-valued inequalities. In Section 2 we show that the optimal value of a suitably constructed sequence of matrix sum-of-squares relaxations converges to the optimal value of the original polynomial semi-definite optimization problem. In Section 3 we present a direct and compact description of the resulting linear semi-definite programs (SDPs) with full flexibility in the choice of the underlying monomial basis. In Section 4 we describe how these sum-of-squares relaxations can be solved as Linear Matrix Inequality (LMI) problems and give their sizes. Two examples are presented in Section 5: a non-convex quadratic SDP and an H2-optimal static controller synthesis problem.

2 A direct polynomial SDP approach

In this section we present an extension of scalar polynomial optimization by SOS decompositions [5] to optimization problems with a scalar polynomial objective and nonlinear semi-definite constraints. We formulate the relaxations in terms of Lagrange duality with SOS polynomials as multipliers, which seems somewhat more straightforward than the corresponding dual formulation based on the problem of moments [5].

2.1 Polynomial semi-definite programming

For x ∈ ℝ^n let f(x) and G(x) denote scalar and symmetric-matrix-valued polynomials in x, where G maps into S^m, the set of symmetric m × m matrices.
Consider the following polynomial semi-definite optimization problem with optimal value d_opt:

    infimize f(x) subject to G(x) ⪯ 0.    (1)

With any matrix S ⪰ 0, the value inf_{x ∈ ℝ^n} f(x) + ⟨S, G(x)⟩ is a lower bound for d_opt by standard weak duality. However, not even the maximization of this lower bound over S ⪰ 0 allows one to close the duality gap, due to non-convexity of the problem. This is the reason for considering, instead, Lagrange multiplier matrices S(x) which are globally positive semi-definite polynomial functions of x, i.e. polynomial matrices satisfying S(x) ⪰ 0 for all x ∈ ℝ^n. Still inf_{x ∈ ℝ^n} f(x) + ⟨S(x), G(x)⟩ defines a lower bound on d_opt, and the best lower bound achievable in this fashion is given by the supremal t for which there exists a globally positive semi-definite polynomial matrix S such that f(x) + ⟨S(x), G(x)⟩ − t > 0 for all x ∈ ℝ^n. In order to render the determination of this lower bound computational we introduce the following concept. A symmetric m × m polynomial matrix S(x) is said to be a (matrix) sum-of-squares if there exists a (not necessarily square and typically tall) polynomial matrix T(x) such that S(x) = T(x)^T T(x).
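As a quick numerical illustration of this definition (not from the paper; the polynomial matrix T(x) below is an arbitrary made-up example), one can sample a matrix of the form S(x) = T(x)^T T(x) and confirm that it is positive semi-definite at every sampled point:

```python
import numpy as np

# A made-up 3x2 polynomial matrix T(x) = T0 + T1*x in one scalar variable,
# so that S(x) = T(x)^T T(x) is a 2x2 matrix sum-of-squares.
rng = np.random.default_rng(0)
T0 = rng.standard_normal((3, 2))
T1 = rng.standard_normal((3, 2))

def S(x):
    Tx = T0 + T1 * x       # evaluate T at the point x
    return Tx.T @ Tx       # S(x) = T(x)^T T(x)

# Every SOS matrix is globally positive semi-definite:
for x in np.linspace(-5.0, 5.0, 11):
    assert np.linalg.eigvalsh(S(x)).min() >= -1e-9
print("S(x) is positive semi-definite at all sampled points")
```

The converse fails in general: a globally positive semi-definite polynomial matrix need not admit such a factorization.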

If T_j(x), j = 1,…,q, denote the rows of T(x), we infer S(x) = Σ_{j=1}^q T_j(x)^T T_j(x). If S(x) is a scalar then the T_j(x) are scalars, which implies S(x) = Σ_{j=1}^q T_j(x)². This motivates our terminology, since we are dealing with a generalization of classical scalar SOS representations. Very similar to the scalar case, every SOS matrix is globally positive semi-definite, but the converse is not necessarily true.

Let us now just replace all inequalities in the above derived program for the lower bound computation by the requirement that the corresponding polynomials or polynomial matrices are SOS. This leads to the following optimization problem:

    supremize t subject to S(x) and f(x) + ⟨S(x), G(x)⟩ − t are SOS.    (2)

If fixing upper bounds on the degree of the SOS matrix S(x), the value of this problem can be computed by solving a standard linear SDP, as will be seen in Section 3. In this fashion one can construct a family of LMI relaxations for computing increasingly improving lower bounds. Under a suitable constraint qualification, due to Putinar for scalar problems, it is possible to prove that the value of (2) actually equals d_opt. To the best of our knowledge, the generalization to matrix-valued problems as formulated in the following result has, except for the recent independent work of Kojima [6], not been presented anywhere else in the literature.

Theorem 1. Let the following constraint qualification hold true: there exist some r > 0 and some SOS matrix R(x) such that

    r − ‖x‖² + ⟨R(x), G(x)⟩ is SOS.    (3)

Then the optimal value of (2) equals d_opt.

Proof. The value of (2) is not larger than d_opt. Since this is trivial for d_opt = ∞, we assume that G(x) ⪯ 0 is feasible. Choose any ǫ > 0 and some x̂ with G(x̂) ⪯ 0 and f(x̂) ≤ d_opt + ǫ. Let us now suppose that S(x) and f(x) + ⟨S(x), G(x)⟩ − t are SOS. Then

    d_opt + ǫ − t ≥ f(x̂) − t ≥ f(x̂) + ⟨S(x̂), G(x̂)⟩ − t ≥ 0

and thus d_opt + ǫ ≥ t. Since ǫ was arbitrary we infer d_opt ≥ t.
To prove the converse we first reveal that, due to the constraint qualification, we can replace G(x) by Ĝ(x) = diag(G(x), ‖x‖² − r) in both (1) and (2) without changing their values. Indeed, if G(x) ⪯ 0 we infer from (3) that r − ‖x‖² ≥ r − ‖x‖² + ⟨R(x), G(x)⟩ ≥ 0. Therefore the extra constraint ‖x‖² − r ≤ 0 is redundant for problem (1). We show redundancy for (2) in two steps. If S(x) and f(x) − t + ⟨S(x), G(x)⟩ are SOS we can define the SOS matrix Ŝ(x) = diag(S(x), 0) to conclude that f(x) − t + ⟨Ŝ(x), Ĝ(x)⟩ is SOS (since it just equals f(x) − t + ⟨S(x), G(x)⟩). Conversely suppose that Ŝ(x) = T̂(x)^T T̂(x) and t̂(x)^T t̂(x) = f(x) − t + ⟨Ŝ(x), Ĝ(x)⟩ are SOS. Partition T̂(x) = (T(x) u(x)) according to the columns of Ĝ(x). With the SOS polynomial v(x)^T v(x) = r − ‖x‖² + ⟨R(x), G(x)⟩ we infer

    t̂(x)^T t̂(x) = f(x) − t + ⟨T(x)^T T(x), G(x)⟩ + u(x)^T u(x)(‖x‖² − r)
                = f(x) − t + ⟨T(x)^T T(x), G(x)⟩ + u(x)^T u(x)(⟨R(x), G(x)⟩ − v(x)^T v(x))
                = f(x) − t + ⟨T(x)^T T(x) + u(x)^T u(x)R(x), G(x)⟩ − u(x)^T u(x) v(x)^T v(x).

With R(x) = R_f(x)^T R_f(x) we now observe that

    S(x) := T(x)^T T(x) + u(x)^T u(x)R(x) = col(T(x), u(x) ⊗ R_f(x))^T col(T(x), u(x) ⊗ R_f(x))

and

    s(x) := t̂(x)^T t̂(x) + u(x)^T u(x) v(x)^T v(x) = col(t̂(x), u(x) ⊗ v(x))^T col(t̂(x), u(x) ⊗ v(x))

are SOS. Due to f(x) − t + ⟨S(x), G(x)⟩ = s(x) the claim is proved. Hence from now on we can assume without loss of generality that there exists a standard unit vector v_1 with

    v_1^T G(x) v_1 = ‖x‖² − r.    (4)

Let us now choose a sequence of unit vectors v_2, v_3, … such that v_i, i = 1, 2, …, is dense in the Euclidean unit sphere, and consider the family of scalar polynomial optimization problems

    infimize f(x) subject to v_i^T G(x) v_i ≤ 0, i = 1,…,N    (5)

with optimal values d_N. Since any x with G(x) ⪯ 0 is feasible for (5), we infer d_N ≤ d_opt. Moreover it is clear that d_N ≤ d_{N+1}, which implies d_N → d_0 ≤ d_opt for N → ∞. Let us prove that d_0 = d_opt. Due to (4) the feasible set of (5) is contained in {x ∈ ℝ^n : ‖x‖² ≤ r} and hence compact. Therefore there exists an optimal solution x_N of (5), and we can choose a subsequence N_ν with x_{N_ν} → x_0. Hence d_0 = lim_ν d_{N_ν} = lim_ν f(x_{N_ν}) = f(x_0). Then d_0 = d_opt follows if we can show that G(x_0) ⪯ 0. Otherwise there exists a unit vector v with ǫ := v^T G(x_0) v > 0. By convergence there exists some K with ‖G(x_{N_ν})‖ ≤ K for all ν. By density there exists a sufficiently large ν such that K‖v_i − v‖² + 2K‖v_i − v‖ < ǫ/2 for some i ∈ {1,…,N_ν}. We can take ν with v^T G(x_{N_ν}) v ≥ ǫ/2 and arrive at

    0 ≥ v_i^T G(x_{N_ν}) v_i = (v_i − v)^T G(x_{N_ν})(v_i − v) + 2 v^T G(x_{N_ν})(v_i − v) + v^T G(x_{N_ν}) v
      ≥ −K‖v_i − v‖² − 2K‖v_i − v‖ + ǫ/2 > 0,

a contradiction. Let us finally fix any ǫ > 0 and choose N with d_N ≥ d_opt − ǫ/2. This implies f(x) − d_opt + ǫ > 0 for all x with v_i^T G(x) v_i ≤ 0 for i = 1,…,N. Due to (4) we can apply Putinar's scalar representation result [9] to infer that there exist polynomials t_i(x) for which

    f(x) − d_opt + ǫ + Σ_{i=1}^N t_i(x)^T t_i(x) v_i^T G(x) v_i is SOS.    (6)

With the SOS matrix

    S_N(x) := Σ_{i=1}^N v_i t_i(x)^T t_i(x) v_i^T = col(t_1(x)v_1^T, …, t_N(x)v_N^T)^T col(t_1(x)v_1^T, …, t_N(x)v_N^T)

we conclude that f(x) − d_opt + ǫ + ⟨S_N(x), G(x)⟩ equals (6) and is thus SOS.
This implies that the optimal value of (2) is at least d_opt − ǫ, and since ǫ > 0 was arbitrary the proof is finished.

Theorem 1 is a natural extension of a theorem of Putinar [9] for scalar polynomial problems to polynomial SDPs. Indeed, Lasserre's approach [5] for minimizing f(x) over scalar polynomial constraints g_i(x) ≤ 0, i = 1,…,m, is recovered with G(x) = diag(g_1(x),…,g_m(x)).
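To see the reduction to the scalar case concretely: for block-diagonal G(x) and a diagonal multiplier S(x), the inner product ⟨S(x), G(x)⟩ collapses to the familiar weighted sum Σ_i s_i(x)g_i(x) of Lasserre's relaxation. A minimal numerical sketch (the polynomials g_i and SOS multipliers s_i below are made-up examples, not from the paper):

```python
import numpy as np

# Made-up scalar constraints g_i(x) <= 0 and SOS multipliers s_i(x) >= 0.
def g(x):
    return np.array([x**2 - 1.0, x - 0.5])

def s(x):
    return np.array([(x + 1.0)**2, 4.0 * x**2])   # squares, hence SOS

x = 0.3
# For G(x) = diag(g_1(x), g_2(x)) and S(x) = diag(s_1(x), s_2(x)),
# <S(x), G(x)> = Tr(S(x) G(x)) = sum_i s_i(x) g_i(x).
inner = np.trace(np.diag(s(x)) @ np.diag(g(x)))
assert np.isclose(inner, np.dot(s(x), g(x)))
```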

Moreover, the constraint qualification in Theorem 1 is a natural generalization of that used by Schweighofer [11].

Remark. It is a direct consequence of Theorem 1 that, as in the scalar case [10], the constraint qualification (3) can be equivalently formulated as follows: there exist an SOS matrix R(x) and an SOS polynomial s(x) such that {x ∈ ℝ^n : ⟨R(x), G(x)⟩ − s(x) ≤ 0} is compact.

3 Verification of the matrix SOS property

Let us now discuss how to construct a linear SDP representation of (2) if restricting the search for the SOS matrix S(x) to an arbitrary subspace of polynomial matrices. The suggested description allows for complete flexibility in the choice of the corresponding monomial basis with a direct and compact description of the resulting linear SDP, even for problems that involve SOS matrices. Moreover, it forms the basis for trying to reduce the relaxation sizes for specific problem instances.

For all these purposes let us choose a polynomial vector u(x) = col(u_1(x),…,u_{n_u}(x)) whose components u_j(x) are pairwise different x-monomials. Then S(x) of dimension m × m is said to be SOS with respect to the monomial basis u(x) if there exist real matrices T_j, j = 1,…,n_u, such that

    S(x) = T(x)^T T(x) with T(x) = Σ_{j=1}^{n_u} T_j u_j(x).

If U = (T_1 … T_{n_u}), so that T(x) = U(u(x) ⊗ I_m), and if P denotes the permutation that guarantees u(x) ⊗ I_m = P[I_m ⊗ u(x)], we infer with W = (UP)^T(UP) ⪰ 0 that

    S(x) = [I_m ⊗ u(x)]^T W [I_m ⊗ u(x)].    (7)

In order to render this relation more explicit let us continue with the following simple concepts. If M ∈ ℝ^{nm×nm} is partitioned into n × n blocks as (M_jk)_{j,k=1,…,m}, define Trace_m(M) as the m × m matrix whose (j,k) entry is Tr(M_jk), as well as the bilinear mapping ⟨·,·⟩_m : ℝ^{nm×nm} × ℝ^{nm×nm} → ℝ^{m×m}, ⟨A, B⟩_m = Trace_m(A^T B). One then easily verifies that

    [I_m ⊗ u(x)]^T W [I_m ⊗ u(x)] = ⟨W, I_m ⊗ u(x)u(x)^T⟩_m.
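The partial-trace identity above is easy to check numerically. The following sketch (with arbitrary random data; `trace_m` and `inner_m` are our own helper names, not from the paper) implements Trace_m and ⟨·,·⟩_m and verifies [I_m ⊗ u]^T W [I_m ⊗ u] = ⟨W, I_m ⊗ uu^T⟩_m at a sampled point:

```python
import numpy as np

def trace_m(M, m):
    """Trace_m: view the (n*m) x (n*m) matrix M as an m x m grid of
    n x n blocks M_jk and return the m x m matrix of block traces."""
    n = M.shape[0] // m
    return np.array([[np.trace(M[j*n:(j+1)*n, k*n:(k+1)*n])
                      for k in range(m)] for j in range(m)])

def inner_m(A, B, m):
    # <A, B>_m := Trace_m(A^T B)
    return trace_m(A.T @ B, m)

rng = np.random.default_rng(1)
m, nu = 2, 3
A = rng.standard_normal((m * nu, m * nu))
W = A.T @ A                                   # a symmetric W >= 0
u = rng.standard_normal(nu)                   # u(x) evaluated at some point x
Iu = np.kron(np.eye(m), u.reshape(-1, 1))     # I_m (x) u(x), size (m*nu) x m
lhs = Iu.T @ W @ Iu
rhs = inner_m(W, np.kron(np.eye(m), np.outer(u, u)), m)
assert np.allclose(lhs, rhs)
```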
If we denote the pairwise different monomials in u(x)u(x)^T by w_j(x), j = 1,…,n_w, and if we determine the unique symmetric Z_j with u(x)u(x)^T = Σ_{j=1}^{n_w} Z_j w_j(x), we can conclude that

    S(x) = Σ_{j=1}^{n_w} ⟨W, I_m ⊗ Z_j⟩_m w_j(x).    (8)
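Relation (7) itself can also be validated numerically. A small sketch (again with made-up data: one scalar variable, basis u(x) = (1, x, x²)^T, random coefficient matrices T_j) builds the permutation P, forms W = (UP)^T(UP), and checks that the Gram-matrix form reproduces T(x)^T T(x):

```python
import numpy as np

rng = np.random.default_rng(2)
m, nu, q = 2, 3, 4                 # S(x) is m x m; n_u monomials; T(x) has q rows

# Made-up coefficient matrices T_j (q x m) of T(x) = sum_j T_j u_j(x),
# with monomial basis u(x) = (1, x, x^2)^T in one scalar variable.
Tj = rng.standard_normal((nu, q, m))
U = np.hstack([Tj[j] for j in range(nu)])     # U = (T_1 ... T_nu)

# Permutation P with u(x) (x) I_m = P [I_m (x) u(x)]
P = np.zeros((nu * m, m * nu))
for i in range(nu):
    for j in range(m):
        P[i * m + j, j * nu + i] = 1.0

W = (U @ P).T @ (U @ P)                       # Gram matrix, W >= 0 by construction

x = 0.7
u = np.array([1.0, x, x**2])                  # u(x)
Tx = sum(Tj[j] * u[j] for j in range(nu))     # T(x) = sum_j T_j u_j(x)
S_direct = Tx.T @ Tx                          # S(x) = T(x)^T T(x)
Iu = np.kron(np.eye(m), u.reshape(-1, 1))     # I_m (x) u(x)
S_gram = Iu.T @ W @ Iu                        # right-hand side of (7)
assert np.allclose(S_direct, S_gram)
```

Conversely, any W ⪰ 0 in (7) factors as W = (UP)^T(UP), which is the content of Lemma 2 below.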

This proves one direction of the complete characterization of S(x) being SOS with respect to u(x), to be considered as a flexible generalization of the Gram-matrix method to polynomial matrices.

Lemma 2. The matrix polynomial S(x) is SOS with respect to the monomial basis u(x) if and only if there exist symmetric S_j such that S(x) = Σ_{j=1}^{n_w} S_j w_j(x) and the linear system

    ⟨W, I_m ⊗ Z_j⟩_m = S_j, j = 1,…,n_w    (9)

has a solution W ⪰ 0.

Proof. If W ⪰ 0 satisfies (9) we can determine a Cholesky factorization U^T U of PWP^T to obtain W = (UP)^T(UP) and reverse the arguments.

4 Construction of LMI relaxation families

The constraint in (2) is equivalent to the existence of an SOS matrix S(x) and an SOS polynomial s(x) such that

    f(x) + ⟨S(x), G(x)⟩ − t = s(x) for all x ∈ ℝ^n.    (10)

With a monomial vector v(x) = (v_1(x) v_2(x) … v_{n_v}(x))^T let us represent the constraint function as

    G(x) = Σ_{i=1}^{n_v} B_i v_i(x) = B(v(x) ⊗ I_m),

where B_i ∈ S^m, i = 1,…,n_v, and B := (B_1 B_2 … B_{n_v}). Moreover, let us choose monomial vectors u(x) and y(x) of length n_u and n_y to parameterize the SOS matrix S(x) and the SOS polynomial s(x) with respect to u(x) and y(x) with W ⪰ 0 and V ⪰ 0 respectively, as in Section 3. We infer

    ⟨S(x), G(x)⟩ = Tr(S(x)G(x)) = Tr( ⟨W, I_m ⊗ [u(x)u(x)^T]⟩_m [B(v(x) ⊗ I_m)] )
                 = Tr( ⟨W, (I_m ⊗ [u(x)u(x)^T]) ([B(v(x) ⊗ I_m)] ⊗ I_{n_u})⟩_m )
                 = Tr( ⟨W, (B ⊗ I_{n_u}) (I_m ⊗ [v(x) ⊗ u(x)u(x)^T])⟩_m ).

Let us now choose the pairwise different monomials w_0(x) = 1, w_1(x),…,w_{n_w}(x) to allow for the representations

    v(x) ⊗ u(x)u(x)^T = Σ_{j=0}^{n_w} P_j w_j(x),  y(x)y(x)^T = Σ_{j=0}^{n_w} Q_j w_j(x),  f(x) = Σ_{j=0}^{n_w} a_j w_j(x)

with P_j ∈ ℝ^{(n_u n_v) × n_u}, Q_j ∈ ℝ^{n_y × n_y}, a_j ∈ ℝ, j = 0,…,n_w. Then there exist an SOS matrix S(x) and an SOS polynomial s(x) with respect to u(x) and y(x) respectively such that (10) holds true if and only if there exists a solution to the following LMI system:

    W ⪰ 0, V ⪰ 0,    (11)
    a_0 + Tr( ⟨W, (B ⊗ I_{n_u})(I_m ⊗ P_0)⟩_m ) − t = Tr(V Q_0),    (12)
    a_j + Tr( ⟨W, (B ⊗ I_{n_u})(I_m ⊗ P_j)⟩_m ) = Tr(V Q_j), j = 1,…,n_w.    (13)

Table 1: Lower bounds and optimal values of the quadratic SDP problem (rows: α_1, α_2, α_3; columns: lower bound, optimal value).

We can hence easily supremize t over these LMI constraints to determine a lower bound on the optimal value of (1). Moreover, these lower bounds are guaranteed to converge to the optimal value of (1) if we choose u(x) and y(x) to comprise all monomials up to a certain degree, and if we let the degree bound grow to infinity.

The size of the LMI relaxation for (1) is easily determined as follows. The constraints are (11), (12) and (13). The conditions in (11) on the matrices W and V to be positive semi-definite comprise inequalities in S^{m n_u} and S^{n_y} respectively. On top of that, (12) and (13) add 1 and n_w scalar equality constraints respectively. The decision variables in the LMI relaxation are the lower bound t and the matrices W ∈ S^{m n_u} and V ∈ S^{n_y} of the SOS representations. Since a symmetric matrix in S^n can be parameterized by a vector in ℝ^{n(n+1)/2}, we end up with a total of 1 + (1/2) m n_u (m n_u + 1) + (1/2) n_y (n_y + 1) scalar variables in our LMI problem.

5 Applications

5.1 A quadratic SDP

We computed lower bounds for a quadratic SDP problem in two variables x = (x_1, x_2) with a quadratic objective depending on a parameter α, subject to a 2 × 2 quadratic matrix-valued constraint G_1(x) ⪯ 0 and the scalar constraints

    g_2(x) := 0.5 − (x_1 − 0.4)² − (x_2 − 0.2)² ≤ 0,
    g_3(x) := x_1² − 1 ≤ 0,  g_4(x) := x_2² − 1 ≤ 0,

for three values of α: α_1 = 0.8, α_2 = 1.5, α_3 = 0. Here G_1(x) ⪯ 0 is a matrix-valued constraint, and g_2(x) ≤ 0 is chosen such that its feasible region (the exterior of a disc) is non-convex. The constraints g_3(x) ≤ 0 and g_4(x) ≤ 0 are merely added to restrict the decision variables x_1 and x_2 to the interval (−1, 1). For α = α_1 and α = α_2 the optimal solution lies at a point where the constraint g_2(x) ≤ 0, which has negative curvature, is active, as shown in Figure 1. For α = α_3 the optimal solution lies at a point where G_1(x) ⪯ 0 and g_2(x) ≤ 0 are both active. With SOS bases u(x) = (1 x_1 x_2)^T and y(x) = (1 x_1 x_2 x_1x_2 x_1² x_2²)^T we computed lower bounds for the three values of α as shown in Table 1.
The number of variables and constraints in our implementation of the LMI relaxation is 41 and 87 respectively. By gridding we found the optimal solutions shown in Figure 1; the corresponding optimal values are given in Table 1. From the table we observe that in this example the algorithm finds the global optimal value with only a first-order SOS basis, even though the non-convex constraint is active. Since G_1(x) ⪯ 0 is equivalent to det(G_1(x)) ≥ 0 and G_1(x)(1,1) ≤ 0, where G_1(x)(1,1) denotes the upper left element of G_1(x), we can reduce this SDP problem to a scalar polynomial problem, which can be solved with the relaxation techniques for scalar polynomial optimization [5]. Indeed, the code GloptiPoly [3] gives the same results as in Table 1 with 27 LMI variables and 252 LMI constraints. We suspect, however, that for problems with a matrix-valued

polynomial G(x) of large size, the polynomial det(G(x)) will have high degree, so that the resulting LMI relaxations will be (much) larger in terms of decision variables and constraints than in our approach, since all monomials that occur in det(G(x)) must be included in the monomial vector. Our future research is aimed at gathering numerical evidence for this conjecture.

Figure 1: Feasible region (grey filled area) and optimal solutions for α_i, i ∈ {1, 2, 3}.

5.2 Static H2 controller synthesis

Static H2 controller synthesis is a non-convex problem that is important for the practical implementation of controllers. Consider the following state-space description of a plant, in which only the closed-loop matrix A_cl depends (affinely) on the static controller matrix K ∈ ℝ^{m_2 × p_2}:

    ( A_cl(K) B_cl ; C_cl D_cl ) := ( A + B_2 K C_2, B_1 ; C_1, 0 ),

where A ∈ ℝ^{n×n}, B_1 ∈ ℝ^{n×m_1}, B_2 ∈ ℝ^{n×m_2}, C_1 ∈ ℝ^{p_1×n} and C_2 ∈ ℝ^{p_2×n}. The problem of finding the static controller with optimal closed-loop H2-norm can be written as follows:

    minimize Tr(C_cl X C_cl^T) subject to A_cl(K)X + XA_cl(K)^T + B_cl B_cl^T = 0, X ⪰ 0.    (14)

This is a semi-definite polynomial problem, which is non-convex due to the bilinear coupling of the variables X and K. We computed lower bounds for randomly generated 4th-order plants with n = 4, m_1 = 2, m_2 = 1, p_1 = 3, p_2 = 1, and computed upper bounds by gridding, as shown in Table 2. To keep the size of the LMI problems small, we used the very simple SOS bases

    u(x) = (1 k_1)^T,  y(x) = (1 k_1 svec(X)^T k_1 svec(X)^T)^T,

where svec(X) denotes the symmetric vectorization of the symmetric matrix X. The number of decision variables in the LMI is 469. Table 2 reveals that the lower bound is equal to the upper bound for 6 out of 11 cases, so that the relaxation gap is zero. This indicates that small SOS bases are often sufficient to obtain exact relaxations. Furthermore, the lower bounds are in all cases larger

Table 2: Upper bounds, lower bounds and full-order (FO) H2 performance for randomly generated 4th-order systems (columns: upper bound, lower bound, FO performance).

than the trivial lower bound given by the full-order performance, which is shown in the 3rd column. It is not yet clear whether there is a fundamental reason for the lower bounds being no worse than the full-order performance for this choice of bases.

6 Conclusions

We have shown that there exist sequences of SOS relaxations whose optimal values converge from below to the optimal value of polynomial SDP programs. Furthermore, we have discussed how these relaxations can be reformulated as LMI optimization problems with full flexibility in the choice of monomial bases. We have applied the method to two non-convex problems: an academic polynomial SDP problem and the fixed-order H2 synthesis problem. The first example illustrated that the number of LMI constraints in our relaxation is smaller than in the relaxation obtained after scalarization of the matrix-valued constraint G(x) ⪯ 0 using the principal minors. This difference in computational complexity is probably even larger for constraints on matrix-valued polynomials with many rows and columns. The H2-synthesis example illustrated the applicability to non-convex control problems. We have computed good lower bounds with remarkably simple monomial bases.

Apart from these applications, the presented convergence result is of value for a variety of other matrix-valued optimization problems. Additional examples in control are input-output selection, where integer constraints of the type p ∈ {0, 1} are replaced by the quadratic constraint p(p − 1) = 0, and spectral factorization of multidimensional transfer functions to assess dissipativity of linear shift-invariant distributed systems [8].

References

[1] G. Chesi, A. Garulli, A. Tesi, and A. Vicino. An LMI-based approach for characterizing the solution set of polynomial systems. In Proc. 39th IEEE Conf.
Decision and Control, Sydney, Australia, 2000.

[2] G. Chesi, A. Garulli, A. Tesi, and A. Vicino. Homogeneous Lyapunov functions for systems with structured uncertainties. Preprint.

[3] D. Henrion and J.B. Lasserre. Detecting global optimality and extracting solutions in GloptiPoly. Technical report, LAAS-CNRS.

[4] D. Henrion, M. Sebek, and V. Kucera. Positive polynomials and robust stabilization with fixed-order controllers. IEEE Transactions on Automatic Control, 48:1178-1186, 2003.

[5] J.B. Lasserre. Global optimization with polynomials and the problem of moments. SIAM Journal on Optimization, 11:796-817, 2001.

[6] M. Kojima. Sums of squares relaxations of polynomial semidefinite programs. Technical report, Tokyo Institute of Technology, 2003.

[7] P.A. Parrilo. Structured Semidefinite Programs and Semialgebraic Geometry Methods in Robustness and Optimization. PhD thesis, California Institute of Technology, 2000.

[8] H. Pillai and J.C. Willems. Lossless and dissipative distributed systems. SIAM J. Control Optim., 40(5):1406-1430, 2002.

[9] M. Putinar. Positive polynomials on compact semi-algebraic sets. Indiana Univ. Math. J., 42:969-984, 1993.

[10] K. Schmüdgen. The K-moment problem for compact semi-algebraic sets. Math. Ann., 289(2):203-206, 1991.

[11] M. Schweighofer. Optimization of polynomials on compact semialgebraic sets. Preprint.


More information

Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry

Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry Hilbert s 17th Problem to Semidefinite Programming & Convex Algebraic Geometry Rekha R. Thomas University of Washington, Seattle References Monique Laurent, Sums of squares, moment matrices and optimization

More information

Rank-one LMIs and Lyapunov's Inequality. Gjerrit Meinsma 4. Abstract. We describe a new proof of the well-known Lyapunov's matrix inequality about

Rank-one LMIs and Lyapunov's Inequality. Gjerrit Meinsma 4. Abstract. We describe a new proof of the well-known Lyapunov's matrix inequality about Rank-one LMIs and Lyapunov's Inequality Didier Henrion 1;; Gjerrit Meinsma Abstract We describe a new proof of the well-known Lyapunov's matrix inequality about the location of the eigenvalues of a matrix

More information

COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS. Didier HENRION henrion

COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS. Didier HENRION   henrion COURSE ON LMI PART I.2 GEOMETRY OF LMI SETS Didier HENRION www.laas.fr/ henrion October 2006 Geometry of LMI sets Given symmetric matrices F i we want to characterize the shape in R n of the LMI set F

More information

How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization

How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization CS-11-01 How to generate weakly infeasible semidefinite programs via Lasserre s relaxations for polynomial optimization Hayato Waki Department of Computer Science, The University of Electro-Communications

More information

Fast Algorithms for SDPs derived from the Kalman-Yakubovich-Popov Lemma

Fast Algorithms for SDPs derived from the Kalman-Yakubovich-Popov Lemma Fast Algorithms for SDPs derived from the Kalman-Yakubovich-Popov Lemma Venkataramanan (Ragu) Balakrishnan School of ECE, Purdue University 8 September 2003 European Union RTN Summer School on Multi-Agent

More information

CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION

CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION CONVEXITY IN SEMI-ALGEBRAIC GEOMETRY AND POLYNOMIAL OPTIMIZATION JEAN B. LASSERRE Abstract. We review several (and provide new) results on the theory of moments, sums of squares and basic semi-algebraic

More information

Local Stability Analysis for Uncertain Nonlinear Systems

Local Stability Analysis for Uncertain Nonlinear Systems 1042 IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL. 54, NO. 5, MAY 2009 Local Stability Analysis for Uncertain Nonlinear Systems Ufuk Topcu and Andrew Packard Abstract We propose a method to compute provably

More information

arxiv: v1 [math.oc] 31 Jan 2017

arxiv: v1 [math.oc] 31 Jan 2017 CONVEX CONSTRAINED SEMIALGEBRAIC VOLUME OPTIMIZATION: APPLICATION IN SYSTEMS AND CONTROL 1 Ashkan Jasour, Constantino Lagoa School of Electrical Engineering and Computer Science, Pennsylvania State University

More information

Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets

Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets Solving Global Optimization Problems with Sparse Polynomials and Unbounded Semialgebraic Feasible Sets V. Jeyakumar, S. Kim, G. M. Lee and G. Li June 6, 2014 Abstract We propose a hierarchy of semidefinite

More information

Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets

Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets Convergence rates of moment-sum-of-squares hierarchies for volume approximation of semialgebraic sets Milan Korda 1, Didier Henrion,3,4 Draft of December 1, 016 Abstract Moment-sum-of-squares hierarchies

More information

Detecting global optimality and extracting solutions in GloptiPoly

Detecting global optimality and extracting solutions in GloptiPoly Detecting global optimality and extracting solutions in GloptiPoly Didier HENRION 1,2 Jean-Bernard LASSERRE 1 1 LAAS-CNRS Toulouse 2 ÚTIA-AVČR Prague Part 1 Description of GloptiPoly Brief description

More information

Lecture 6 Verification of Hybrid Systems

Lecture 6 Verification of Hybrid Systems Lecture 6 Verification of Hybrid Systems Ufuk Topcu Nok Wongpiromsarn Richard M. Murray AFRL, 25 April 2012 Outline: A hybrid system model Finite-state abstractions and use of model checking Deductive

More information

that a broad class of conic convex polynomial optimization problems, called

that a broad class of conic convex polynomial optimization problems, called JOTA manuscript No. (will be inserted by the editor) Exact Conic Programming Relaxations for a Class of Convex Polynomial Cone-Programs Vaithilingam Jeyakumar Guoyin Li Communicated by Levent Tunçel Abstract

More information

Stability of linear time-varying systems through quadratically parameter-dependent Lyapunov functions

Stability of linear time-varying systems through quadratically parameter-dependent Lyapunov functions Stability of linear time-varying systems through quadratically parameter-dependent Lyapunov functions Vinícius F. Montagner Department of Telematics Pedro L. D. Peres School of Electrical and Computer

More information

Strange Behaviors of Interior-point Methods. for Solving Semidefinite Programming Problems. in Polynomial Optimization

Strange Behaviors of Interior-point Methods. for Solving Semidefinite Programming Problems. in Polynomial Optimization CS-08-02 Strange Behaviors of Interior-point Methods for Solving Semidefinite Programming Problems in Polynomial Optimization Hayato Waki, Maho Nakata, and Masakazu Muramatsu Department of Computer Science,

More information

Trust Region Problems with Linear Inequality Constraints: Exact SDP Relaxation, Global Optimality and Robust Optimization

Trust Region Problems with Linear Inequality Constraints: Exact SDP Relaxation, Global Optimality and Robust Optimization Trust Region Problems with Linear Inequality Constraints: Exact SDP Relaxation, Global Optimality and Robust Optimization V. Jeyakumar and G. Y. Li Revised Version: September 11, 2013 Abstract The trust-region

More information

A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint

A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint Iranian Journal of Operations Research Vol. 2, No. 2, 20, pp. 29-34 A semidefinite relaxation scheme for quadratically constrained quadratic problems with an additional linear constraint M. Salahi Semidefinite

More information

Estimating the Region of Attraction of Ordinary Differential Equations by Quantified Constraint Solving

Estimating the Region of Attraction of Ordinary Differential Equations by Quantified Constraint Solving Estimating the Region of Attraction of Ordinary Differential Equations by Quantified Constraint Solving Henning Burchardt and Stefan Ratschan October 31, 2007 Abstract We formulate the problem of estimating

More information

Formal Proofs, Program Analysis and Moment-SOS Relaxations

Formal Proofs, Program Analysis and Moment-SOS Relaxations Formal Proofs, Program Analysis and Moment-SOS Relaxations Victor Magron, Postdoc LAAS-CNRS 15 July 2014 Imperial College Department of Electrical and Electronic Eng. y b sin( par + b) b 1 1 b1 b2 par

More information

Convex Optimization. (EE227A: UC Berkeley) Lecture 28. Suvrit Sra. (Algebra + Optimization) 02 May, 2013

Convex Optimization. (EE227A: UC Berkeley) Lecture 28. Suvrit Sra. (Algebra + Optimization) 02 May, 2013 Convex Optimization (EE227A: UC Berkeley) Lecture 28 (Algebra + Optimization) 02 May, 2013 Suvrit Sra Admin Poster presentation on 10th May mandatory HW, Midterm, Quiz to be reweighted Project final report

More information

On optimal quadratic Lyapunov functions for polynomial systems

On optimal quadratic Lyapunov functions for polynomial systems On optimal quadratic Lyapunov functions for polynomial systems G. Chesi 1,A.Tesi 2, A. Vicino 1 1 Dipartimento di Ingegneria dell Informazione, Università disiena Via Roma 56, 53100 Siena, Italy 2 Dipartimento

More information

Notes on the decomposition result of Karlin et al. [2] for the hierarchy of Lasserre by M. Laurent, December 13, 2012

Notes on the decomposition result of Karlin et al. [2] for the hierarchy of Lasserre by M. Laurent, December 13, 2012 Notes on the decomposition result of Karlin et al. [2] for the hierarchy of Lasserre by M. Laurent, December 13, 2012 We present the decomposition result of Karlin et al. [2] for the hierarchy of Lasserre

More information

Distributionally robust optimization techniques in batch bayesian optimisation

Distributionally robust optimization techniques in batch bayesian optimisation Distributionally robust optimization techniques in batch bayesian optimisation Nikitas Rontsis June 13, 2016 1 Introduction This report is concerned with performing batch bayesian optimization of an unknown

More information

Representations of Positive Polynomials: Theory, Practice, and

Representations of Positive Polynomials: Theory, Practice, and Representations of Positive Polynomials: Theory, Practice, and Applications Dept. of Mathematics and Computer Science Emory University, Atlanta, GA Currently: National Science Foundation Temple University

More information

Linear Matrix Inequalities in Control

Linear Matrix Inequalities in Control Linear Matrix Inequalities in Control Delft Center for Systems and Control (DCSC) Delft University of Technology The Netherlands Department of Electrical Engineering Eindhoven University of Technology

More information

On parameter-dependent Lyapunov functions for robust stability of linear systems

On parameter-dependent Lyapunov functions for robust stability of linear systems On parameter-dependent Lyapunov functions for robust stability of linear systems Didier Henrion, Denis Arzelier, Dimitri Peaucelle, Jean-Bernard Lasserre Abstract For a linear system affected by real parametric

More information

LMI relaxations in robust control (tutorial)

LMI relaxations in robust control (tutorial) LM relaxations in robust control tutorial CW Scherer Delft Center for Systems and Control Delft University of Technology Mekelweg 2, 2628 CD Delft, The Netherlands cwscherer@dcsctudelftnl Abstract This

More information

A new look at nonnegativity on closed sets

A new look at nonnegativity on closed sets A new look at nonnegativity on closed sets LAAS-CNRS and Institute of Mathematics, Toulouse, France IPAM, UCLA September 2010 Positivstellensatze for semi-algebraic sets K R n from the knowledge of defining

More information

Are There Sixth Order Three Dimensional PNS Hankel Tensors?

Are There Sixth Order Three Dimensional PNS Hankel Tensors? Are There Sixth Order Three Dimensional PNS Hankel Tensors? Guoyin Li Liqun Qi Qun Wang November 17, 014 Abstract Are there positive semi-definite PSD) but not sums of squares SOS) Hankel tensors? If the

More information

A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials

A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials A Note on Nonconvex Minimax Theorem with Separable Homogeneous Polynomials G. Y. Li Communicated by Harold P. Benson Abstract The minimax theorem for a convex-concave bifunction is a fundamental theorem

More information

A positive definite polynomial Hessian that does not factor

A positive definite polynomial Hessian that does not factor A positive definite polynomial Hessian that does not factor The MIT Faculty has made this article openly available Please share how this access benefits you Your story matters Citation As Published Publisher

More information

Local Stability Analysis For Uncertain Nonlinear Systems Using A Branch-and-Bound Algorithm

Local Stability Analysis For Uncertain Nonlinear Systems Using A Branch-and-Bound Algorithm 28 American Control Conference Westin Seattle Hotel, Seattle, Washington, USA June 11-13, 28 ThC15.2 Local Stability Analysis For Uncertain Nonlinear Systems Using A Branch-and-Bound Algorithm Ufuk Topcu,

More information

V&V MURI Overview Caltech, October 2008

V&V MURI Overview Caltech, October 2008 V&V MURI Overview Caltech, October 2008 Pablo A. Parrilo Laboratory for Information and Decision Systems Massachusetts Institute of Technology Goals!! Specification, design, and certification!! Coherent

More information

Denis ARZELIER arzelier

Denis ARZELIER   arzelier COURSE ON LMI OPTIMIZATION WITH APPLICATIONS IN CONTROL PART II.2 LMIs IN SYSTEMS CONTROL STATE-SPACE METHODS PERFORMANCE ANALYSIS and SYNTHESIS Denis ARZELIER www.laas.fr/ arzelier arzelier@laas.fr 15

More information

Semidefinite Programming Duality and Linear Time-invariant Systems

Semidefinite Programming Duality and Linear Time-invariant Systems Semidefinite Programming Duality and Linear Time-invariant Systems Venkataramanan (Ragu) Balakrishnan School of ECE, Purdue University 2 July 2004 Workshop on Linear Matrix Inequalities in Control LAAS-CNRS,

More information

An Introduction to Linear Matrix Inequalities. Raktim Bhattacharya Aerospace Engineering, Texas A&M University

An Introduction to Linear Matrix Inequalities. Raktim Bhattacharya Aerospace Engineering, Texas A&M University An Introduction to Linear Matrix Inequalities Raktim Bhattacharya Aerospace Engineering, Texas A&M University Linear Matrix Inequalities What are they? Inequalities involving matrix variables Matrix variables

More information

Nonlinear Control Design for Linear Differential Inclusions via Convex Hull Quadratic Lyapunov Functions

Nonlinear Control Design for Linear Differential Inclusions via Convex Hull Quadratic Lyapunov Functions Nonlinear Control Design for Linear Differential Inclusions via Convex Hull Quadratic Lyapunov Functions Tingshu Hu Abstract This paper presents a nonlinear control design method for robust stabilization

More information

I.3. LMI DUALITY. Didier HENRION EECI Graduate School on Control Supélec - Spring 2010

I.3. LMI DUALITY. Didier HENRION EECI Graduate School on Control Supélec - Spring 2010 I.3. LMI DUALITY Didier HENRION henrion@laas.fr EECI Graduate School on Control Supélec - Spring 2010 Primal and dual For primal problem p = inf x g 0 (x) s.t. g i (x) 0 define Lagrangian L(x, z) = g 0

More information

Introduction to Real Analysis Alternative Chapter 1

Introduction to Real Analysis Alternative Chapter 1 Christopher Heil Introduction to Real Analysis Alternative Chapter 1 A Primer on Norms and Banach Spaces Last Updated: March 10, 2018 c 2018 by Christopher Heil Chapter 1 A Primer on Norms and Banach Spaces

More information

Polynomial level-set methods for nonlinear dynamical systems analysis

Polynomial level-set methods for nonlinear dynamical systems analysis Proceedings of the Allerton Conference on Communication, Control and Computing pages 64 649, 8-3 September 5. 5.7..4 Polynomial level-set methods for nonlinear dynamical systems analysis Ta-Chung Wang,4

More information

Introduction to Semidefinite Programming I: Basic properties a

Introduction to Semidefinite Programming I: Basic properties a Introduction to Semidefinite Programming I: Basic properties and variations on the Goemans-Williamson approximation algorithm for max-cut MFO seminar on Semidefinite Programming May 30, 2010 Semidefinite

More information

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations LAAS-CNRS and Institute of Mathematics, Toulouse, France Tutorial, IMS, Singapore 2012 LP-relaxations LP- VERSUS SDP-relaxations

More information

Approximation algorithms for nonnegative polynomial optimization problems over unit spheres

Approximation algorithms for nonnegative polynomial optimization problems over unit spheres Front. Math. China 2017, 12(6): 1409 1426 https://doi.org/10.1007/s11464-017-0644-1 Approximation algorithms for nonnegative polynomial optimization problems over unit spheres Xinzhen ZHANG 1, Guanglu

More information

Module 04 Optimization Problems KKT Conditions & Solvers

Module 04 Optimization Problems KKT Conditions & Solvers Module 04 Optimization Problems KKT Conditions & Solvers Ahmad F. Taha EE 5243: Introduction to Cyber-Physical Systems Email: ahmad.taha@utsa.edu Webpage: http://engineering.utsa.edu/ taha/index.html September

More information

Lagrange Duality. Daniel P. Palomar. Hong Kong University of Science and Technology (HKUST)

Lagrange Duality. Daniel P. Palomar. Hong Kong University of Science and Technology (HKUST) Lagrange Duality Daniel P. Palomar Hong Kong University of Science and Technology (HKUST) ELEC5470 - Convex Optimization Fall 2017-18, HKUST, Hong Kong Outline of Lecture Lagrangian Dual function Dual

More information

Hybrid System Identification via Sparse Polynomial Optimization

Hybrid System Identification via Sparse Polynomial Optimization 2010 American Control Conference Marriott Waterfront, Baltimore, MD, USA June 30-July 02, 2010 WeA046 Hybrid System Identification via Sparse Polynomial Optimization Chao Feng, Constantino M Lagoa and

More information

THEOREM OF OSELEDETS. We recall some basic facts and terminology relative to linear cocycles and the multiplicative ergodic theorem of Oseledets [1].

THEOREM OF OSELEDETS. We recall some basic facts and terminology relative to linear cocycles and the multiplicative ergodic theorem of Oseledets [1]. THEOREM OF OSELEDETS We recall some basic facts and terminology relative to linear cocycles and the multiplicative ergodic theorem of Oseledets []. 0.. Cocycles over maps. Let µ be a probability measure

More information

arxiv: v1 [math.oc] 9 Sep 2015

arxiv: v1 [math.oc] 9 Sep 2015 CONTAINMENT PROBLEMS FOR PROJECTIONS OF POLYHEDRA AND SPECTRAHEDRA arxiv:1509.02735v1 [math.oc] 9 Sep 2015 KAI KELLNER Abstract. Spectrahedra are affine sections of the cone of positive semidefinite matrices

More information

A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones

A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones Research Reports on Mathematical and Computing Sciences Series B : Operations Research Department of Mathematical and Computing Sciences Tokyo Institute of Technology 2-12-1 Oh-Okayama, Meguro-ku, Tokyo

More information

Network Utility Maximization With Nonconcave Utilities Using Sum-of-Squares Method

Network Utility Maximization With Nonconcave Utilities Using Sum-of-Squares Method Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference 2005 Seville, Spain, December 2-5, 2005 MoC4.6 Network Utility Maximization With Nonconcave Utilities

More information

Semialgebraic Relaxations using Moment-SOS Hierarchies

Semialgebraic Relaxations using Moment-SOS Hierarchies Semialgebraic Relaxations using Moment-SOS Hierarchies Victor Magron, Postdoc LAAS-CNRS 17 September 2014 SIERRA Seminar Laboratoire d Informatique de l Ecole Normale Superieure y b sin( par + b) b 1 1

More information

Linear Matrix Inequality (LMI)

Linear Matrix Inequality (LMI) Linear Matrix Inequality (LMI) A linear matrix inequality is an expression of the form where F (x) F 0 + x 1 F 1 + + x m F m > 0 (1) x = (x 1,, x m ) R m, F 0,, F m are real symmetric matrices, and the

More information

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations

Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations Moments and Positive Polynomials for Optimization II: LP- VERSUS SDP-relaxations LAAS-CNRS and Institute of Mathematics, Toulouse, France EECI Course: February 2016 LP-relaxations LP- VERSUS SDP-relaxations

More information

Convex Optimization 1

Convex Optimization 1 Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.245: MULTIVARIABLE CONTROL SYSTEMS by A. Megretski Convex Optimization 1 Many optimization objectives generated

More information

Dimension reduction for semidefinite programming

Dimension reduction for semidefinite programming 1 / 22 Dimension reduction for semidefinite programming Pablo A. Parrilo Laboratory for Information and Decision Systems Electrical Engineering and Computer Science Massachusetts Institute of Technology

More information

Optimization based robust control

Optimization based robust control Optimization based robust control Didier Henrion 1,2 Draft of March 27, 2014 Prepared for possible inclusion into The Encyclopedia of Systems and Control edited by John Baillieul and Tariq Samad and published

More information

Lecture 5. 1 Goermans-Williamson Algorithm for the maxcut problem

Lecture 5. 1 Goermans-Williamson Algorithm for the maxcut problem Math 280 Geometric and Algebraic Ideas in Optimization April 26, 2010 Lecture 5 Lecturer: Jesús A De Loera Scribe: Huy-Dung Han, Fabio Lapiccirella 1 Goermans-Williamson Algorithm for the maxcut problem

More information

9. Interpretations, Lifting, SOS and Moments

9. Interpretations, Lifting, SOS and Moments 9-1 Interpretations, Lifting, SOS and Moments P. Parrilo and S. Lall, CDC 2003 2003.12.07.04 9. Interpretations, Lifting, SOS and Moments Polynomial nonnegativity Sum of squares (SOS) decomposition Eample

More information

Nu-Gap Metric A Sum-Of-Squares and Linear Matrix Inequality Approach

Nu-Gap Metric A Sum-Of-Squares and Linear Matrix Inequality Approach Nationaal Lucht- en Ruimtevaartlaboratorium National Aerospace Laboratory NLR NLR-P-014-319 Nu-Gap Metric A Sum-Of-Squares and Linear Matrix Inequality Approach S. aamallah Nationaal Lucht- en Ruimtevaartlaboratorium

More information

Certified Roundoff Error Bounds using Semidefinite Programming

Certified Roundoff Error Bounds using Semidefinite Programming Certified Roundoff Error Bounds using Semidefinite Programming Victor Magron, CNRS VERIMAG joint work with G. Constantinides and A. Donaldson INRIA Mescal Team Seminar 19 November 2015 Victor Magron Certified

More information

A Hierarchy of Suboptimal Policies for the Multi-period, Multi-echelon, Robust Inventory Problem

A Hierarchy of Suboptimal Policies for the Multi-period, Multi-echelon, Robust Inventory Problem A Hierarchy of Suboptimal Policies for the Multi-period, Multi-echelon, Robust Inventory Problem Dimitris J. Bertsimas Dan A. Iancu Pablo A. Parrilo Sloan School of Management and Operations Research Center,

More information

Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A.

Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A. . Selected Examples of CONIC DUALITY AT WORK Robust Linear Optimization Synthesis of Linear Controllers Matrix Cube Theorem A. Nemirovski Arkadi.Nemirovski@isye.gatech.edu Linear Optimization Problem,

More information

MIT Algebraic techniques and semidefinite optimization May 9, Lecture 21. Lecturer: Pablo A. Parrilo Scribe:???

MIT Algebraic techniques and semidefinite optimization May 9, Lecture 21. Lecturer: Pablo A. Parrilo Scribe:??? MIT 6.972 Algebraic techniques and semidefinite optimization May 9, 2006 Lecture 2 Lecturer: Pablo A. Parrilo Scribe:??? In this lecture we study techniques to exploit the symmetry that can be present

More information

Rational Sums of Squares and Applications

Rational Sums of Squares and Applications f ΣR[X] 2 " f ΣQ[X] 2 Rational Sums of Squares and Applications Christopher Hillar (MSRI & Berkeley) A 2008 study found that adding a picture of a brain scan to a scientific argument about human nature

More information

Math Linear Algebra II. 1. Inner Products and Norms

Math Linear Algebra II. 1. Inner Products and Norms Math 342 - Linear Algebra II Notes 1. Inner Products and Norms One knows from a basic introduction to vectors in R n Math 254 at OSU) that the length of a vector x = x 1 x 2... x n ) T R n, denoted x,

More information

Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs

Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs Convexification of Mixed-Integer Quadratically Constrained Quadratic Programs Laura Galli 1 Adam N. Letchford 2 Lancaster, April 2011 1 DEIS, University of Bologna, Italy 2 Department of Management Science,

More information

Semidefinite Representation of Convex Sets

Semidefinite Representation of Convex Sets Mathematical Programming manuscript No. (will be inserted by the editor J. William Helton Jiawang Nie Semidefinite Representation of Convex Sets the date of receipt and acceptance should be inserted later

More information

Convex Optimization Theory. Chapter 5 Exercises and Solutions: Extended Version

Convex Optimization Theory. Chapter 5 Exercises and Solutions: Extended Version Convex Optimization Theory Chapter 5 Exercises and Solutions: Extended Version Dimitri P. Bertsekas Massachusetts Institute of Technology Athena Scientific, Belmont, Massachusetts http://www.athenasc.com

More information

Randomized Coordinate Descent Methods on Optimization Problems with Linearly Coupled Constraints

Randomized Coordinate Descent Methods on Optimization Problems with Linearly Coupled Constraints Randomized Coordinate Descent Methods on Optimization Problems with Linearly Coupled Constraints By I. Necoara, Y. Nesterov, and F. Glineur Lijun Xu Optimization Group Meeting November 27, 2012 Outline

More information

On Bounded Real Matrix Inequality Dilation

On Bounded Real Matrix Inequality Dilation On Bounded Real Matrix Inequality Dilation Solmaz Sajjadi-Kia and Faryar Jabbari Abstract We discuss a variation of dilated matrix inequalities for the conventional Bounded Real matrix inequality, and

More information

Linear and non-linear programming

Linear and non-linear programming Linear and non-linear programming Benjamin Recht March 11, 2005 The Gameplan Constrained Optimization Convexity Duality Applications/Taxonomy 1 Constrained Optimization minimize f(x) subject to g j (x)

More information