Strong Duality in Robust Semi-Definite Linear Programming under Data Uncertainty
V. Jeyakumar and G. Y. Li

March 1, 2012

Abstract. This paper develops a deterministic approach to duality for semi-definite linear programming problems in the face of data uncertainty. We establish strong duality between the robust counterpart of an uncertain semi-definite linear programming model problem and the optimistic counterpart of its uncertain dual. We prove that strong duality between the deterministic counterparts holds under a characteristic cone condition, and we show that this condition is also necessary for the validity of strong duality for every linear objective function of the original model problem. In addition, we show that a robust Slater condition alone ensures strong duality for uncertain semi-definite linear programs with affinely parameterized data. Finally, we present strong duality with computationally tractable dual programs for two classes of uncertainty: scenario uncertainty and spectral norm uncertainty.

Key words. Robust optimization, semi-definite programming under uncertainty, strong duality, linear matrix inequality problems.

AMS subject classification. 90C22, 90C25, 90C46

Research was partially supported by a grant from the Australian Research Council.

Department of Applied Mathematics, University of New South Wales, Sydney 2052, Australia. E-mail: v.jeyakumar@unsw.edu.au, g.li@unsw.edu.au
1 Introduction

A semi-definite linear programming model problem (SDP) in the absence of data uncertainty consists of minimizing a linear objective function over a linear matrix inequality constraint [21]. It is often expressed in the form

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

where the data consist of the objective vector $c \in \mathbb{R}^n$ and the matrices $A_i \in S^m$, the space of $(m \times m)$ symmetric matrices. For $A \in S^m$, $A \succeq 0$ (resp., $A \succ 0$) means that $A$ is positive semi-definite (resp., positive definite). Model problems of this form arise in a wide range of engineering applications [7, 21]. In practice, however, these data are subject to uncertainty due to modeling or forecasting errors [4, 5, 8, 18]. The robust optimization approach to examining semi-definite linear programming problems under data uncertainty, developed in this paper, is to treat the uncertainty as deterministic via uncertainty sets which are closed and convex. Stochastic approaches to optimization under uncertainty, on the other hand, start by assuming that the uncertainty has a probabilistic description; the best known technique is based on stochastic programming [20].

We consider the semi-definite linear programming model problem with data uncertainty in the constraint,

$(UP)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

and its Lagrangian dual program with the same uncertain data,

$(DP)\quad \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i = 1, \ldots, n,\ F \succeq 0 \},$

where, for each $i = 0, 1, \ldots, n$, the matrix $A_i \in S^m$ is uncertain and belongs to the uncertainty set $V_i$, a closed and convex subset of $S^m$, and $\mathrm{Tr}(A_i F)$ is the trace of the matrix $A_i F$. We study duality between this uncertain dual pair by examining their deterministic counterparts: the robust counterpart of (UP),

$(RP)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \},$

where the index set $I := \{0, 1, \ldots, n\}$, and the optimistic counterpart of the uncertain dual program (DP),

$(ODP)\quad \sup_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$
Strong duality in robust semi-definite programming states that

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} = \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}$
and it expresses, in particular, that the value of the robust counterpart (RP) (the "primal worst" value) is equal to the value of the optimistic counterpart (ODP) (the "dual best" value), and that the optimal value of the optimistic counterpart is attained.

Due to the importance of solving semi-definite programs under data uncertainty, a great deal of attention has recently been focussed on identifying and obtaining (uncertainty-immunized) robust solutions of uncertain semi-definite programs [3, 6, 8, 9, 15, 19]. More recently, strong duality has been established for robust convex programming problems with inequality constraints under uncertainty in [1]. For such inequality constrained problems, various characterizations of strong duality have also been given by the authors in [16].

In this paper, we aim to establish strong duality for linear optimization problems with uncertain linear matrix inequality constraints. We first prove strong duality between the deterministic dual pair (RP) and (ODP) under the condition that the robust characteristic cone

$D = \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$

is closed and convex. We further prove that this robust characteristic cone condition is also necessary for the validity of robust strong duality for every linear objective function $c^T x$. Related results for convex programming problems without uncertainty can be found in [12, 13, 14]. In particular, we show that a robust Slater strict feasibility condition [17], together with the convexity of the characteristic cone, ensures strong duality. However, we illustrate by an example that the robust Slater condition is, in general, not necessary for strong duality. Second, in the case of uncertain semi-definite linear programs with affinely parameterized data [4], we show that the robust characteristic cone is a convex cone and that a robust Slater condition alone ensures strong duality.
Third, we derive two classes of uncertain semi-definite linear programs for which strong duality holds under a robust Slater condition and the optimistic counterparts are computationally tractable programs: the cases of scenario uncertainty [4] and spectral norm uncertainty [1, 4].

The organization of the paper is as follows. Section 2 presents strong duality results in terms of the robust characteristic cone. Section 3 establishes strong duality for classes of uncertain semi-definite linear programs with computationally tractable optimistic counterparts, including programs with affinely parameterized data, under a robust Slater condition. Section 4 concludes with a discussion of further research on the tractability of optimistic counterparts, and the Appendix provides a sufficient condition for convexity of the characteristic cone.
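Before turning to the duality theory, the primal-dual pattern of (UP) and (DP) can be made concrete on a toy certain instance (a numerical sketch added for illustration only; the data $A_0 = I_2$, $A_1$ and $c = (1)$ below are assumptions of this sketch, not taken from the paper):

```python
import numpy as np

# Toy certain SDP (all uncertainty sets are singletons): n = 1, m = 2.
# Primal: inf { x1 : A0 + x1*A1 >= 0 },  A0 = I, A1 = [[0,1],[1,0]], c = (1,).
# Feasibility <=> eigenvalues 1 + x1 >= 0 and 1 - x1 >= 0, so the optimum is x1 = -1.
A0 = np.eye(2)
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
c = np.array([1.0])

def primal_feasible(x1, tol=1e-9):
    # A0 + x1*A1 is positive semi-definite iff its smallest eigenvalue is >= 0
    return np.linalg.eigvalsh(A0 + x1 * A1).min() >= -tol

# Dual: max { -Tr(A0 F) : Tr(A1 F) = c1, F >= 0 }.
# F = [[1/2, 1/2], [1/2, 1/2]] is dual feasible with objective -Tr(A0 F) = -1.
F = np.full((2, 2), 0.5)

assert primal_feasible(-1.0)                  # primal optimal point
assert not primal_feasible(-1.1)              # just outside the interval [-1, 1]
assert np.isclose(np.trace(A1 @ F), c[0])     # dual equality constraint
assert np.linalg.eigvalsh(F).min() >= -1e-12  # F is positive semi-definite
primal_val = c[0] * (-1.0)
dual_val = -np.trace(A0 @ F)
assert np.isclose(primal_val, dual_val)       # zero duality gap
```

Here the characteristic cone is automatically convex (singleton uncertainty sets), so strong duality is expected; the check confirms a zero gap on this instance.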
2 Strong Duality for Robust SDPs

In this section, we present strong duality results in terms of the robust characteristic cone $D \subseteq \mathbb{R}^{n+1}$, which is defined by

$D = \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}.$

We begin by showing that $D$ is indeed a cone in $\mathbb{R}^{n+1}$.

Lemma 2.1. For each $i \in I$, let $V_i \subseteq S^m$ be closed and convex. Then $D$ is a cone.

Proof. Clearly, $0 \in D$. Let $x \in D \subseteq \mathbb{R}^{n+1}$ and $\mu > 0$. Then there exist $A_i \in V_i$, $M \succeq 0$ and $r \geq 0$ such that $x_i = \mathrm{Tr}(A_i M)$, $1 \leq i \leq n$, and $x_{n+1} = \mathrm{Tr}(A_0 M) + r$. So, letting $\widetilde{M} = \mu M \succeq 0$ and $\widetilde{r} = \mu r \geq 0$, we obtain that $\mu x_i = \mathrm{Tr}(A_i \widetilde{M})$, $1 \leq i \leq n$, and $\mu x_{n+1} = \mathrm{Tr}(A_0 \widetilde{M}) + \widetilde{r}$. So $\mu x \in D$, and hence $D$ is a cone.

We now show that strong duality holds whenever the robust characteristic cone is closed and convex.

Theorem 2.1. (Strong Duality) Let $V_i \subseteq S^m$ be closed and convex, for $i \in I$, and let the robust feasible set $\mathcal{F} := \{x \in \mathbb{R}^n : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i\} \neq \emptyset$. Suppose that the robust characteristic cone $D$ is closed and convex. Then

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} = \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

Proof. [Weak duality] We first observe that, for each $x \in \mathbb{R}^n$ with $A_0 + \sum_{i=1}^n x_i A_i \succeq 0$ for all $A_i \in V_i$, $i \in I$, and for each $F \succeq 0$, $A_i \in V_i$, $i \in I \setminus \{0\}$, with $\mathrm{Tr}(A_i F) = c_i$, and $A_0 \in V_0$, we have

$c^T x = \sum_{i=1}^n x_i \mathrm{Tr}(A_i F) = \mathrm{Tr}\big( (A_0 + \sum_{i=1}^n x_i A_i) F \big) - \mathrm{Tr}(A_0 F) \geq -\mathrm{Tr}(A_0 F),$

and so the following weak duality holds:

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i \} \geq \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

[Reverse inequality] To see the reverse inequality, we may assume that $\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} > -\infty$. As the robust feasible set $\mathcal{F} \neq \emptyset$,

$\gamma := \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} \in \mathbb{R}.$
Then we see that the following implication holds:

$\big[ A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \big] \;\Rightarrow\; c^T x \geq \gamma. \quad (2.1)$

[Homogenization] Using (2.1), we now show, by the method of contradiction, that

$\big[ t \geq 0,\ t A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \big] \;\Rightarrow\; c^T x \geq t \gamma. \quad (2.2)$

Suppose, to the contrary, that there exists $(x^0, t_0) \in \mathbb{R}^n \times \mathbb{R}_+$ such that $t_0 A_0 + \sum_{i=1}^n x^0_i A_i \succeq 0$ for all $A_i \in V_i$ and $c^T x^0 - t_0 \gamma < 0$. If $t_0 > 0$, then $A_0 + \sum_{i=1}^n \frac{x^0_i}{t_0} A_i \succeq 0$ for all $A_i \in V_i$ and $c^T \big( \frac{x^0}{t_0} \big) - \gamma < 0$. This contradicts (2.1). If $t_0 = 0$, then $\sum_{i=1}^n x^0_i A_i \succeq 0$ for all $A_i \in V_i$, $i \in I \setminus \{0\}$, and $c^T x^0 < 0$. Now fix $x \in \mathcal{F}$, so that $A_0 + \sum_{i=1}^n x_i A_i \succeq 0$ for all $A_i \in V_i$, and consider $x^\alpha = x + \alpha x^0$, $\alpha \geq 0$. Then

$A_0 + \sum_{i=1}^n x^\alpha_i A_i = A_0 + \sum_{i=1}^n (x_i + \alpha x^0_i) A_i = A_0 + \sum_{i=1}^n x_i A_i + \alpha \sum_{i=1}^n x^0_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I,$

and $c^T x^\alpha - \gamma = (c^T x - \gamma) + \alpha c^T x^0 \to -\infty$ as $\alpha \to \infty$. This also contradicts (2.1), and so (2.2) holds.

Now fix $A_i \in V_i$. Define $\widetilde{c} \in \mathbb{R}^{n+1}$ by $\widetilde{c} = (c, -\gamma)$ and define $\widetilde{A} : \mathbb{R}^{n+1} \to S^m \times \mathbb{R}$ by $\widetilde{A} z = \big( t A_0 + \sum_{i=1}^n x_i A_i,\ t \big)$, for $z = (x, t) \in \mathbb{R}^n \times \mathbb{R}$. Then (2.2) can be equivalently rewritten as

$\big[ \widetilde{A} z \in S^m_+ \times \mathbb{R}_+,\ \forall A_i \in V_i \big] \;\Rightarrow\; \widetilde{c}^T z \geq 0. \quad (2.3)$

Now we show that

$(c, -\gamma) \in \overline{\mathrm{co}}\, D, \quad (2.4)$

where $\overline{\mathrm{co}}\, D$ denotes the closure of the convex hull of $D$.

[Hyperplane separation] Suppose that $(c, -\gamma) \notin \overline{\mathrm{co}}\, D$. Then, by the convex separation theorem, there exists $(\overline{x}, \overline{t}) \in (\mathbb{R}^n \times \mathbb{R}) \setminus \{0\}$ such that

$\langle u, (\overline{x}, \overline{t}) \rangle < \langle (c, -\gamma), (\overline{x}, \overline{t}) \rangle, \quad \forall u \in D.$

As $D = \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$ is a cone, we obtain that

$\langle u, (\overline{x}, \overline{t}) \rangle \leq 0 < \langle (c, -\gamma), (\overline{x}, \overline{t}) \rangle, \quad \forall u \in D.$
This gives us that $\sup_{u \in D} \langle u, (\overline{x}, \overline{t}) \rangle \leq 0$ and $\langle (c, -\gamma), (\overline{x}, \overline{t}) \rangle > 0$.

On the other hand, it can be verified that, for each $(F, r) \in S^m \times \mathbb{R}$,

$\langle (F, r), \widetilde{A}(x, t) \rangle = \langle (F, r), (t A_0 + \sum_{i=1}^n x_i A_i,\ t) \rangle = \sum_{i=1}^n x_i \mathrm{Tr}(A_i F) + (\mathrm{Tr}(A_0 F) + r) t.$

Recall that, for a linear mapping $L : \mathbb{R}^p \to \mathbb{R}^q$, $p, q \in \mathbb{N}$, its conjugate mapping $L^* : \mathbb{R}^q \to \mathbb{R}^p$ is defined by $\langle L^*(y), x \rangle = \langle y, L x \rangle$ for all $x \in \mathbb{R}^p$ and $y \in \mathbb{R}^q$. Then the conjugate mapping of $\widetilde{A}$, denoted $\widetilde{A}^*$, is given by

$\widetilde{A}^*(F, r) = (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r).$

So, letting $w = -(\overline{x}, \overline{t}) \in (\mathbb{R}^n \times \mathbb{R}) \setminus \{0\}$, we have, for all $A_i \in V_i$, $F \succeq 0$ and $r \geq 0$,

$\langle \widetilde{A} w, (F, r) \rangle = \langle \widetilde{A}^*(F, r), w \rangle = \langle u, w \rangle \geq 0, \quad u \in D,$

and

$\langle \widetilde{c}, w \rangle = -\langle (c, -\gamma), (\overline{x}, \overline{t}) \rangle < 0.$

Hence, from the bipolar theorem (the cone $S^m_+ \times \mathbb{R}_+$ being self-dual), $\widetilde{A} w \in S^m_+ \times \mathbb{R}_+$ for all $A_i \in V_i$ and $\widetilde{c}^T w < 0$. This contradicts (2.3), and so (2.4) holds.

[Dual attainment] By assumption, $\overline{\mathrm{co}}\, D = D$, and so $(c, -\gamma) \in D$. Then there exist $A_i \in V_i$, $F \succeq 0$ and $r \geq 0$ such that

$\mathrm{Tr}(A_i F) = c_i,\ 1 \leq i \leq n, \quad \text{and} \quad \gamma = -\mathrm{Tr}(A_0 F) - r \leq -\mathrm{Tr}(A_0 F).$

So

$\gamma \leq \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \},$

and hence the conclusion follows.

We now show that the requirement for strong duality, that $D$ is convex and closed, is in fact necessary and sufficient for the validity of strong duality for every linear objective function $c^T x$.

Theorem 2.2. (Characterization of Strong Duality) Let $V_i \subseteq S^m$ be closed and convex and let the robust feasible set $\mathcal{F} := \{x \in \mathbb{R}^n : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i\} \neq \emptyset$. Then the following statements are equivalent:

(i) The robust characteristic cone $D$ is closed and convex.

(ii) For each $c \in \mathbb{R}^n$,

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} = \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$
Proof. The conclusion will follow from Theorem 2.1 if we show that [(ii) $\Rightarrow$ (i)]. Assume that (ii) holds. Let $z := (z_1, \ldots, z_n, z_{n+1}) \in \mathbb{R}^{n+1}$ be such that $z \in \overline{\mathrm{co}}\, D$, and let $w = (z_1, \ldots, z_n)$. Then we claim that

$\inf \{ w^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i \} \geq -z_{n+1}. \quad (2.5)$

To see this, as $z \in \overline{\mathrm{co}}\, D$ and $D$ is a cone in $\mathbb{R}^{n+1}$, there exist $z^{jk} \in D$ and $\lambda_{jk} \in [0, 1]$, $j = 1, \ldots, n+1$, with $\sum_{j=1}^{n+1} \lambda_{jk} = 1$, such that $\sum_{j=1}^{n+1} \lambda_{jk} z^{jk} \to z$ as $k \to \infty$. Since $z^{jk} = (z^{jk}_1, \ldots, z^{jk}_n, z^{jk}_{n+1}) \in D = \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$, there exist $A^{jk}_i \in V_i$, $i \in I$, $F^{jk} \succeq 0$ and $r^{jk} \geq 0$ such that

$z^{jk}_i = \mathrm{Tr}(A^{jk}_i F^{jk}),\ 1 \leq i \leq n, \quad \text{and} \quad z^{jk}_{n+1} = \mathrm{Tr}(A^{jk}_0 F^{jk}) + r^{jk}.$

Let $w^{jk} = (z^{jk}_1, \ldots, z^{jk}_n)$. Then $\sum_{j=1}^{n+1} \lambda_{jk} w^{jk} \to w$ and

$\Big( \sum_{j=1}^{n+1} \lambda_{jk} w^{jk} \Big)^T x = \sum_{j=1}^{n+1} \lambda_{jk} \sum_{i=1}^n z^{jk}_i x_i = \sum_{j=1}^{n+1} \lambda_{jk} \sum_{i=1}^n \mathrm{Tr}(x_i A^{jk}_i F^{jk}).$

So, for each $x = (x_1, \ldots, x_n)$ with $A_0 + \sum_{i=1}^n x_i A_i \succeq 0$ for all $A_i \in V_i$, we have

$\Big( \sum_{j=1}^{n+1} \lambda_{jk} w^{jk} \Big)^T x = \sum_{j=1}^{n+1} \lambda_{jk} \mathrm{Tr}\Big( \big( A^{jk}_0 + \sum_{i=1}^n x_i A^{jk}_i \big) F^{jk} \Big) - \sum_{j=1}^{n+1} \lambda_{jk} \mathrm{Tr}(A^{jk}_0 F^{jk}) \geq -\sum_{j=1}^{n+1} \lambda_{jk} \big( z^{jk}_{n+1} - r^{jk} \big) \geq -\sum_{j=1}^{n+1} \lambda_{jk} z^{jk}_{n+1}.$

Passing to the limit, and noting that $\sum_{j=1}^{n+1} \lambda_{jk} w^{jk} \to w = (z_1, \ldots, z_n)$ and $\sum_{j=1}^{n+1} \lambda_{jk} z^{jk}_{n+1} \to z_{n+1}$, we obtain that, for each $x = (x_1, \ldots, x_n)$ with $A_0 + \sum_{i=1}^n x_i A_i \succeq 0$ for all $A_i \in V_i$, $w^T x \geq -z_{n+1}$. So (2.5) follows.

Now, applying (ii) with $c = w$, where $w = (z_1, \ldots, z_n)$, we obtain that

$\max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = z_i,\ i \in I \setminus \{0\},\ F \succeq 0 \} \geq -z_{n+1}.$

So there exist $F \succeq 0$ and $A_i \in V_i$, $i \in I$, with $\mathrm{Tr}(A_i F) = z_i$, such that $z_{n+1} \geq \mathrm{Tr}(A_0 F)$.
This implies that, with $r = z_{n+1} - \mathrm{Tr}(A_0 F) \geq 0$,

$z = (z_1, \ldots, z_n, z_{n+1}) \in D := \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}.$

This shows that $\overline{\mathrm{co}}\, D \subseteq D$, and so $\overline{\mathrm{co}}\, D = D$. So $D$ is closed and convex.

Remark 2.1. We now show that the more general case, i.e., linear semi-definite problems in which uncertainty occurs both in the objective function and in the constraints, can also be handled by our approach. Indeed, assume that the datum $c$ in the objective function $c^T x$ is also uncertain and belongs to an uncertainty set $C$. The corresponding uncertain semi-definite linear program is

$\inf_{(x_1, \ldots, x_n)^T \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

where the data $(c, A_i)$ belong to the uncertainty set $C \times V_i$, $i = 0, 1, \ldots, n$. This can be equivalently rewritten as

$\inf_{(x_1, \ldots, x_{n+1})^T \in \mathbb{R}^{n+1}} \{ x_{n+1} : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ x_{n+1} - c^T x \geq 0 \},$

with the same uncertainty sets. Writing the extra scalar inequality as an additional diagonal block, this situation can be modeled as the following uncertain linear semi-definite problem, in the variables $(x_1, \ldots, x_{n+1})$, in which uncertainty occurs only in the constraint:

$\inf_{(x_1, \ldots, x_{n+1})^T \in \mathbb{R}^{n+1}} \{ x_{n+1} : \widetilde{A}_0 + \sum_{i=1}^{n+1} x_i \widetilde{A}_i \succeq 0 \},$

where each matrix $\widetilde{A}_i$, $i = 0, 1, \ldots, n+1$, is uncertain and belongs to a block-diagonal uncertainty set $\widetilde{V}_i$: $\widetilde{V}_0$ collects the matrices $\mathrm{diag}(0, A_0)$ with $A_0 \in V_0$, $\widetilde{V}_i$ collects the matrices $\mathrm{diag}(-c_i, A_i)$ with $(c, A_i) \in C \times V_i$ for $i = 1, \ldots, n$, and $\widetilde{V}_{n+1}$ is the singleton $\{\mathrm{diag}(1, 0)\}$.
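The block-diagonal epigraph construction of Remark 2.1 can be sanity-checked numerically. The sketch below (illustrative only; the instance data $c$, $A_0$, $A_1$ and the $1 \times 1$ epigraph block are assumptions of this note, one concrete realization of the block-diagonal idea) verifies, for one fixed scenario of the data, that the augmented LMI in $(x_1, t)$ is feasible exactly when the original LMI holds and $t \geq c^T x$:

```python
import numpy as np

def blkdiag(a, B):
    # Block-diagonal matrix of a 1x1 block [a] and an m x m block B.
    m = B.shape[0]
    out = np.zeros((m + 1, m + 1))
    out[0, 0] = a
    out[1:, 1:] = B
    return out

# Fixed data for one scenario (c, A0, A1): n = 1, m = 2.
c = np.array([2.0])
A0 = np.eye(2)
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])

# Augmented matrices for the variable z = (x1, t).
At0 = blkdiag(0.0, A0)                 # constant term
At1 = blkdiag(-c[0], A1)               # multiplies x1 (carries -c in the 1x1 block)
At2 = blkdiag(1.0, np.zeros((2, 2)))   # multiplies t

def psd(M, tol=1e-9):
    return np.linalg.eigvalsh(M).min() >= -tol

# The augmented LMI holds iff the original LMI holds AND t >= c^T x.
for x1 in np.linspace(-2, 2, 17):
    for t in np.linspace(-5, 5, 21):
        aug = psd(At0 + x1 * At1 + t * At2)
        orig = psd(A0 + x1 * A1) and (t - c[0] * x1 >= -1e-9)
        assert aug == orig
```

The equivalence is exact because the eigenvalues of a block-diagonal matrix are the union of the eigenvalues of its blocks.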
In the special case of (UP) without data uncertainty, where each $V_i$, $i \in I$, is a singleton set $\{A_i\}$, Theorem 2.2 yields the recent characterization of strong duality for semi-definite linear programs given in [13].

Corollary 2.1. Let the feasible set $\mathcal{F} := \{x \in \mathbb{R}^n : A_0 + \sum_{i=1}^n x_i A_i \succeq 0\} \neq \emptyset$. Then the following statements are equivalent:

(i) The cone $D_0 := \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$ is closed.

(ii) For each $c \in \mathbb{R}^n$,

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \} = \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i = 1, \ldots, n,\ F \succeq 0 \}.$

Proof. For each $i \in I$, let $V_i$ be the singleton set $\{A_i\}$. Then the cone $D$ collapses to the cone $D_0$. Note that $D_0$ is always convex. So the conclusion follows from Theorem 2.2.

We also show that, under the robust Slater condition $\{x \in \mathbb{R}^n : A_0 + \sum_{i=1}^n x_i A_i \succ 0,\ \forall A_i \in V_i\} \neq \emptyset$, the characteristic cone $D$ is closed, and that strong duality is then characterized by the convexity of $D$.

Theorem 2.3. Let $V_i \subseteq S^m$ be compact and convex. Suppose that $\{x \in \mathbb{R}^n : A_0 + \sum_{i=1}^n x_i A_i \succ 0,\ \forall A_i \in V_i\} \neq \emptyset$. Then the following statements hold:

(i) The characteristic cone $D$ is closed.

(ii) The cone $D$ is convex if and only if, for each $c \in \mathbb{R}^n$,

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in V_i,\ i \in I \} = \max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

Proof. (i) Let $\{x^k\} \subseteq D \subseteq \mathbb{R}^{n+1}$, with $x^k = (x^k_1, \ldots, x^k_{n+1})$ and $x^k \to x = (x_1, \ldots, x_{n+1})$. Then there exist $A^k_i \in V_i$, $i \in I$, $F^k \succeq 0$ and $r^k \geq 0$ such that

$x^k_i = \mathrm{Tr}(A^k_i F^k),\ 1 \leq i \leq n, \quad (2.6)$

and

$x^k_{n+1} = \mathrm{Tr}(A^k_0 F^k) + r^k. \quad (2.7)$

As each $V_i$ is compact, by passing to a subsequence if necessary, we may assume that $A^k_i \to A_i \in V_i$. We first show that $\{\|F^k\|\}$ is a bounded sequence. Otherwise, without loss of generality, we may assume that $\|F^k\| \to \infty$ and $\frac{F^k}{\|F^k\|} \to F \in S^m_+ \setminus \{0\}$, where $S^m_+ = \{A \in S^m : A \succeq 0\}$. Dividing both sides of (2.6) and (2.7) by $\|F^k\|$ and passing to the limit, we obtain that

$\mathrm{Tr}(A_i F) = 0,\ 1 \leq i \leq n, \quad \text{and} \quad \mathrm{Tr}(A_0 F) = -\lim_{k \to \infty} \frac{r^k}{\|F^k\|} \leq 0.$
Since the robust Slater condition holds, there exists $x^0 \in \mathbb{R}^n$ such that $A_0 + \sum_{i=1}^n x^0_i A_i \succ 0$ for all $A_i \in V_i$, and

$\mathrm{Tr}\big( (A_0 + \sum_{i=1}^n x^0_i A_i) F \big) = \mathrm{Tr}(A_0 F) + \sum_{i=1}^n x^0_i \mathrm{Tr}(A_i F) \leq 0.$

On the other hand, because $F \in S^m_+ \setminus \{0\}$ and $A_0 + \sum_{i=1}^n x^0_i A_i \succ 0$, we have $\mathrm{Tr}\big( (A_0 + \sum_{i=1}^n x^0_i A_i) F \big) > 0$. This is a contradiction, and so $\{\|F^k\|\}$ is a bounded sequence. Hence, by (2.7), $\{r^k\}$ is also bounded. So, by passing to subsequences if necessary, we can assume that $F^k \to F$ and $r^k \to r$. Now, passing to the limit in (2.6) and (2.7), we obtain that

$x_i = \mathrm{Tr}(A_i F),\ 1 \leq i \leq n, \quad \text{and} \quad x_{n+1} = \mathrm{Tr}(A_0 F) + r. \quad (2.8)$

Thus $x \in D$, and so $D$ is closed.

(ii) This will follow from (i) and the strong duality Theorem 2.2.

We now provide two examples. The first example illustrates a situation where the robust Slater condition fails while the cone $D$ is closed and convex and strong duality holds. The second example shows that, in general, the characteristic cone can be nonconvex.

Example 2.1. Consider the following robust semi-definite linear programming problem:

$(E_1)\quad \inf_{x \in \mathbb{R}^2} \{ c_1 x_1 + c_2 x_2 : A_0 + x_1 A_1 + x_2 A_2 \succeq 0,\ \forall A_i \in V_i,\ i = 0, 1, 2 \},$

where $V_0 = \{E_{11}\}$, $V_1 = \Big\{ \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & a \\ 0 & a & 0 \end{pmatrix} : a \in [0, 1] \Big\}$ and $V_2 = \{E_{11}\}$, with $E_{11}$ the $3 \times 3$ matrix whose $(1,1)$ entry is one and whose other entries are zero. Clearly, for any $A_i \in V_i$, $i = 0, 1, 2$,

$A_0 + x_1 A_1 + x_2 A_2 = \begin{pmatrix} 1 + x_2 & 0 & 0 \\ 0 & 0 & a x_1 \\ 0 & a x_1 & 0 \end{pmatrix}.$

Note that any positive definite matrix must have positive diagonal elements. So, for each $A_i \in V_i$, $i = 0, 1, 2$, $A_0 + x_1 A_1 + x_2 A_2$ is not positive definite, and so the robust Slater condition fails. However,

$D = \bigcup_{A_i \in V_i} \{ (\mathrm{Tr}(A_1 F), \mathrm{Tr}(A_2 F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \} = \bigcup_{a \in [0,1]} \Big\{ (2 a f_5,\ f_1,\ f_1 + r) : F = \begin{pmatrix} f_1 & f_2 & f_3 \\ f_2 & f_4 & f_5 \\ f_3 & f_5 & f_6 \end{pmatrix} \succeq 0,\ r \geq 0 \Big\} = \mathbb{R} \times \{ (u, v) \in \mathbb{R}^2 : v \geq u \geq 0 \},$
which is closed and convex. On the other hand, the robust feasible set $\mathcal{F}$ is given by

$\mathcal{F} = \{(x_1, x_2) : A_0 + x_1 A_1 + x_2 A_2 \succeq 0,\ \forall A_i \in V_i,\ i = 0, 1, 2\} = \Big\{(x_1, x_2) : \begin{pmatrix} 1 + x_2 & 0 & 0 \\ 0 & 0 & a x_1 \\ 0 & a x_1 & 0 \end{pmatrix} \succeq 0,\ \forall a \in [0, 1]\Big\} = \{(x_1, x_2) : x_2 \geq -1,\ x_1 = 0\}.$

So we have

$\inf_{x \in \mathbb{R}^2} \{ c_1 x_1 + c_2 x_2 : A_0 + x_1 A_1 + x_2 A_2 \succeq 0,\ \forall A_i \in V_i,\ i = 0, 1, 2 \} = \begin{cases} -\infty, & \text{if } c_2 < 0, \\ -c_2, & \text{if } c_2 \geq 0. \end{cases}$

Moreover, the optimistic counterpart of its uncertain dual is

$(ODP_1)\quad \max_{A_i \in V_i,\, i = 0, 1, 2}\ \max_{F \in S^3} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i = 1, 2,\ F \succeq 0 \} = \max_{a \in [0, 1]}\ \max_{F \in S^3} \Big\{ -f_1 : 2 a f_5 = c_1,\ f_1 = c_2,\ F = \begin{pmatrix} f_1 & f_2 & f_3 \\ f_2 & f_4 & f_5 \\ f_3 & f_5 & f_6 \end{pmatrix} \succeq 0 \Big\}.$

Note that $F \succeq 0$ implies that $f_1 \geq 0$. So, if $c_2 < 0$, then the feasible set of $(ODP_1)$ is empty, and hence its value is $-\infty$. Moreover, if $c_2 \geq 0$, then the feasible set is nonempty and $\max (ODP_1) = -c_2$. Thus

$\inf_{x \in \mathbb{R}^2} \{ c_1 x_1 + c_2 x_2 : A_0 + x_1 A_1 + x_2 A_2 \succeq 0,\ \forall A_i \in V_i,\ i = 0, 1, 2 \} = \max_{A_i \in V_i,\, i = 0, 1, 2}\ \max_{F \in S^3} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i = 1, 2,\ F \succeq 0 \} = \begin{cases} -\infty, & \text{if } c_2 < 0, \\ -c_2, & \text{if } c_2 \geq 0, \end{cases}$

and strong duality holds.

Example 2.2. Consider the following uncertain semi-definite linear program:

$\min \{ x_1 + x_2 : A_0 + x_1 A_1 + x_2 A_2 \succeq 0 \},$

where the matrix data $A_0, A_1, A_2$ are uncertain, with $A_0 \in V_0 = \Big\{ \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \Big\}$, $A_1 \in V_1 = \Big\{ \begin{pmatrix} 0 & a \\ a & 0 \end{pmatrix} : a \in [0, 1] \Big\}$ and $A_2 \in V_2 = \Big\{ \begin{pmatrix} 0 & b \\ b & 0 \end{pmatrix} : b \in [-1, 0] \Big\}$. In this case, the characteristic cone $D$ becomes

$D = \bigcup_{A_i \in V_i} \{ (\mathrm{Tr}(A_1 F), \mathrm{Tr}(A_2 F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \} = \bigcup_{a \in [0,1],\, b \in [-1,0]} \Big\{ (2 a f_2,\ 2 b f_2,\ r) : F = \begin{pmatrix} f_1 & f_2 \\ f_2 & f_3 \end{pmatrix} \succeq 0,\ r \geq 0 \Big\} = \bigcup_{a \in [0,1],\, b \in [-1,0]} \{ (2 a f_2,\ 2 b f_2,\ r) : f_1 \geq 0,\ f_3 \geq 0,\ f_2^2 \leq f_1 f_3,\ r \geq 0 \}.$
It can be easily verified that $(0, 2, 0) \in D$ (by letting $a = 0$, $b = -1$, $f_2 = -1$, $f_1 = f_3 = 2$ and $r = 0$) and $(2, 0, 0) \in D$ (by letting $a = 1$, $b = 0$, $f_2 = 1$, $f_1 = f_3 = 2$ and $r = 0$). However, we now show that their midpoint $(1, 1, 0) \notin D$. Otherwise, we would have $2 a f_2 = 1$ and $2 b f_2 = 1$ for some $a \in [0, 1]$ and $b \in [-1, 0]$. This implies that $a \neq 0$, $b \neq 0$ and $f_2 \neq 0$; and, since $a f_2 = b f_2 = \frac{1}{2}$, that $a = b$. As $a \geq 0$ and $b \leq 0$, this enforces $a = b = 0$, which is impossible. Thus the characteristic cone is not convex.

We now verify that strong duality between the robust counterpart and the optimistic counterpart of the dual fails. To see this, note that the robust counterpart is

$\min \{ x_1 + x_2 : A_0 + x_1 A_1 + x_2 A_2 \succeq 0,\ \forall A_i \in V_i,\ i = 0, 1, 2 \}.$

This can be equivalently written as

$\min \Big\{ x_1 + x_2 : \begin{pmatrix} 0 & a x_1 + b x_2 \\ a x_1 + b x_2 & 0 \end{pmatrix} \succeq 0,\ \forall a \in [0, 1],\ b \in [-1, 0] \Big\}.$

Note that $(0, 0)$ is the only feasible point of the robust counterpart, so its optimal value is $0$. On the other hand, the optimistic counterpart of the dual is

$\max_{A_i \in V_i,\, i = 0, 1, 2}\ \max_{F \in S^2} \Big\{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = 1,\ i = 1, 2,\ F = \begin{pmatrix} f_1 & f_2 \\ f_2 & f_3 \end{pmatrix} \succeq 0 \Big\} = \max_{a \in [0,1],\, b \in [-1,0]}\ \max_{F \in S^2} \{ 0 : 2 a f_2 = 1 \text{ and } 2 b f_2 = 1 \}.$

As shown above, there are no $a \in [0, 1]$ and $b \in [-1, 0]$ such that $2 a f_2 = 1$ and $2 b f_2 = 1$. So the feasible set of the optimistic dual is empty, and hence strong duality fails between the robust counterpart and the optimistic dual.

From the above example, we see that the characteristic cone can be nonconvex in general. A sufficient condition ensuring the convexity of the characteristic cone can be found in the Appendix. Finally, it is worth noting that, like the robust counterparts (RP), the optimistic counterparts (ODP) can, in general, be difficult to solve for certain classes of uncertainties [4]. This prompts us to examine, in the next section, classes of robust SDPs whose optimistic counterparts are easily solvable.
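The nonconvexity argument of Example 2.2 can be checked by brute force (a sketch assuming the $2 \times 2$ data reconstructed above): the endpoints $(0,2,0)$ and $(2,0,0)$ are exhibited in $D$, and a grid scan confirms that the two equations $2 a f_2 = 1$ and $2 b f_2 = 1$, which membership of the midpoint $(1,1,0)$ would require, are never satisfied simultaneously.

```python
import numpy as np

# D = U_{a in [0,1], b in [-1,0]} { (2*a*f2, 2*b*f2, r) : F = [[f1,f2],[f2,f3]] >= 0, r >= 0 }

def in_D(z, a, b, f1, f2, f3, r, tol=1e-9):
    F = np.array([[f1, f2], [f2, f3]])
    ok_F = np.linalg.eigvalsh(F).min() >= -tol     # F positive semi-definite
    return ok_F and r >= 0 and np.allclose(z, [2*a*f2, 2*b*f2, r], atol=tol)

# The two endpoints, with the certificates given in the example:
assert in_D([0.0, 2.0, 0.0], a=0.0, b=-1.0, f1=2.0, f2=-1.0, f3=2.0, r=0.0)
assert in_D([2.0, 0.0, 0.0], a=1.0, b=0.0,  f1=2.0, f2=1.0,  f3=2.0, r=0.0)

# Midpoint (1, 1, 0): 2*a*f2 = 1 forces a*f2 > 0 while 2*b*f2 = 1 forces b*f2 > 0;
# with a >= 0 and b <= 0 these sign conditions are incompatible.  A grid scan over
# (a, b, f2) confirms the combined residual is bounded away from zero.
best = min(abs(2*a*f2 - 1.0) + abs(2*b*f2 - 1.0)
           for a in np.linspace(0, 1, 21)
           for b in np.linspace(-1, 0, 21)
           for f2 in np.linspace(-3, 3, 25))
assert best > 0.99   # at least one of the two equations is always badly violated
```

The scan is only a finite check, but it matches the sign argument proved in the example.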
3 Duality with Tractable Optimistic Dual

In this section, we focus on classes of uncertain semi-definite linear programs for which the optimistic counterparts are computationally tractable and strong duality can be easily verified. It should be noted that the computational tractability of the primal robust SDPs may also be checked independently for these classes. We do not intend to identify cases of robust SDPs where duality holds and the optimistic counterparts are easily solvable but the robust counterparts are not necessarily tractable.
Let us first consider the uncertain semi-definite programming problem with spectral norm uncertainty:

$(P_s)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

where $A_i \in \widetilde{V}_i := \{ A_i + \rho_i \Delta_i : \Delta_i \in S^m,\ \|\Delta_i\|_{\mathrm{spec}} \leq 1 \}$, with $A_i \in S^m$, $\rho_i \geq 0$, and $\|\Delta\|_{\mathrm{spec}}$ denoting the spectral norm of $\Delta$, i.e., the square root of the largest eigenvalue of $\Delta^T \Delta$. In this case, the uncertainty set $\widetilde{V}_i$ is just a closed ball with center $A_i$ and radius $\rho_i$ in the matrix space $S^m$ equipped with the spectral norm. The robust counterpart of this uncertain SDP problem is

$(RP_s)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in \widetilde{V}_i,\ i \in I \}.$

The optimistic counterpart of the dual of $(RP_s)$ is

$(ODP_s)\quad \max_{\|\Delta_i\|_{\mathrm{spec}} \leq 1,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 + \rho_0 \Delta_0) F) : \mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

In this case also, we show that the robust characteristic cone $D$ is a convex cone.

Proposition 3.1. Let $\widetilde{V}_i = \{ A_i + \rho_i \Delta_i : \Delta_i \in S^m,\ \|\Delta_i\|_{\mathrm{spec}} \leq 1 \}$, $A_i \in S^m$. Then the robust characteristic cone

$D := \bigcup_{A_i \in \widetilde{V}_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$

is convex.

Proof. First, we note that

$D = \bigcup_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ (\mathrm{Tr}((A_1 + \rho_1 \Delta_1) F), \ldots, \mathrm{Tr}((A_n + \rho_n \Delta_n) F), \mathrm{Tr}((A_0 + \rho_0 \Delta_0) F) + r) : F \succeq 0,\ r \geq 0 \}.$

To see the convexity of $D$, let $x = (x_1, \ldots, x_{n+1}) \in D$ and $y = (y_1, \ldots, y_{n+1}) \in D$. As $D$ is a cone (by Lemma 2.1), it suffices to show that $x + y \in D$. Since $x, y \in D$, there exist $H_i, C_i \in \{ \Delta : \|\Delta\|_{\mathrm{spec}} \leq 1 \}$, $i \in I$, $M, N \succeq 0$ and $r, s \geq 0$ such that

$x_i = \mathrm{Tr}((A_i + \rho_i H_i) M),\ 1 \leq i \leq n, \quad x_{n+1} = \mathrm{Tr}((A_0 + \rho_0 H_0) M) + r$

and

$y_i = \mathrm{Tr}((A_i + \rho_i C_i) N),\ 1 \leq i \leq n, \quad y_{n+1} = \mathrm{Tr}((A_0 + \rho_0 C_0) N) + s.$

Next, we show that, for each fixed $F \succeq 0$ and each $i \in I$,

$[\mathrm{Tr}((A_i - \rho_i I_m) F),\ \mathrm{Tr}((A_i + \rho_i I_m) F)] = \bigcup_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ \mathrm{Tr}((A_i + \rho_i \Delta_i) F) \}. \quad (3.9)$
To see this, as $-I_m, I_m \in \{ \Delta_i : \|\Delta_i\|_{\mathrm{spec}} \leq 1 \}$, where $I_m$ is the $(m \times m)$ identity matrix, and, for each fixed $F \succeq 0$, $\Delta_i \mapsto \mathrm{Tr}((A_i + \rho_i \Delta_i) F)$ is a continuous function, it follows by the intermediate value theorem that, for each fixed $F \succeq 0$,

$[\mathrm{Tr}((A_i - \rho_i I_m) F),\ \mathrm{Tr}((A_i + \rho_i I_m) F)] \subseteq \bigcup_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ \mathrm{Tr}((A_i + \rho_i \Delta_i) F) \},\ i \in I.$

To see the reverse inclusion, note that, for each fixed $F \succeq 0$ (see [2, Proposition 2.1]),

$\sup_{\|\Delta\|_{\mathrm{spec}} \leq 1} \mathrm{Tr}(\Delta F) = \mathrm{Tr}(F) \quad \text{and} \quad \inf_{\|\Delta\|_{\mathrm{spec}} \leq 1} \mathrm{Tr}(\Delta F) = -\mathrm{Tr}(F).$

So the reverse inclusion follows, as

$\sup_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ \mathrm{Tr}((A_i + \rho_i \Delta_i) F) \} = \mathrm{Tr}((A_i + \rho_i I_m) F) \quad \text{and} \quad \inf_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ \mathrm{Tr}((A_i + \rho_i \Delta_i) F) \} = \mathrm{Tr}((A_i - \rho_i I_m) F).$

Therefore, (3.9) gives us that

$x_i \in [\mathrm{Tr}((A_i - \rho_i I_m) M),\ \mathrm{Tr}((A_i + \rho_i I_m) M)],\ 1 \leq i \leq n, \quad x_{n+1} - r \in [\mathrm{Tr}((A_0 - \rho_0 I_m) M),\ \mathrm{Tr}((A_0 + \rho_0 I_m) M)]$

and

$y_i \in [\mathrm{Tr}((A_i - \rho_i I_m) N),\ \mathrm{Tr}((A_i + \rho_i I_m) N)],\ 1 \leq i \leq n, \quad y_{n+1} - s \in [\mathrm{Tr}((A_0 - \rho_0 I_m) N),\ \mathrm{Tr}((A_0 + \rho_0 I_m) N)].$

Then

$x_i + y_i \in [\mathrm{Tr}((A_i - \rho_i I_m)(M + N)),\ \mathrm{Tr}((A_i + \rho_i I_m)(M + N))],\ 1 \leq i \leq n,$

and

$x_{n+1} + y_{n+1} - (r + s) \in [\mathrm{Tr}((A_0 - \rho_0 I_m)(M + N)),\ \mathrm{Tr}((A_0 + \rho_0 I_m)(M + N))].$

Now, applying (3.9) with $F = M + N \succeq 0$, we can find $\Delta_i$ with $\|\Delta_i\|_{\mathrm{spec}} \leq 1$ such that

$x_i + y_i = \mathrm{Tr}((A_i + \rho_i \Delta_i)(M + N)),\ 1 \leq i \leq n, \quad x_{n+1} + y_{n+1} - (r + s) = \mathrm{Tr}((A_0 + \rho_0 \Delta_0)(M + N)).$

Hence $x + y \in D$.

We finally establish that strong duality holds for $(P_s)$ under the robust Slater condition, and that the optimistic counterpart $(ODP_s)$ reduces to a single semi-definite linear program.

Theorem 3.1. For $(P_s)$, suppose that there exists $x^0 = (x^0_1, \ldots, x^0_n) \in \mathbb{R}^n$ such that

$A_0 + \sum_{i=1}^n x^0_i A_i \succ 0,\ \forall A_i \in \widetilde{V}_i,\ i \in I.$

Then

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in \widetilde{V}_i \} = \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 - \rho_0 I_m) F) : \mathrm{Tr}((A_i - \rho_i I_m) F) \leq c_i \leq \mathrm{Tr}((A_i + \rho_i I_m) F),\ i \in I \setminus \{0\},\ F \succeq 0 \}.$
Proof. From Proposition 3.1 and Theorem 2.3, we see that the cone $D$ is closed and convex. Then, by Theorem 2.1,

$\inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in \widetilde{V}_i \} = \max_{\|\Delta_i\|_{\mathrm{spec}} \leq 1,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 + \rho_0 \Delta_0) F) : \mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

Since (3.9) holds for each $F \succeq 0$,

$[\mathrm{Tr}((A_i - \rho_i I_m) F),\ \mathrm{Tr}((A_i + \rho_i I_m) F)] = \bigcup_{\|\Delta_i\|_{\mathrm{spec}} \leq 1} \{ \mathrm{Tr}((A_i + \rho_i \Delta_i) F) \},\ i \in I, \quad (3.10)$

the optimistic counterpart can be simplified as

$\max_{\|\Delta_i\|_{\mathrm{spec}} \leq 1,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 + \rho_0 \Delta_0) F) : \mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}$
$= \max_{\|\Delta_i\|_{\mathrm{spec}} \leq 1,\, i \in I \setminus \{0\}}\ \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 - \rho_0 I_m) F) : \mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}$
$= \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 - \rho_0 I_m) F) : \mathrm{Tr}((A_i - \rho_i I_m) F) \leq c_i \leq \mathrm{Tr}((A_i + \rho_i I_m) F),\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

Indeed, for each $F \succeq 0$ with $\mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i$ for some $\|\Delta_i\|_{\mathrm{spec}} \leq 1$, we have

$\mathrm{Tr}((A_i - \rho_i I_m) F) \leq c_i \quad \text{and} \quad \mathrm{Tr}((A_i + \rho_i I_m) F) \geq c_i,\ i \in I \setminus \{0\}.$

Then

$\max_{\|\Delta_i\|_{\mathrm{spec}} \leq 1,\, i \in I \setminus \{0\}}\ \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 - \rho_0 I_m) F) : \mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \} \leq \max_{F \in S^m} \{ -\mathrm{Tr}((A_0 - \rho_0 I_m) F) : \mathrm{Tr}((A_i - \rho_i I_m) F) \leq c_i \leq \mathrm{Tr}((A_i + \rho_i I_m) F),\ i \in I \setminus \{0\},\ F \succeq 0 \}.$

On the other hand, for any $F \succeq 0$ with $\mathrm{Tr}((A_i - \rho_i I_m) F) \leq c_i \leq \mathrm{Tr}((A_i + \rho_i I_m) F)$, $i \in I \setminus \{0\}$, equation (3.10) implies that there exist $\Delta_i$ with $\|\Delta_i\|_{\mathrm{spec}} \leq 1$ such that $\mathrm{Tr}((A_i + \rho_i \Delta_i) F) = c_i$. This gives the reverse inequality, and hence the equality holds.
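The simplification above hinges on the extremal trace identity $\sup_{\|\Delta\|_{\mathrm{spec}} \leq 1} \mathrm{Tr}(\Delta F) = \mathrm{Tr}(F)$ and $\inf_{\|\Delta\|_{\mathrm{spec}} \leq 1} \mathrm{Tr}(\Delta F) = -\mathrm{Tr}(F)$ for $F \succeq 0$ (see [2, Proposition 2.1]). A quick numerical sanity check (a sketch on a random instance; the instance itself is an assumption of this note):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

# Random positive semi-definite matrix F.
G = rng.standard_normal((m, m))
F = G @ G.T

def spec_norm(D):
    # Spectral norm of a symmetric matrix = largest absolute eigenvalue.
    return np.abs(np.linalg.eigvalsh(D)).max()

# Delta = +I and Delta = -I attain the endpoints of the interval in (3.9):
assert np.isclose(np.trace(np.eye(m) @ F), np.trace(F))
assert np.isclose(np.trace(-np.eye(m) @ F), -np.trace(F))

# Random unit-spectral-norm symmetric perturbations never escape [-Tr(F), Tr(F)]:
for _ in range(200):
    D = rng.standard_normal((m, m))
    D = (D + D.T) / 2
    D /= spec_norm(D)
    t = np.trace(D @ F)
    assert -np.trace(F) - 1e-9 <= t <= np.trace(F) + 1e-9
```

The bound holds because $\mathrm{Tr}(\Delta F) = \sum_l \lambda_l(F)\, u_l^T \Delta u_l$ in an eigenbasis of $F$, and each $u_l^T \Delta u_l$ lies in $[-1, 1]$ when $\|\Delta\|_{\mathrm{spec}} \leq 1$.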
Remark 3.1. (Comparison with the known result) It has been established (see, for example, [4, Chapter 8]) that the robust counterpart of the uncertain semi-definite linear programming problem with spectral norm uncertainty can be reformulated as a new semi-definite linear program, and so is also computationally tractable. Our approach provides an alternative way of identifying this tractable class of robust semi-definite linear programs via a dual approach.

Remark 3.2. (Other simple tractable cases) Consider the semi-definite linear programming model problem with affinely parameterized diagonal matrix data:

$(P_a)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

where, for each $i = 0, 1, \ldots, n$,

$A_i \in V_i = \Big\{ \mathrm{diag}\Big( a^{(0)}_{i1} + \sum_{j=1}^k u^{(j)}_{i1} a^{(j)}_{i1},\ \ldots,\ a^{(0)}_{im} + \sum_{j=1}^k u^{(j)}_{im} a^{(j)}_{im} \Big) : u_{il} = (u^{(1)}_{il}, \ldots, u^{(k)}_{il}) \in U_{il},\ l = 1, \ldots, m \Big\},$

with $a^{(j)}_{il} \in \mathbb{R}$ for each $l = 1, \ldots, m$ and $j = 1, \ldots, k$, and $U_{il}$, $l = 1, \ldots, m$, convex compact sets in $\mathbb{R}^k$. Here $\mathrm{diag}(d_1, \ldots, d_m)$ denotes a diagonal matrix with diagonal elements $d_1, \ldots, d_m$. The characteristic cone $D$ collapses to

$D = \bigcup_{u} \Big\{ \Big( \sum_{l=1}^m \lambda_l \big( a^{(0)}_{1l} + \sum_{j=1}^k u^{(j)}_{1l} a^{(j)}_{1l} \big),\ \ldots,\ \sum_{l=1}^m \lambda_l \big( a^{(0)}_{nl} + \sum_{j=1}^k u^{(j)}_{nl} a^{(j)}_{nl} \big),\ \sum_{l=1}^m \lambda_l \big( a^{(0)}_{0l} + \sum_{j=1}^k u^{(j)}_{0l} a^{(j)}_{0l} \big) + r \Big) : \lambda_l \geq 0,\ r \geq 0 \Big\}.$

Let $v_l = (u^{(1)}_{0l}, \ldots, u^{(k)}_{0l}, u^{(1)}_{1l}, \ldots, u^{(k)}_{1l}, \ldots, u^{(1)}_{nl}, \ldots, u^{(k)}_{nl})$ and

$g_l(x, v_l) = -\Big( a^{(0)}_{0l} + \sum_{j=1}^k u^{(j)}_{0l} a^{(j)}_{0l} \Big) - \sum_{i=1}^n x_i \Big( a^{(0)}_{il} + \sum_{j=1}^k u^{(j)}_{il} a^{(j)}_{il} \Big).$

Then direct verification gives us that

$D = \bigcup_{v_l \in \prod_{i=0}^n U_{il},\ \lambda_l \geq 0} \mathrm{epi}\Big( \sum_{l=1}^m \lambda_l g_l(\cdot, v_l) \Big)^*,$

where $\mathrm{epi} f$ is the epigraph of a function $f$, given by $\mathrm{epi} f = \{(x, s) : s \geq f(x)\}$, and $f^*$ denotes the usual convex conjugate of a convex function $f$. Clearly, $g_l(\cdot, v_l)$ is a linear function and $g_l(x, \cdot)$ is also linear. Thus [16, Proposition 2.3] implies that $\bigcup_{v_l \in \prod_{i=0}^n U_{il},\ \lambda_l \geq 0} \mathrm{epi}\big( \sum_{l=1}^m \lambda_l g_l(\cdot, v_l) \big)^*$ is convex, and so the characteristic cone $D$ is convex.
1) If, for each $i = 0, 1, \ldots, n$ and $l = 1, \ldots, m$, $U_{il} = \mathrm{co}\{z^{(1)}, \ldots, z^{(p)}\}$, the convex hull of finitely many given scenarios $z^{(t)} = (z^{(1t)}, \ldots, z^{(kt)}) \in \mathbb{R}^k$, $t = 1, \ldots, p$, $p \in \mathbb{N}$, then the characteristic cone $D$ is closed, and so strong duality between the robust counterpart and the optimistic dual always holds (see [16, Theorem 4.1]). Moreover, the
optimistic dual can be simplified (the dual variable $F$ may be taken diagonal, $F = \mathrm{diag}(\lambda_1, \ldots, \lambda_m)$ with $\lambda_l \geq 0$, since the data matrices are diagonal) as

$\max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}$
$= \max \Big\{ -\sum_{l=1}^m \lambda_l \Big( a^{(0)}_{0l} + \sum_{j=1}^k \Big( \sum_{t=1}^p \mu^{(t)}_{0l} z^{(jt)} \Big) a^{(j)}_{0l} \Big) : \sum_{l=1}^m \lambda_l \Big( a^{(0)}_{il} + \sum_{j=1}^k \Big( \sum_{t=1}^p \mu^{(t)}_{il} z^{(jt)} \Big) a^{(j)}_{il} \Big) = c_i,\ i = 1, \ldots, n,\ \lambda_l \geq 0,\ \sum_{t=1}^p \mu^{(t)}_{il} = 1,\ \mu^{(t)}_{il} \geq 0 \Big\}.$

Letting $\lambda^{(t)}_{il} = \lambda_l \mu^{(t)}_{il} \geq 0$, so that $\sum_{t=1}^p \lambda^{(t)}_{il} = \lambda_l$, $i = 0, 1, \ldots, n$, $l = 1, \ldots, m$, the optimistic dual can be further simplified as

$\max_{\lambda_l \geq 0,\ \lambda^{(t)}_{il} \geq 0} \Big\{ -\sum_{l=1}^m \Big( \lambda_l a^{(0)}_{0l} + \sum_{j=1}^k \sum_{t=1}^p \lambda^{(t)}_{0l} z^{(jt)} a^{(j)}_{0l} \Big) : \sum_{l=1}^m \Big( \lambda_l a^{(0)}_{il} + \sum_{j=1}^k \sum_{t=1}^p \lambda^{(t)}_{il} z^{(jt)} a^{(j)}_{il} \Big) = c_i,\ i = 1, \ldots, n,\ \sum_{t=1}^p \lambda^{(t)}_{il} = \lambda_l \Big\},$

which is a linear programming problem.

2) If, for each $i = 0, 1, \ldots, n$ and $l = 1, \ldots, m$, $U_{il} = \{z \in \mathbb{R}^k : \|z\| \leq 1\}$ and the robust Slater condition holds, then strong duality between the robust counterpart and the optimistic dual always holds by Theorem 2.3. Moreover, the optimistic dual can be simplified as

$\max_{A_i \in V_i,\, i \in I}\ \max_{F \in S^m} \{ -\mathrm{Tr}(A_0 F) : \mathrm{Tr}(A_i F) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \}$
$= \max_{u_{il} \in U_{il},\ \lambda_l \geq 0} \Big\{ -\sum_{l=1}^m \lambda_l \Big( a^{(0)}_{0l} + \sum_{j=1}^k u^{(j)}_{0l} a^{(j)}_{0l} \Big) : \sum_{l=1}^m \lambda_l \Big( a^{(0)}_{il} + \sum_{j=1}^k u^{(j)}_{il} a^{(j)}_{il} \Big) = c_i,\ i = 1, \ldots, n \Big\}.$

Letting $\lambda^{(j)}_{il} = \lambda_l u^{(j)}_{il}$, so that $\|(\lambda^{(1)}_{il}, \ldots, \lambda^{(k)}_{il})\| \leq \lambda_l$, the optimistic dual can be further simplified as

$\max_{\|(\lambda^{(1)}_{il}, \ldots, \lambda^{(k)}_{il})\| \leq \lambda_l} \Big\{ -\sum_{l=1}^m \Big( \lambda_l a^{(0)}_{0l} + \sum_{j=1}^k \lambda^{(j)}_{0l} a^{(j)}_{0l} \Big) : \sum_{l=1}^m \Big( \lambda_l a^{(0)}_{il} + \sum_{j=1}^k \lambda^{(j)}_{il} a^{(j)}_{il} \Big) = c_i,\ i = 1, \ldots, n \Big\},$

which is a second-order cone linear programming problem.

4 Conclusion and Further Research

In this paper we considered semi-definite linear programs subject to data uncertainty. We have presented a deterministic approach for studying such uncertain programs by
restricting the uncertain data to closed and convex uncertainty sets. We have established necessary as well as sufficient conditions for strong duality for uncertain SDPs by proving strong duality between the associated deterministic counterparts, namely, the robust counterpart and the optimistic counterpart.

Our study raises interesting questions for further research. For instance, tractability in robust semi-definite programming often relates to whether the robust counterpart of an uncertain semi-definite program is computationally tractable or not. From the dual point of view, on the other hand, the investigation of the tractability of the optimistic counterpart of the dual of an uncertain semi-definite program would be of great interest, particularly when strong duality is available. We have provided two classes of uncertain semi-definite linear programs for which strong duality holds under a robust Slater condition and the optimistic counterparts are computationally tractable programs. It would be useful to examine this dual tractability for other specially structured problems with broad classes of uncertainty sets. In particular, the duality approach presented in this paper could be applied to uncertain second-order cone programming problems. This would lead to identifying other broad classes of computationally tractable optimistic counterparts and will be presented elsewhere.

Appendix: Convexity of the Characteristic Cone

In this appendix, we provide a sufficient condition for the convexity of the characteristic cone. Consider the uncertain semi-definite programming model problem

$(P_a)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0 \},$

where, for each $i \in I$, the matrix $A_i$ is uncertain and the uncertain data matrix is affinely parameterized, i.e., $A_i$ belongs to the uncertainty set

$V_i = \{ A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i : (u^1_i, \ldots, u^k_i) \in U_i \},$

where $U_i$, $i = 0, 1, \ldots, n$, is a compact convex set in $\mathbb{R}^k$ and $A^{(j)}_i \in S^m$, $i = 0, 1, \ldots, n$, $j = 1, \ldots, k$.
The robust counterpart of this uncertain SDP problem is

$(RP_a)\quad \inf_{x \in \mathbb{R}^n} \{ c^T x : A_0 + \sum_{i=1}^n x_i A_i \succeq 0,\ \forall A_i \in \{ A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i : (u^1_i, \ldots, u^k_i) \in U_i \} \}.$

The optimistic counterpart of its dual problem is

$(ODP_a)\quad \max_{(u^1_i, \ldots, u^k_i) \in U_i,\, i \in I}\ \max_{F \in S^m} \Big\{ -\mathrm{Tr}\big( (A^{(0)}_0 + \sum_{j=1}^k u^j_0 A^{(j)}_0) F \big) : \mathrm{Tr}\big( (A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i) F \big) = c_i,\ i \in I \setminus \{0\},\ F \succeq 0 \Big\}.$
We show that, in this case, the robust characteristic cone $D$ is a convex cone.

Proposition 4.1. (Convexity of the Characteristic Cone: Sufficient Condition) For each $i \in I$, let $A_i \in V_i = \{ A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i : (u^1_i, \ldots, u^k_i) \in U_i \}$, where $U_i$, $i = 0, 1, \ldots, n$, is a compact convex set in $\mathbb{R}^k$, $A^{(j)}_0 \in S^m$ and $A^{(j)}_i \succeq 0$, $i = 1, \ldots, n$, $j = 1, \ldots, k$. Then the robust characteristic cone

$D := \bigcup_{A_i \in V_i,\, i \in I} \{ (\mathrm{Tr}(A_1 F), \ldots, \mathrm{Tr}(A_n F), \mathrm{Tr}(A_0 F) + r) : F \succeq 0,\ r \geq 0 \}$

is convex.

Proof. Let $x, y \in D$ and let $\lambda \in [0, 1]$. As $x, y \in D$, there exist $H_i, C_i \in V_i$, $i \in I$, $M, N \succeq 0$ and $r, s \geq 0$ such that

$x_i = \mathrm{Tr}(H_i M),\ i \in I \setminus \{0\}, \quad x_{n+1} = \mathrm{Tr}(H_0 M) + r,$

and

$y_i = \mathrm{Tr}(C_i N),\ i \in I \setminus \{0\}, \quad y_{n+1} = \mathrm{Tr}(C_0 N) + s.$

As $H_i, C_i \in V_i$, there exist $(u^1_i, \ldots, u^k_i) \in U_i$ and $(v^1_i, \ldots, v^k_i) \in U_i$ such that, for each $i \in I$,

$H_i = A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i \quad \text{and} \quad C_i = A^{(0)}_i + \sum_{j=1}^k v^j_i A^{(j)}_i.$

Now fix an $i \in I \setminus \{0\}$. We see that

$\lambda x_i + (1 - \lambda) y_i = \mathrm{Tr}(\lambda H_i M + (1 - \lambda) C_i N) = \mathrm{Tr}\Big( \lambda \big( A^{(0)}_i + \sum_{j=1}^k u^j_i A^{(j)}_i \big) M + (1 - \lambda) \big( A^{(0)}_i + \sum_{j=1}^k v^j_i A^{(j)}_i \big) N \Big) = \mathrm{Tr}\big( A^{(0)}_i (\lambda M + (1 - \lambda) N) \big) + \sum_{j=1}^k \Big( u^j_i \mathrm{Tr}\big( \lambda A^{(j)}_i M \big) + v^j_i \mathrm{Tr}\big( (1 - \lambda) A^{(j)}_i N \big) \Big).$

Define, for each $i \in I$ and $j = 1, \ldots, k$,

$w^j_i = \begin{cases} \dfrac{ u^j_i \mathrm{Tr}\big( \lambda A^{(j)}_i M \big) + v^j_i \mathrm{Tr}\big( (1 - \lambda) A^{(j)}_i N \big) }{ \mathrm{Tr}\big( (\lambda M + (1 - \lambda) N) A^{(j)}_i \big) }, & \text{if } \mathrm{Tr}\big( (\lambda M + (1 - \lambda) N) A^{(j)}_i \big) \neq 0, \\ u^j_i, & \text{if } \mathrm{Tr}\big( (\lambda M + (1 - \lambda) N) A^{(j)}_i \big) = 0. \end{cases}$

Clearly, $(w^1_i, \ldots, w^k_i) \in U_i$, $i \in I$. Moreover, for each $i \in I \setminus \{0\}$ and $j = 1, \ldots, k$,

$w^j_i \mathrm{Tr}\big( (\lambda M + (1 - \lambda) N) A^{(j)}_i \big) = u^j_i \mathrm{Tr}\big( \lambda A^{(j)}_i M \big) + v^j_i \mathrm{Tr}\big( (1 - \lambda) A^{(j)}_i N \big).$
This equality follows trivially if $\mathrm{Tr}\big((\lambda M + (1-\lambda)N) A_i^{(j)}\big) \neq 0$. On the other hand, if $\mathrm{Tr}\big((\lambda M + (1-\lambda)N) A_i^{(j)}\big) = 0$, then $\mathrm{Tr}\big(\lambda A_i^{(j)} M\big) = \mathrm{Tr}\big((1-\lambda) A_i^{(j)} N\big) = 0$, as $M$, $N$ and $A_i^{(j)}$, $i = 1, \ldots, n$, $j = 1, \ldots, k$, are all positive semi-definite matrices. Thus, the equality also follows in this case. So,

$$\lambda x_i + (1-\lambda) y_i = \mathrm{Tr}\big(A_i^{(0)}(\lambda M + (1-\lambda)N)\big) + \sum_{j=1}^k w_i^j \, \mathrm{Tr}\big((\lambda M + (1-\lambda)N) A_i^{(j)}\big) = \mathrm{Tr}\Big(\big(A_i^{(0)} + \sum_{j=1}^k w_i^j A_i^{(j)}\big)(\lambda M + (1-\lambda)N)\Big).$$

Similarly, we can also show that

$$\lambda x_{n+1} + (1-\lambda) y_{n+1} = \mathrm{Tr}\Big(\big(A_0^{(0)} + \sum_{j=1}^k w_0^j A_0^{(j)}\big)(\lambda M + (1-\lambda)N)\Big) + \lambda r + (1-\lambda)s.$$

Note that $(w_i^1, \ldots, w_i^k) \in U_i$, $i \in I$, $\lambda M + (1-\lambda)N \succeq 0$ and $\lambda r + (1-\lambda)s \geq 0$. We see that $\lambda x + (1-\lambda)y \in D$. So, $D$ is convex.

References

[1] A. Beck and A. Ben-Tal, Duality in robust optimization: primal worst equals dual best, Oper. Res. Lett., 37 (2009), 1-6.

[2] B. Recht, M. Fazel and P. A. Parrilo, Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization, SIAM Rev., 52 (2010), 471-501.

[3] A. Ben-Tal and A. Nemirovski, On approximate robust counterparts of uncertain semi-definite and conic quadratic programs, in System Modeling and Optimization, XX (Trier, 2001), 1-22, IFIP Int. Fed. Inf. Process., 130, Kluwer Acad. Publ., Boston, MA, 2003.

[4] A. Ben-Tal, L. El Ghaoui and A. Nemirovski, Robust Optimization, Princeton Series in Applied Mathematics, Princeton University Press, New Jersey, 2009.

[5] D. Bertsimas and D. Brown, Constructing uncertainty sets for robust linear optimization, Oper. Res., 57 (2009), 1483-1495.

[6] D. Bertsimas, D. Pachamanova and M. Sim, Robust linear optimization under general norms, Oper. Res. Lett., 32 (2004), 510-516.

[7] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, Cambridge, 2004.

[8] L. El Ghaoui, F. Oustry and H. Lebret, Robust solutions to uncertain semi-definite programs, SIAM J. Optim., 9 (1998), 33-52.
[9] D. Goldfarb and G. Iyengar, Robust convex quadratically constrained programs, Math. Program., Ser. B, 97 (2003), 495-515.

[10] M. A. Goberna, Linear semi-infinite optimization: recent advances, in V. Jeyakumar and A. M. Rubinov (eds.), Continuous Optimization, Springer, New York, 2005.

[11] M. A. Goberna and M. A. López, Linear Semi-Infinite Optimization, J. Wiley, Chichester, 1998.

[12] V. Jeyakumar, Constraint qualifications characterizing Lagrangian duality in convex optimization, J. Optim. Theory Appl., 136 (2008), 31-41.

[13] V. Jeyakumar, A note on strong duality in convex semi-definite optimization: necessary and sufficient conditions, Optim. Lett., 2 (2008), 15-25.

[14] V. Jeyakumar and G. M. Lee, Complete characterizations of stable Farkas lemma and cone-convex programming duality, Math. Program., 114 (2008), 335-347.

[15] V. Jeyakumar and G. Li, Characterizing robust set containments and solutions of uncertain linear programs without qualifications, Oper. Res. Lett., 38 (2010), 188-194.

[16] V. Jeyakumar and G. Li, Strong duality in robust convex programming: complete characterizations, SIAM J. Optim., 20 (2010), 3384-3407.

[17] V. Jeyakumar and H. Wolkowicz, Generalizations of Slater's constraint qualification for infinite convex programs, Math. Program., Ser. B, 57 (1992), 85-101.

[18] M. C. Pinar and R. H. Tutuncu, Robust profit opportunities in risky financial portfolios, Oper. Res. Lett., 33 (2005), 331-340.

[19] C. W. Scherer, Relaxations for robust linear matrix inequality problems with verifications for exactness, SIAM J. Matrix Anal. Appl., 27 (2005), 365-380.

[20] A. Shapiro, D. Dentcheva and A. Ruszczynski, Lectures on Stochastic Programming: Modeling and Theory, SIAM, Philadelphia, 2009.

[21] H. Wolkowicz, R. Saigal and L. Vandenberghe, Handbook of Semidefinite Programming: Theory, Algorithms, and Applications, Kluwer Academic Publishers, Boston, MA, 2000.