RLT-POS: Reformulation-Linearization Technique (RLT)-based Optimization Software for Polynomial Programming Problems


RLT-POS: Reformulation-Linearization Technique (RLT)-based Optimization Software for Polynomial Programming Problems
E. Dalkiran (Department of Industrial and Systems Engineering, Wayne State University)
H. D. Sherali (Grado Department of Industrial and Systems Engineering, Virginia Tech)
MINLP 2014
Acknowledgement: This research has been supported by the National Science Foundation under Grant No. CMMI.

Background: polynomial programming formulation (Problem PP)

PP: Minimize $\phi_0(x)$
subject to
  $\phi_r(x) \geq \beta_r$, $r = 1, \ldots, R_1$
  $\phi_r(x) = \beta_r$, $r = R_1 + 1, \ldots, R$
  $Ax = b$
  $x \in \Omega \equiv \{x : 0 \leq l_j \leq x_j \leq u_j < \infty, \; j \in N\}$,
where $\phi_r(x) \equiv \sum_{t \in T_r} \alpha_{rt} \big[ \prod_{j \in J_{rt}} x_j \big]$, for $r = 0, \ldots, R$.

Reformulation: Generate bound-factors and append bound-factor constraints.
  Bound-factors: $(x_j - l_j) \geq 0$ and $(u_j - x_j) \geq 0$, $j \in N$.
  Bound-factor constraints: $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big] \geq 0$, $\forall (J_1 \cup J_2) \in N^{\delta}$.

Linearization: Substitute a new RLT variable for each distinct monomial, as given by
  $X_J = \prod_{j \in J} x_j$, $\forall J \in \bigcup_{d=2}^{\delta} N^d$.
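
As a small illustration of the reformulation and linearization steps above (a sketch, not RLT-POS code), the snippet below expands one degree-three bound-factor product and applies the $[\cdot]_L$ operator, substituting a symbol $X_J$ for every monomial of degree two or more; the two-variable setup and the numeric bounds are assumptions made for the example, and sympy is used only for the polynomial expansion.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
l1, u1, l2, u2 = 0, 2, 1, 3          # assumed finite bounds 0 <= l_j <= x_j <= u_j

# Reformulation: one degree-3 bound-factor product, (x1 - l1)(u1 - x1)(x2 - l2) >= 0.
factor_product = (x1 - l1) * (u1 - x1) * (x2 - l2)

def linearize(poly_expr, variables):
    """Apply the RLT linearization [.]_L: replace each monomial of degree >= 2
    by a new variable X_J, where J lists the variable indices with multiplicity."""
    poly = sp.Poly(sp.expand(poly_expr), *variables)
    linear_expr = 0
    for exponents, coeff in poly.terms():
        if sum(exponents) <= 1:
            term = coeff * sp.Mul(*[v**e for v, e in zip(variables, exponents)])
        else:
            J = ''.join(str(i + 1) * e for i, e in enumerate(exponents))
            term = coeff * sp.Symbol('X' + J)     # e.g. x1**2 * x2 -> X112
        linear_expr += term
    return linear_expr

print(linearize(factor_product, (x1, x2)))        # a linear expression in x and X
```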

Background: the Reformulation-Linearization Technique (RLT)

RLT($\Omega$): Minimize $[\phi_0(x)]_L$  (2a)
subject to
  $[\phi_r(x)]_L \geq \beta_r$, $r = 1, \ldots, R_1$  (2b)
  $[\phi_r(x)]_L = \beta_r$, $r = R_1 + 1, \ldots, R$  (2c)
  $Ax = b$  (2d)
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in N^{\delta}$  (2e)
  $x \in \Omega \equiv \{x : 0 \leq l_j \leq x_j \leq u_j < \infty, \; j \in N\}$,  (2f)
where $X_J = \prod_{j \in J} x_j$, $\forall J \in \bigcup_{d=2}^{\delta} N^d$.  (2g)

Two questions:
1. Is there a strict subset of the fundamental bound-factor constraints for which the branch-and-bound algorithm described in Sherali and Tuncbilek [1992] would still converge to a global optimum?
2. Are there additional valid inequalities that would tighten the RLT relaxation without sacrificing computational effort?

Model enhancements:
1. The J-set of filtered bound-factor constraints,
2. Reduced RLT representations (RLT formulations in a reduced variable space),
3. ν-SDP cuts.

Model enhancements, constraint filtering routines: the J-set of bound-factor constraints

For a given index set J:
  $N_J \subseteq J$: the largest nonrepetitive set (the distinct indices in J),
  $d_J$: the cardinality of J.

Standard RLT constraints: $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in N_J^{d_J}$.

Proposition. The convergence result in Sherali and Tuncbilek [1992] holds true if the following RLT bound-factor constraints are appended to the relaxation for each J such that the monomial $\prod_{j \in J} x_j$ appears in Problem PP:
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) = J$.

Model enhancements, constraint filtering routines: the J-set versus the standard RLT

The J-set of bound-factor constraints for $x_1^2 x_2$ (six constraints):
  $[(x_1 - l_1)(x_1 - l_1)(x_2 - l_2)]_L \geq 0$, $[(x_1 - l_1)(u_1 - x_1)(x_2 - l_2)]_L \geq 0$, $[(u_1 - x_1)(u_1 - x_1)(x_2 - l_2)]_L \geq 0$,
  $[(x_1 - l_1)(x_1 - l_1)(u_2 - x_2)]_L \geq 0$, $[(x_1 - l_1)(u_1 - x_1)(u_2 - x_2)]_L \geq 0$, $[(u_1 - x_1)(u_1 - x_1)(u_2 - x_2)]_L \geq 0$.

The bound-factor constraints for $x_1^2 x_2$ with the standard RLT (twenty constraints): the six constraints above, together with
  $[(x_1 - l_1)(x_1 - l_1)(x_1 - l_1)]_L \geq 0$, $[(x_1 - l_1)(x_1 - l_1)(u_1 - x_1)]_L \geq 0$, $[(x_1 - l_1)(u_1 - x_1)(u_1 - x_1)]_L \geq 0$, $[(u_1 - x_1)(u_1 - x_1)(u_1 - x_1)]_L \geq 0$,
  $[(x_1 - l_1)(x_2 - l_2)(x_2 - l_2)]_L \geq 0$, $[(x_1 - l_1)(x_2 - l_2)(u_2 - x_2)]_L \geq 0$, $[(x_1 - l_1)(u_2 - x_2)(u_2 - x_2)]_L \geq 0$,
  $[(u_1 - x_1)(x_2 - l_2)(x_2 - l_2)]_L \geq 0$, $[(u_1 - x_1)(x_2 - l_2)(u_2 - x_2)]_L \geq 0$, $[(u_1 - x_1)(u_2 - x_2)(u_2 - x_2)]_L \geq 0$,
  $[(x_2 - l_2)(x_2 - l_2)(x_2 - l_2)]_L \geq 0$, $[(x_2 - l_2)(x_2 - l_2)(u_2 - x_2)]_L \geq 0$, $[(x_2 - l_2)(u_2 - x_2)(u_2 - x_2)]_L \geq 0$, $[(u_2 - x_2)(u_2 - x_2)(u_2 - x_2)]_L \geq 0$.
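
The counts on this slide can be reproduced by direct enumeration; the following sketch (illustrative Python, not part of RLT-POS) generates the 6 filtered J-set splits for $x_1^2 x_2$ and the 20 standard degree-$\delta$ bound-factor products over $N = \{1, 2\}$.

```python
from itertools import combinations_with_replacement, product

def j_set_factors(J):
    """All ways to split the index multiset J into lower-bound (J1) and upper-bound (J2) factors."""
    mult = {j: J.count(j) for j in sorted(set(J))}
    splits = []
    for choice in product(*[range(r + 1) for r in mult.values()]):
        J1 = {j: k for j, k in zip(mult, choice)}
        J2 = {j: mult[j] - k for j, k in zip(mult, choice)}
        splits.append((J1, J2))
    return splits

def standard_factors(N, delta):
    """All order-delta products of lower/upper bound factors over the variable set N."""
    factors = [('lower', j) for j in N] + [('upper', j) for j in N]
    return list(combinations_with_replacement(factors, delta))

J = (1, 1, 2)                                   # the monomial x1^2 * x2
print(len(j_set_factors(J)))                    # 6 J-set constraints, as on the slide
print(len(standard_factors((1, 2), 3)))         # 20 standard constraints for n = 2, delta = 3
```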

Model enhancements, constraint filtering routines: the number of RLT constraints and new RLT variables

For a monomial $\prod_{j \in J} x_j = \prod_{j \in N_J} x_j^{r_j}$:
  Number of bound-factor constraints: $\prod_{j \in N_J} (r_j + 1)$ for the J-set, versus $\binom{2n + \delta - 1}{\delta}$ for the $N^\delta$-set.
  Number of new RLT variables: $\prod_{j \in N_J} (r_j + 1) - (|N_J| + 1)$ for the J-set, versus $\binom{n + \delta}{\delta} - (n + 1)$ for the $N^\delta$-set.

Example: For a PP involving only $x_1^3 x_2 x_3^2$ and $x_4^2 x_5 x_6^3$ as nonlinear terms, the J-set and the $N^\delta$-set respectively generate in total:
  48 versus $\binom{17}{6} = 12{,}376$ bound-factor constraints,
  40 versus 917 new RLT variables.
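
A quick check of the two formulas against the stated example (a sketch assuming Python 3.8+ for math.comb and math.prod; the dictionary encoding of the monomials is illustrative):

```python
from math import comb, prod

monomials = [{1: 3, 2: 1, 3: 2}, {4: 2, 5: 1, 6: 3}]   # variable index -> exponent r_j
n, delta = 6, 6

# J-set counts per monomial: prod(r_j + 1) bound-factor constraints and
# prod(r_j + 1) - (|N_J| + 1) new RLT variables.
j_bf   = sum(prod(r + 1 for r in m.values()) for m in monomials)
j_vars = sum(prod(r + 1 for r in m.values()) - (len(m) + 1) for m in monomials)

# Standard N^delta counts: C(2n + delta - 1, delta) constraints and
# C(n + delta, delta) - (n + 1) new RLT variables.
std_bf   = comb(2 * n + delta - 1, delta)
std_vars = comb(n + delta, delta) - (n + 1)

print(j_bf, j_vars)       # 48 40
print(std_bf, std_vars)   # 12376 917
```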

Model enhancements: reduced RLT representations

Linear equality subsystem: $Ax = b$.
Constraint-based RLT restrictions: $[(Ax = b) \prod_{j \in J} x_j]_L$, yielding
  $A X_{(\cdot, J)} = b X_J$, $\forall J \in N^d$, $d = 1, \ldots, \delta - 1$.  (3)

Given a basis B of A:
  $Ax = b \;\Leftrightarrow\; B x_B + N x_N = b$,
  $A X_{(\cdot, J)} = b X_J \;\Leftrightarrow\; B X_{(B, J)} + N X_{(N, J)} = b X_J$, $\forall J \in N^d$, $d = 1, \ldots, \delta - 1$.

Proposition. Let the equality system $Ax = b$ be partitioned as $B x_B + N x_N = b$ for any basis B of A, and define
  $Z \equiv \big\{ (x, X) : Ax = b, \ (3), \ \text{and } X_J = \prod_{j \in J} x_j, \ \forall J \in J_N^d, \ d = 2, \ldots, \delta \big\}$,
where $J_N^d$ collects the order-$d$ index sets composed only of nonbasic variables. Then, for any $(x, X) \in Z$, we have $X_J = \prod_{j \in J} x_j$, $\forall J \in N^d$, $d = 2, \ldots, \delta$.
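
A minimal sketch of how the constraint-based RLT restrictions (3) can be assembled: each equality row $a_i \cdot x = b_i$ is multiplied by a monomial $\prod_{j \in J} x_j$ and linearized into $\sum_j a_{ij} X_{J \cup \{j\}} = b_i X_J$. The small A and b below are taken from the worked example later in the talk, and the tuple-keyed coefficient dictionaries are an assumption of this sketch rather than the RLT-POS data structures.

```python
from itertools import combinations_with_replacement

A = [[1.0, 0.0, 1.0, 1.0, 0.0],      # x1 + x3 + x4 = 3  (the example used later in the talk)
     [0.0, 1.0, 0.0, 0.0, 1.0]]      # x2 + x5 = 6
b = [3.0, 6.0]
n, delta = 5, 5

def rlt_equality_rows(A, b, n, delta):
    """Linearized rows A X_(.,J) = b X_J for all J in N^d, d = 1, ..., delta-1."""
    rows = []
    for d in range(1, delta):
        for J in combinations_with_replacement(range(1, n + 1), d):
            for a_i, b_i in zip(A, b):
                lhs = {tuple(sorted(J + (j,))): a_i[j - 1]
                       for j in range(1, n + 1) if a_i[j - 1] != 0.0}
                rows.append((lhs, {J: b_i}))     # left-hand coefficients and b_i * X_J
    return rows

rows = rlt_equality_rows(A, b, n, delta)
print(len(rows))      # two linearized rows for every J of degree 1, ..., delta-1
print(rows[0])        # X_{1,1} + X_{1,3} + X_{1,4} = 3 X_{1}, stored as coefficient dicts
```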

Model enhancements, reduced RLT representations: equivalent formulations

PP1:
  $B X_{(B, J)} + N X_{(N, J)} = b X_J$, $\forall J \in N^d$, $d = 1, \ldots, \delta - 1$
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in N^\delta$
  $l \leq x \leq u$
  $X_J = \prod_{j \in J} x_j$, $\forall J \in N^d$, $d = 2, \ldots, \delta$.

PP2:
  $B X_{(B, J)} + N X_{(N, J)} = b X_J$, $\forall J \in N^d$, $d = 1, \ldots, \delta - 1$
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in J_N^\delta$
  $l \leq x \leq u$, and $\prod_{j \in J} l_j \leq X_J \leq \prod_{j \in J} u_j$, $\forall J \in N^d$, $d = 2, \ldots, \delta$, $|J \cap B| \geq 1$
  $X_J = \prod_{j \in J} x_j$, $\forall J \in J_N^d$, $d = 2, \ldots, \delta$.

Enhancing algorithm RLT(PP2): Identify the key bound-factor constraints (those with positive associated dual variables) at the root node and append them within Problem PP2.

Model enhancements, reduced RLT representations: RLT(PP1) in $R^{n-m}$ and the Hybrid algorithm

RLT(PP1) in $R^{n-m}$: Eliminate the basic variables $x_B$ for the basis B via the substitution $x_B = B^{-1}(b - N x_N)$, and then implement the regular RLT process in the space of the $(n - m)$ nonbasic variables.

RLT_SDP(PP2) vs. RLT_SDP(PP1) in $R^{n-m}$: the trade-off involves
  the size of the LP relaxations, and
  the quality of the lower bounds at the root node.

RLT_SDP(Hybrid) combines
  the swiftness of RLT_SDP(PP1) in $R^{n-m}$ with
  the robustness of RLT_SDP(PP2).

Compute $\mu = \dfrac{GAP_1 - GAP_2}{N_1 - N_2}$. If $\mu < 1$, implement RLT_SDP(PP2); else, implement RLT_SDP(PP1) in $R^{n-m}$.

Model enhancements: coordination between constraint filtering and reduced basis techniques

PP: Minimize $x_1 x_2 x_3 x_5^2$
subject to
  $x_1 + x_3 + x_4 = 3$
  $x_2 + x_5 = 6$
  $x_3 x_5 \geq x_4^2$
  $l_i \leq x_i \leq u_i$, $i = 1, 2, 3, 4, 5$.

With the basic variables $x_1$ and $x_2$, the reduced RLT identities can be generated as
  $X_{12355} + X_{13555} - 6 X_{1355} = 0$
  $X_{13555} + X_{33555} + X_{34555} - 3 X_{3555} = 0$
  $X_{1355} + X_{3355} + X_{3455} - 3 X_{355} = 0$,
or as
  $X_{12355} + X_{23355} + X_{23455} - 3 X_{2355} = 0$
  $X_{23355} + X_{33555} - 6 X_{3355} = 0$
  $X_{23455} + X_{34555} - 6 X_{3455} = 0$
  $X_{2355} + X_{3555} - 6 X_{355} = 0$,
depending on which basic variable is substituted out first.
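
The first set of identities can be verified symbolically; the sketch below (assuming sympy, purely illustrative) substitutes $x_1 = 3 - x_3 - x_4$ and $x_2 = 6 - x_5$ and checks that each identity vanishes when every $X_J$ is replaced by its defining monomial.

```python
import sympy as sp

x1, x2, x3, x4, x5 = sp.symbols('x1:6')
sub = {x1: 3 - x3 - x4, x2: 6 - x5}                 # basic variables in terms of nonbasics
x = {1: x1, 2: x2, 3: x3, 4: x4, 5: x5}

def X(*J):
    """X_J interpreted as the underlying monomial prod_{j in J} x_j."""
    return sp.Mul(*[x[j] for j in J])

identities = [
    X(1, 2, 3, 5, 5) + X(1, 3, 5, 5, 5) - 6 * X(1, 3, 5, 5),
    X(1, 3, 5, 5, 5) + X(3, 3, 5, 5, 5) + X(3, 4, 5, 5, 5) - 3 * X(3, 5, 5, 5),
    X(1, 3, 5, 5) + X(3, 3, 5, 5) + X(3, 4, 5, 5) - 3 * X(3, 5, 5),
]
print([sp.expand(e.subs(sub)) for e in identities])  # [0, 0, 0]
```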

Model enhancements: coordination between constraint filtering and reduced basis techniques

Reduced RLT Routine for Sparse Problems:

Initialization: Define $K \equiv \{J : X_J \text{ appears in (2a)-(2c) and } |J \cap B| \geq 1\}$, $\bar{K} = \emptyset$, and $L \equiv \{J : X_J \text{ appears within (2a)-(2c) and } J \cap B = \emptyset\}$.

Step 1: If $K = \emptyset$, go to Step 4. Else, select an index set $J \in K$ that involves the maximum number of basic variables, delete it from the list $K$, and add it to the list $\bar{K}$.

Step 2: Let $B_j \in J$ be a randomly selected basic variable index. Multiply the representation of the basic variable $x_{B_j}$ in terms of the nonbasic variables by $\prod_{j \in J \setminus \{B_j\}} x_j$, and linearize and append this to the relaxation.

Step 3: If $J \setminus \{B_j\}$ involves any basic variable, the multiplication at Step 2 generates monomials involving basic variables; include these monomials within the list $K$. Otherwise, if $J \setminus \{B_j\}$ does not involve any basic variable, include the resulting monomials within the set $L$. Continue with Step 1.

Step 4: Apply the J-set Routine to the set $L$ in order to generate the proposed set of filtered bound-factor restrictions for reformulating the model.
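
A compact sketch of this routine on the example above, assuming the basis $B = \{x_1, x_2\}$ so that $x_1 = 3 - x_3 - x_4$ and $x_2 = 6 - x_5$; monomials are stored as sorted index tuples and the representation dictionaries map monomials to coefficients. This is an illustrative rendering in Python, not the RLT-POS implementation.

```python
import random

random.seed(0)
B = {1, 2}                                          # basic variable indices
rep = {1: {(): 3.0, (3,): -1.0, (4,): -1.0},        # x1 = 3 - x3 - x4
       2: {(): 6.0, (5,): -1.0}}                    # x2 = 6 - x5

def merge(J, K):
    """Monomial product as a sorted index tuple."""
    return tuple(sorted(J + K))

# Initialization: monomials of PP that involve (K) or avoid (L) basic variables.
monomials = [(1, 2, 3, 5, 5), (3, 5), (4, 4)]
K = [J for J in monomials if set(J) & B]
K_bar, L = [], [J for J in monomials if not set(J) & B]
appended = []                                       # linearized identities, as coefficient dicts

while K:                                            # Steps 1-3
    J = max(K, key=lambda J: sum(j in B for j in J))
    K.remove(J); K_bar.append(J)
    Bj = random.choice([j for j in J if j in B])    # Step 2: pick a basic index in J
    rest = list(J); rest.remove(Bj); rest = tuple(rest)
    row = {merge(rest, mono): coef for mono, coef in rep[Bj].items()}
    row[J] = -1.0                                   # x_{Bj} * prod(rest) - X_J = 0, linearized
    appended.append(row)
    target = K if set(rest) & B else L              # Step 3: route the new monomials
    for mono in row:
        if mono != J and len(mono) >= 2 and mono not in K + K_bar + L:
            target.append(mono)

print(appended)                                     # identities appended to the relaxation
print(L)                                            # Step 4: feed L to the J-set routine
```

Applying the J-set Routine of Step 4 to the resulting list L then produces the filtered bound-factor restrictions; on this example the maximal index sets in L are {{4, 4}, {3, 3, 5, 5, 5}, {3, 4, 5, 5, 5}}, matching the set L shown on the next slide.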

Model enhancements: coordination between constraint filtering and reduced basis techniques (continued)

For the example PP above (minimize $x_1 x_2 x_3 x_5^2$ subject to $x_1 + x_3 + x_4 = 3$, $x_2 + x_5 = 6$, $x_3 x_5 \geq x_4^2$, $l_i \leq x_i \leq u_i$, $i = 1, \ldots, 5$), the routine yields:

J-RRLT: Minimize $X_{12355}$
subject to
  $x_1 + x_3 + x_4 = 3$
  $X_{13555} + X_{33555} + X_{34555} - 3 X_{3555} = 0$
  $X_{1355} + X_{3355} + X_{3455} - 3 X_{355} = 0$
  $x_2 + x_5 = 6$
  $X_{12355} + X_{13555} - 6 X_{1355} = 0$
  $X_{35} \geq X_{44}$
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in L$
  $l_i \leq x_i \leq u_i$, $i = 1, 2, 3, 4, 5$,
where $L = \{\{4, 4\}, \{3, 3, 5, 5, 5\}, \{3, 4, 5, 5, 5\}\}$.

Model enhancements: coordination between constraint filtering and reduced basis techniques (continued)

J-set in $R^{n-m}$ (after eliminating $x_1 = 3 - x_3 - x_4$ and $x_2 = 6 - x_5$):

Minimize $18 X_{355} - 3 X_{3555} - 6 X_{3355} + X_{33555} - 6 X_{3455} + X_{34555}$
subject to
  $X_{35} \geq X_{44}$
  $l_1 \leq 3 - x_3 - x_4 \leq u_1$
  $l_2 \leq 6 - x_5 \leq u_2$
  $\big[ \prod_{j \in J_1} (x_j - l_j) \prod_{j \in J_2} (u_j - x_j) \big]_L \geq 0$, $\forall (J_1 \cup J_2) \in L$
  $l_i \leq x_i \leq u_i$, $i = 3, 4, 5$.

Table: Relaxation sizes and optimality gaps (number of equalities, number of bound-factor constraints, and % optimality gap) for RLT, RLT-E, RRLT, J-set, J-RRLT, and J-NB.

Model enhancements: coordination between constraint filtering and reduced basis techniques

Table: The number of problems solved within the minimum CPU time among the J-set, J-RRLT+, and J-NB (columns: Degree, J-set, J-RRLT+, J-NB, J-Hybrid (Offline), Total).

Table: Average CPU time (in seconds) with the reduced basis techniques for sparse problems (columns: Degree, J-set, J-RRLT+, J-NB, J-Hybrid (Offline), Minimum, J-Hybrid).

35 ν-sdp cuts Model enhancements Semidefinite programming (SDP)-induced cuts (ν-sdp cuts) [xx T ] is symmetric and positive semidefinite M 0 = [xx T ] L 0. Dalkiran, Sherali (WSU, VT) RLT-based Optimization Software 15 / 25

36 ν-sdp cuts Model enhancements Semidefinite programming (SDP)-induced cuts (ν-sdp cuts) [xx T ] is symmetric and positive semidefinite M 0 = [xx T ] L 0. A stronger [ implication ] in this same vein is: [ 1 x (1) =, and defining the matrix M x 1 [x (1) x(1) T 1 x ] T L = x M 0 ] 0. Dalkiran, Sherali (WSU, VT) RLT-based Optimization Software 15 / 25

37 ν-sdp cuts Model enhancements Semidefinite programming (SDP)-induced cuts (ν-sdp cuts) [xx T ] is symmetric and positive semidefinite M 0 = [xx T ] L 0. A stronger [ implication ] in this same vein is: [ 1 x (1) =, and defining the matrix M x 1 [x (1) x(1) T 1 x ] T L = x M 0 ] 0. Two main approaches: 1 SDP relaxations, 2 SDP-induced valid inequalities. M = [νν T ] 0 α T Mα = [(α T ν) 2 ] L 0, α R n (orr n+1 ), α = 1. Dalkiran, Sherali (WSU, VT) RLT-based Optimization Software 15 / 25

38 ν-sdp cuts Model enhancements Semidefinite programming (SDP)-induced cuts (ν-sdp cuts) [xx T ] is symmetric and positive semidefinite M 0 = [xx T ] L 0. A stronger [ implication ] in this same vein is: [ 1 x (1) =, and defining the matrix M x 1 [x (1) x(1) T 1 x ] T L = x M 0 ] 0. Two main approaches: 1 SDP relaxations, 2 SDP-induced valid inequalities. M = [νν T ] 0 α T Mα = [(α T ν) 2 ] L 0, α R n (orr n+1 ), α = 1. 1 Let ( x, X) be a solution to the RLT relaxation. 2 Check if M 0, where M evaluates M at the solution ( x, X). Dalkiran, Sherali (WSU, VT) RLT-based Optimization Software 15 / 25

39 ν-sdp cuts Model enhancements Semidefinite programming (SDP)-induced cuts (ν-sdp cuts) [xx T ] is symmetric and positive semidefinite M 0 = [xx T ] L 0. A stronger [ implication ] in this same vein is: [ 1 x (1) =, and defining the matrix M x 1 [x (1) x(1) T 1 x ] T L = x M 0 ] 0. Two main approaches: 1 SDP relaxations, 2 SDP-induced valid inequalities. M = [νν T ] 0 α T Mα = [(α T ν) 2 ] L 0, α R n (orr n+1 ), α = 1. 1 Let ( x, X) be a solution to the RLT relaxation. 2 Check if M 0, where M evaluates M at the solution ( x, X). If not, we have an ᾱ R n+1 such that ᾱ T Mᾱ < 0. Append the SDP cut ᾱ T Mᾱ = [(ᾱ T ν) 2 ] L 0, and go to Step 1. Dalkiran, Sherali (WSU, VT) RLT-based Optimization Software 15 / 25

Model enhancements, ν-SDP cuts: ν-vectors

$\nu^{(1)} \equiv \big[ 1, \ \{x_j, j \in N\}, \ \{\text{all quadratic monomials using } x_j, j \in N\}, \ \ldots, \ \{\text{all monomials of order } \lfloor \delta/2 \rfloor \text{ using } x_j, j \in N\} \big]^T \in R^{\binom{n + \lfloor \delta/2 \rfloor}{\lfloor \delta/2 \rfloor}}$.

$\nu^{(2)} \equiv \big[ 1, \ \{x_j, j \in N_{J^*}\}, \ \{\text{all quadratic monomials using } x_j, j \in N_{J^*}\}, \ \ldots, \ \{\text{all monomials of order } \lfloor \delta/2 \rfloor \text{ using } x_j, j \in N_{J^*}\} \big]^T$,
where
$J^* \in \arg\max_{J \subseteq N} \big\{ \alpha_{rt} \big[ X_{J_{rt}} - \prod_{j \in J_{rt}} x_j \big] : \ r \in \{0, 1, \ldots, R\}, \ J_{rt} = J \text{ for some } t \in T_r \big\}$.

For a polynomial constraint $\phi_r(x) \geq \beta_r$ of order $\delta_r$, define $\Delta_r \equiv \lfloor (\delta - \delta_r)/2 \rfloor$. If $\Delta_r \geq 1$, let
$\nu^{(3)} = [1, \ \text{all monomials of order } \Delta_r \text{ using } x_j, j \in N]^T$.
Then, we impose the following: $\big\{ [\phi_r(x) - \beta_r] \, \nu^{(3)} (\nu^{(3)})^T \big\}_L \succeq 0$.
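
A small sketch of how a ν-vector can be indexed (illustrative only; the choice n = 3, δ = 4 is an assumption): list the constant, the variables, and all monomials up to order ⌊δ/2⌋, and check that the length equals the stated dimension $\binom{n + \lfloor \delta/2 \rfloor}{\lfloor \delta/2 \rfloor}$.

```python
from itertools import combinations_with_replacement
from math import comb

def nu_vector(n, max_order):
    """Monomial index tuples of order 0, ..., max_order over variables 1..n."""
    nu = [()]                                    # the leading constant 1
    for d in range(1, max_order + 1):
        nu += list(combinations_with_replacement(range(1, n + 1), d))
    return nu

n, delta = 3, 4                                  # assumed sizes for the illustration
nu = nu_vector(n, delta // 2)
print(nu)                                        # [(), (1,), (2,), (3,), (1, 1), (1, 2), ...]
print(len(nu), comb(n + delta // 2, delta // 2)) # both 10: the stated dimension
```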

Model enhancements, ν-SDP cuts: ν-SDP cut inheritance

Figure: ν-SDP cut inheritance scheme. Each node maintains its own ν-SDP cuts (ν-SDP self) and those inherited from its parent (ν-SDP inherit); upon branching, the parent's cuts are passed to the left and right children's inherited pools ("Copy all" / "Filter inactive cuts").

Model enhancements, ν-SDP cuts: cut generation vs. branching

Figure: deciding between SDP cut generation and branching along the branch-and-bound tree (nodes $i, i+1, \ldots, i+4$), using the per-node branching gap improvement
$GAP^{i}_{branching} = \dfrac{GAP_{left} - GAP_{parent}}{\max(1, |UB|)}$,
and generating SDP cuts at node $i+3$ when $\mathrm{Expected}(GAP^{i+3}_{SDP}) > GAP^{i+3}_{branching} / 2$.

Model enhancements, ν-SDP cuts: performance of the SDP cut generation routines

Table: Performances of the SDP cut generation routines, reporting the average CPU time (in seconds) and the number of unsolved problems, by degree, for No SDP and Routines 1-4.

Routine 1: Generate SDP cuts and re-optimize the relaxation if they perform well; else, generate and store SDP cuts for inheritance, if they perform well; else, do not generate.
Routine 2: Generate and store SDP cuts for inheritance, if they perform well; else, do not generate.
Routine 3: Generate SDP cuts and re-optimize the relaxation.
Routine 4: Generate and store SDP cuts for inheritance.

Computational results: problems without equality constraints

Figure: Performance of the RLT algorithms for solving polynomial problems without equality constraints (CPU time in seconds versus problem density), comparing RLT and the J-set on degree-two, degree-four, and degree-six instances.

Computational results: problems with equality constraints

Figure: Performance of the RLT algorithms for solving quadratic and cubic problems with equality constraints (in CPU seconds), comparing RLT-E and the proposed J-Hybrid on degree-two and degree-three instances.

Computational results: problems with equality constraints

Figure: Performance of the RLT Hybrid algorithms for solving degree-four, -five, -six, and -seven problems with equality constraints (in CPU seconds), comparing RLT-Hybrid and the proposed J-Hybrid (degree-four and degree-six panels shown).

Computational results: problems with equality constraints, RLT-POS vs. BARON

Figure: CPU-time comparison of RLT-POS and BARON on degree-two, degree-four, and degree-six instances.

Conclusions and future research directions

Conclusions:
  Coordination between constraint filtering and reduced basis techniques.
  SDP cut generation routine for sparse problems.
  The J-Hybrid algorithm.
  RLT-based open-source optimization software.

Future research directions:
  Nonlinear equality constraints.
  Tightening the relaxation in the reduced subspace.
  Stability of the J-set relaxations: the barrier and dual optimizers of CPLEX.
  Factorable programming problems and nonlinear integer programming problems.

Thank you!
