Properties of a Simple Variant of Klee-Minty's LP and Their Proof

Tomonari Kitahara and Shinji Mizuno

December 28, 2010

Abstract

Kitahara and Mizuno (2010) present a simple variant of Klee-Minty's LP which has some nice properties. We explain and prove these properties in this paper.

Keywords: Simplex method, Linear programming, Basic feasible solutions, Klee-Minty's LP

1 Introduction

Kitahara and Mizuno [2] apply Ye's analysis [5] of the Markov decision problem to general LPs (linear programming problems), and they obtain two upper bounds for the number of different BFSs (basic feasible solutions) generated by Dantzig's simplex method [1] for the standard form of LP

min c^T x, subject to Ax = b, x ≥ 0,   (1)

where the number of equality constraints is m, the number of variables is n, A ∈ R^{m×n}, b ∈ R^m, c ∈ R^n, and x ∈ R^n. The upper bounds they obtain are

m ⌈ (γ/δ) log( (c^T x^0 - z*) / (c^T x̄ - z*) ) ⌉   (2)

Graduate School of Decision Science and Technology, Tokyo Institute of Technology, 2-12-1-W9-62, Ookayama, Meguro-ku, Tokyo, 152-8552, Japan. Tel.: +81-3-5734-2896 Fax: +81-3-5734-2947 E-mail: kitahara.t.ab@m.titech.ac.jp

Graduate School of Decision Science and Technology, Tokyo Institute of Technology, 2-12-1-W9-58, Ookayama, Meguro-ku, Tokyo, 152-8552, Japan. Tel.: +81-3-5734-2816 Fax: +81-3-5734-2947 E-mail: mizuno.s.ab@m.titech.ac.jp
and

n ⌈ m (γ/δ) log(m γ/δ) ⌉,   (3)

where x^0 is an initial BFS, x̄ is a second optimal BFS, z* is the optimal value, δ and γ are the minimum and the maximum values of all the positive elements of primal BFSs, respectively, and ⌈a⌉ denotes the smallest integer bigger than a ∈ R. The size of the bounds highly depends on the ratio γ/δ. A question one may have is how tight the bounds are.

Kitahara and Mizuno [3] partially answer this question. They present a simple variant of Klee-Minty's LP [4], and they show that the number of iterations of Dantzig's simplex method for solving the variant is exactly γ/δ. So it is impossible to obtain an upper bound smaller than γ/δ. In this paper, we explain and prove the properties of the variant given in [3] more precisely.

2 A simple variant of Klee-Minty's LP

Klee-Minty's LP [4] is

max Σ_{i=1}^m 10^{m-i} x_i,
subject to x_1 ≤ 1,
2 Σ_{i=1}^{k-1} 10^{k-i} x_i + x_k ≤ 100^{k-1} for k = 2, 3, ..., m,
x ≥ 0,

where x = (x_1, x_2, ..., x_m)^T is a variable vector. Kitahara and Mizuno [3] present a simple variant of Klee-Minty's LP, which is represented as

max Σ_{i=1}^m x_i,
subject to x_1 ≤ 1,
2 Σ_{i=1}^{k-1} x_i + x_k ≤ 2^k - 1 for k = 2, 3, ..., m,
x ≥ 0.

The standard form using a vector y = (y_1, y_2, ..., y_m)^T of slack variables is written as

max Σ_{i=1}^m x_i,
subject to x_1 + y_1 = 1,
2 Σ_{i=1}^{k-1} x_i + x_k + y_k = 2^k - 1 for k = 2, 3, ..., m,   (4)
x ≥ 0, y ≥ 0.

The number of equality constraints of this problem is m and the number of variables is n = 2m.
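The constraint data of the variant follows a simple pattern: coefficient 2 strictly below the diagonal, 1 on it, and right-hand sides 2^k - 1. As a minimal sketch of generating it (the function name `variant_constraints` is ours, not from the paper):

```python
import numpy as np

def variant_constraints(m):
    """Inequality form of the variant: x_1 <= 1 and
    2 * sum_{i<k} x_i + x_k <= 2^k - 1 for k = 2, ..., m."""
    # 2's strictly below the diagonal, 1's on the diagonal
    A = np.tril(np.full((m, m), 2.0), -1) + np.eye(m)
    b = np.array([2.0 ** k - 1.0 for k in range((1), m + 1)])
    return A, b

A, b = variant_constraints(3)
print(A)  # rows (1,0,0), (2,1,0), (2,2,1)
print(b)  # right-hand sides 1, 3, 7
```

Appending the m-by-m identity to A (for the slacks y) and zeros to the objective vector gives the standard form (4) with n = 2m variables.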
Lemma 2.1 The variant (4) has the following properties:

1. for each k ∈ {1, 2, ..., m}, at any BFS (basic feasible solution), exactly one of x_k and y_k is a basic variable,
2. the problem has 2^m BFSs,
3. each component of any BFS is an integer,
4. the problem is nondegenerate,
5. the optimal BFS is

x* = (0, 0, ..., 0, 2^m - 1)^T, y* = (1, 2^2 - 1, ..., 2^{m-1} - 1, 0)^T,

and the optimal value is 2^m - 1,

6. we have that δ = 1 and γ = 2^m - 1, where δ and γ are the minimum and the maximum values of all the positive elements of primal BFSs, respectively.

Proof. At first, we easily get from the equality constraints of (4) that

x_k ≤ 2^k - 1 and y_k ≤ 2^k - 1   (5)

for any feasible solution (x, y) and any k ∈ {1, 2, ..., m}. Next we will show that

x_k + y_k ≥ 1.   (6)

From the first equality constraint in (4), we have (6) for k = 1. When k ≥ 2, we have from the (k-1)-th and k-th constraints that

2 Σ_{i=1}^{k-2} x_i + x_{k-1} + y_{k-1} = 2^{k-1} - 1,
2 Σ_{i=1}^{k-1} x_i + x_k + y_k = 2^k - 1.

Subtracting the first equality from the second equality, we get

x_k + y_k + x_{k-1} - y_{k-1} = 2^{k-1}.   (7)

Hence we obtain (6) from (5), (7), and y_{k-1} ≥ 0. Let (x, y) be any BFS of (4). From (6), either x_k or y_k must be a basic variable for any k ∈ {1, 2, ..., m}. Since the number of basic variables of any BFS is m, exactly one of x_k and y_k is a basic variable.
If we choose one of x_k and y_k as a basic variable for each k ∈ {1, 2, ..., m}, a basic solution (x', y') is determined. The value of x'_1 or y'_1 is 1 from the equality constraint x_1 + y_1 = 1, and the values of x'_k or y'_k for k = 2, 3, ..., m are easily computed from (7) in this order. Then the value of each basic variable is a positive integer from (6) and (7). So the basic solution (x', y') is feasible and nondegenerate. The number of such basic feasible solutions is obviously 2^m. We get from the last equality constraint of (4) that

Σ_{i=1}^m x_i ≤ 2 Σ_{i=1}^{m-1} x_i + x_m + y_m = 2^m - 1.

Hence the objective function value is at most 2^m - 1. When y_i (i ∈ {1, 2, ..., m-1}) and x_m are basic variables, the BFS is computed as in property 5 of the lemma. Since its objective function value is 2^m - 1, it is an optimal solution. The last property 6 in the lemma is obvious from the discussion above.

3 Dantzig's simplex method for the variant

Suppose that the initial BFS is

x^0 = 0, y^0 = (1, 2^2 - 1, ..., 2^m - 1)^T,

and we generate a sequence of BFSs by Dantzig's simplex method [1], in which a nonbasic variable whose reduced cost is maximum (in the case of a maximization problem) is chosen as an entering variable. If two or more nonbasic variables have the maximum reduced cost, then the variable (either x_i or y_i) with the minimum index among them is chosen as the entering variable.

At first, we explain the case of m = 1. Since x_1 = 0, y_1 = 1 at the initial BFS, y_1 is the basic variable and the dictionary for the initial BFS is

z = x_1,
y_1 = 1 - x_1,   (8)

where z represents the objective function. Then Dantzig's simplex method updates the dictionaries as follows: x_1 is chosen as an entering variable from (8), and the second dictionary is calculated as

z = 1 - y_1,
x_1 = 1 - y_1.
This dictionary is optimal. We get an optimal solution x_1 = 1, y_1 = 0 and an optimal value 1.

Then we explain the case of m = 2. Since x^0 = 0, y^0 = (1, 3)^T at the initial BFS, y_1 and y_2 are the basic variables and the dictionary for the initial BFS is

z = x_1 + x_2,
y_1 = 1 - x_1,   (9)
y_2 = 3 - 2x_1 - x_2,

where z represents the objective function. Then Dantzig's simplex method updates the dictionary as follows: x_1 is chosen as an entering variable from (9), and the second dictionary is calculated as

z = 1 - y_1 + x_2,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2.

The third dictionary is

z = 2 + y_1 - y_2,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2.

The 4-th dictionary is

z = 3 - x_1 - y_2,
y_1 = 1 - x_1,   (10)
x_2 = 3 - 2x_1 - y_2.

This dictionary is optimal. We get an optimal solution (x_1, x_2) = (0, 3), (y_1, y_2) = (1, 0) and an optimal value 2^2 - 1 = 3. The x-parts of the BFSs generated are

v^0 = (0, 0)^T, v^1 = (1, 0)^T, v^2 = (1, 1)^T, v^3 = (0, 3)^T,

which are shown in Figure 1. We observe that

- the number of iterations is 3 = 2^2 - 1,
- any reduced cost of every dictionary is 1 or -1,
- the objective function value increases by 1 at each iteration.
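The four vertices v^0, ..., v^3 are exactly the x-parts of the 2^2 BFSs of (4). Using property 1 of Lemma 2.1 and the recursion (7), all 2^m BFSs can be enumerated by choosing, for each k, which of x_k and y_k is basic; a small sketch (the function name is ours, not from the paper):

```python
from itertools import product

def enumerate_bfs(m):
    """All 2^m BFSs of the variant (4): for each k, pick which of x_k, y_k
    is basic; the nonbasic one is 0 and x_k + y_k follows from (7)."""
    sols = []
    for choice in product("xy", repeat=m):
        x, y = [0] * m, [0] * m
        for k in range(m):
            # x_1 + y_1 = 1; for k >= 2, x_k + y_k = 2^(k-1) - x_(k-1) + y_(k-1)
            s = 1 if k == 0 else 2 ** k - x[k - 1] + y[k - 1]
            if choice[k] == "x":
                x[k] = s
            else:
                y[k] = s
        sols.append((tuple(x), tuple(y)))
    return sols
```

For m = 2 the x-parts are (0, 0), (1, 0), (1, 1) and (0, 3), i.e. v^0, ..., v^3; every component is a nonnegative integer and the largest positive entry over all BFSs is 2^m - 1 = γ, in line with Lemma 2.1.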
Figure 1: Vertices generated by Dantzig's simplex method when m = 2

Now we consider the following variant of Klee-Minty's LP:

max (2^m - 1) - Σ_{i=1}^m x_i,
subject to x_1 ≤ 1,
2 Σ_{i=1}^{k-1} x_i + x_k ≤ 2^k - 1 for k = 2, 3, ..., m,
x ≥ 0.

The standard form using a vector y = (y_1, y_2, ..., y_m)^T of slack variables is written as

max (2^m - 1) - Σ_{i=1}^m x_i,
subject to x_1 + y_1 = 1,
2 Σ_{i=1}^{k-1} x_i + x_k + y_k = 2^k - 1 for k = 2, 3, ..., m,   (11)
x ≥ 0, y ≥ 0.

It is easy to see that this problem (11) also has the properties in Lemma 2.1 except for the optimal solution. The optimal solution of (11) is

x* = 0, y* = (1, 2^2 - 1, ..., 2^m - 1)^T,

and the optimal value is 2^m - 1. Suppose that the initial BFS is

x^0 = (0, 0, ..., 0, 2^m - 1)^T, y^0 = (1, 2^2 - 1, ..., 2^{m-1} - 1, 0)^T,

and we generate a sequence of BFSs by Dantzig's simplex method for the problem (11). When m = 1, Dantzig's simplex method finds the optimal solution in one iteration just as in the case of the problem (4). Next
we explain the case of m = 2. Since x^0 = (0, 3)^T, y^0 = (1, 0)^T at the initial BFS, y_1 and x_2 are the basic variables and the dictionary for the initial BFS is

w = x_1 + y_2,
y_1 = 1 - x_1,   (12)
x_2 = 3 - 2x_1 - y_2,

where w represents the objective function. Then Dantzig's simplex method updates the dictionary as follows: x_1 is chosen as an entering variable from (12), and the second dictionary is calculated as

w = 1 - y_1 + y_2,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2.

The third dictionary is

w = 2 + y_1 - x_2,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2.

The 4-th dictionary is

w = 3 - x_1 - x_2,
y_1 = 1 - x_1,   (13)
y_2 = 3 - 2x_1 - x_2.

This dictionary is optimal. We get an optimal solution (x_1, x_2) = (0, 0), (y_1, y_2) = (1, 3) and an optimal value 3. The x-parts of the BFSs generated are v^3, v^2, v^1, v^0, which is just the opposite order to the case of the problem (4). We observe that

- the number of iterations is 3 = 2^2 - 1,
- any reduced cost of every dictionary is 1 or -1,
- the objective function value increases by 1 at each iteration.

Theorem 3.1 When we generate a sequence of BFSs by Dantzig's simplex method for the problem (4) from an initial BFS

x^0 = 0, y^0 = (1, 2^2 - 1, ..., 2^m - 1)^T,

then

- the number of iterations is 2^m - 1,
- any reduced cost of every dictionary is 1 or -1,
- the objective function value increases by 1 at each iteration.

When we generate a sequence of BFSs by Dantzig's simplex method for the problem (11) from an initial BFS

x^0 = (0, 0, ..., 0, 2^m - 1)^T, y^0 = (1, 2^2 - 1, ..., 2^{m-1} - 1, 0)^T,

then

- the number of iterations is 2^m - 1,
- any reduced cost of every dictionary is 1 or -1,
- the objective function value increases by 1 at each iteration.

We can prove these results by induction. We get the results for m = 1, 2 from the discussion above. Here we show that the results are also true for m = 3. For ease of understanding the discussion, we show all the iterations of Dantzig's simplex method for solving the problem (4), when m = 3 and m = 4, in Section 5. When m = 3, the first dictionary is

z = x_1 + x_2 + x_3,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,   (14)
y_3 = 7 - 2x_1 - 2x_2 - x_3.

Note that the nonbasic variable x_3 does not appear except in the objective function and the last constraint in (14), and that the reduced cost of x_3 is 1. So the variable x_3 is not chosen as an entering variable until the reduced costs of the other nonbasic variables become less than 1. While the variable x_3 is nonbasic, the dictionaries except for the last constraint are updated just as in the case of m = 2. Since any reduced cost for m = 2 is 1 or -1, all the reduced costs become less than 1 only when an optimal BFS for m = 2 is obtained. Thus the first 2^2 - 1 = 3 iterations are just like the case of m = 2, and the (x_1, x_2)^T-part of the BFSs moves in the order v^0, v^1, v^2, v^3. After 3 iterations, we get the 4-th dictionary

z = 3 - x_1 - y_2 + x_3,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,   (15)
y_3 = 1 + 2x_1 + 2y_2 - x_3.

We can see that this dictionary, except for the last row and the last column, is just the same as the dictionary (10). At this dictionary, x_3 is chosen as an
entering variable, and the 5-th dictionary is

z = 4 + x_1 + y_2 - y_3,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,   (16)
x_3 = 1 + 2x_1 + 2y_2 - y_3.

We see that the objective function value increases by 1 and each reduced cost is 1 or -1 in this case too. The nonbasic variable y_3 does not appear except in the objective function and the last constraint in (16), and the reduced cost of y_3 is -1. So the variable y_3 is not chosen as an entering variable until the algorithm stops. Hence the iterations are just like the case of m = 2. Note that the dictionary (16), except for the last column, the last row, and the constant in z, is the same as the dictionary (12). So the (x_1, x_2)^T-part of the BFSs generated by Dantzig's simplex method starting from (16) moves in the order v^3, v^2, v^1, v^0 (see Figure 2, where the vertices generated are shown as u^0, u^1, ..., u^7). After 3 iterations, we get the 8-th dictionary

z = 7 - x_1 - x_2 - y_3,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
x_3 = 7 - 2x_1 - 2x_2 - y_3.

This dictionary is optimal and the optimal value is 2^3 - 1 = 7. It is easy to check that the x-parts of the iterates are

u^0 = (0, 0, 0)^T, u^1 = (1, 0, 0)^T, u^2 = (1, 1, 0)^T, u^3 = (0, 3, 0)^T,
u^4 = (0, 3, 1)^T, u^5 = (1, 1, 3)^T, u^6 = (1, 0, 5)^T, u^7 = (0, 0, 7)^T,

whose objective function values are 0, 1, 2, 3, 4, 5, 6, 7 in this order. Hence all the results of the problem (4) in the theorem are true for m = 3. In a similar way, we can also prove that all the results of the problem (11) in the theorem are true for m = 3. We will prove the theorem in the next section. Here we state the next two corollaries, which are obtained from the second and third results for each problem in the theorem.
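The feasibility of the listed vertices and the unit increase of the objective are easy to check mechanically; a quick sketch for m = 3 (the helper name `feasible` is ours):

```python
def feasible(x):
    """Check x >= 0 against the inequality form of the variant (4)."""
    m = len(x)
    if any(v < 0 for v in x):
        return False
    # constraint k+1: 2 * sum_{i<k} x_i + x_k <= 2^(k+1) - 1
    return all(2 * sum(x[:k]) + x[k] <= 2 ** (k + 1) - 1 for k in range(m))

# the eight vertices u^0, ..., u^7 from the text
us = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 3, 0),
      (0, 3, 1), (1, 1, 3), (1, 0, 5), (0, 0, 7)]
print(all(feasible(u) for u in us))  # every listed vertex is feasible
print([sum(u) for u in us])          # objective values 0, 1, ..., 7
```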
Figure 2: Vertices generated by Dantzig's simplex method when m = 3

Corollary 3.1 The sequence of BFSs generated by the simplex method with the minimum index rule for the problem (4) is the same as that generated by Dantzig's rule.

Corollary 3.2 Let {(v^i, u^i) ∈ R^{2m} : i = 0, 1, 2, ..., 2^m - 1} be the sequence of BFSs generated by Dantzig's simplex method for the problem (4) from the initial BFS x^0 = 0, y^0 = (1, 2^2 - 1, ..., 2^m - 1)^T. Then the sequence of BFSs generated by Dantzig's simplex method for the problem (11) from the initial BFS x^0 = (0, 0, ..., 0, 2^m - 1)^T, y^0 = (1, 2^2 - 1, ..., 2^{m-1} - 1, 0)^T is {(v^{2^m - 1 - i}, u^{2^m - 1 - i}) ∈ R^{2m} : i = 0, 1, 2, ..., 2^m - 1}.

4 Proof of Theorem 3.1

In this section, we prove Theorem 3.1. For ease of understanding the proof, we show in Section 5 all the iterations of Dantzig's simplex method for solving the problem (4) when m = 3 and m = 4.
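Theorem 3.1 can also be checked numerically for small m by running a dictionary simplex with the paper's pivoting rule (maximum reduced cost, ties broken by the minimum subscript). The sketch below is our own illustrative implementation, not code from the paper:

```python
import numpy as np

def standard_form(m):
    """Data of (4): variables x_1..x_m then slacks y_1..y_m; maximize c.x s.t. Ax = b."""
    A = np.hstack([np.tril(np.full((m, m), 2.0), -1) + np.eye(m), np.eye(m)])
    b = np.array([2.0 ** k - 1.0 for k in range(1, m + 1)])
    c = np.concatenate([np.ones(m), np.zeros(m)])
    return A, b, c

def dantzig_simplex(A, b, c, basis):
    """Maximizing revised simplex with Dantzig's rule; ties are broken by
    the minimum subscript (column j corresponds to subscript j % m)."""
    m, n = A.shape
    basis = list(basis)
    iterations = 0
    while True:
        B = A[:, basis]
        xB = np.linalg.solve(B, b)
        duals = np.linalg.solve(B.T, c[basis])
        reduced = c - A.T @ duals
        cands = [j for j in range(n) if j not in basis and reduced[j] > 1e-9]
        if not cands:                          # optimal dictionary reached
            x = np.zeros(n)
            x[basis] = xB
            return x, float(c @ x), iterations
        best = max(reduced[j] for j in cands)
        enter = min((j for j in cands if reduced[j] > best - 1e-9),
                    key=lambda j: j % m)       # minimum-subscript tie-break
        d = np.linalg.solve(B, A[:, enter])
        leave = min((i for i in range(m) if d[i] > 1e-9),
                    key=lambda i: xB[i] / d[i])  # ratio test (nondegenerate)
        basis[leave] = enter
        iterations += 1
```

Starting from the slack basis (y_1, ..., y_m), this takes 2^m - 1 iterations and reaches the optimal value 2^m - 1 for small m, matching the theorem.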
Proof. Assume that all the results in Theorem 3.1 are true for m = k. When m = k + 1, the first dictionary of the problem (4) is

z = x_1 + ... + x_k + x_{k+1},
y_1 = 1 - x_1,
...
y_k = (2^k - 1) - 2x_1 - ... - x_k,   (17)
y_{k+1} = (2^{k+1} - 1) - 2x_1 - ... - 2x_k - x_{k+1}.

Note that the nonbasic variable x_{k+1} does not appear except in the objective function and the last constraint in (17), and that the reduced cost of x_{k+1} is 1. So the variable x_{k+1} is not chosen as an entering variable until the reduced costs of the other nonbasic variables become less than 1. While the variable x_{k+1} is nonbasic, the dictionaries except for the last constraint are updated just as in the case of m = k. Since any reduced cost for m = k is 1 or -1, all the reduced costs become less than 1 only when an optimal BFS for m = k is obtained. Thus the first 2^k - 1 iterations are just like the case of m = k. After 2^k - 1 iterations, the basic variables are y_1, ..., y_{k-1}, x_k, y_{k+1}, and we get the 2^k-th dictionary

z = (2^k - 1) - x_1 - ... - y_k + x_{k+1},
y_1 = 1 - x_1,
...
x_k = (2^k - 1) - 2x_1 - ... - y_k,   (18)
y_{k+1} = 1 + 2x_1 + ... + 2y_k - x_{k+1}.

This dictionary, except for the last row and the last column, is just the same as the dictionary of the optimal BFS for m = k. At this dictionary, x_{k+1} is chosen as an entering variable, and the (2^k + 1)-th dictionary is

z = 2^k + x_1 + ... + y_k - y_{k+1},
y_1 = 1 - x_1,
...
x_k = (2^k - 1) - 2x_1 - ... - y_k,   (19)
x_{k+1} = 1 + 2x_1 + ... + 2y_k - y_{k+1}.

We see that the objective function value increases by 1, and each reduced cost is 1 or -1 in this case too. The nonbasic variable y_{k+1} does not appear except in the objective function and the last constraint in (19), and the reduced cost of y_{k+1} is -1. So the variable y_{k+1} is not chosen as an entering variable until the algorithm stops. Hence the iterations are just like the case of m = k.
The dictionary (19), except for the last column, the last row, and the constant in z, is the same as the initial dictionary of the problem (11) when m = k. Thus the next 2^k - 1 iterations are just like the case of the problem (11) when m = k. After these 2^k - 1 iterations, we get the 2^{k+1}-th dictionary

z = (2^{k+1} - 1) - x_1 - ... - x_k - y_{k+1},
y_1 = 1 - x_1,
...
y_k = (2^k - 1) - 2x_1 - ... - x_k,
x_{k+1} = (2^{k+1} - 1) - 2x_1 - ... - 2x_k - y_{k+1}.

This dictionary is optimal and the optimal value is 2^{k+1} - 1. It is easy to check that all the results of the problem (4) for m = k + 1 in the theorem are true. In a similar way, we can also prove that all the results of the problem (11) in the theorem are true when m = k + 1.

5 Iterations when m = 3 and m = 4

In this section, we show all the iterations of Dantzig's simplex method for solving the problem (4) when m = 3 and m = 4. When m = 3, the initial dictionary is

z = x_1 + x_2 + x_3,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
y_3 = 7 - 2x_1 - 2x_2 - x_3.

The second dictionary is

z = 1 - y_1 + x_2 + x_3,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
y_3 = 5 + 2y_1 - 2x_2 - x_3.

The third dictionary is

z = 2 + y_1 - y_2 + x_3,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
y_3 = 3 - 2y_1 + 2y_2 - x_3.
The 4-th dictionary is

z = 3 - x_1 - y_2 + x_3,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
y_3 = 1 + 2x_1 + 2y_2 - x_3.

The 5-th dictionary is

z = 4 + x_1 + y_2 - y_3,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
x_3 = 1 + 2x_1 + 2y_2 - y_3.

The 6-th dictionary is

z = 5 - y_1 + y_2 - y_3,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
x_3 = 3 - 2y_1 + 2y_2 - y_3.

The 7-th dictionary is

z = 6 + y_1 - x_2 - y_3,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
x_3 = 5 + 2y_1 - 2x_2 - y_3.

The 8-th dictionary is

z = 7 - x_1 - x_2 - y_3,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
x_3 = 7 - 2x_1 - 2x_2 - y_3.

This dictionary is optimal.

Next we show all the iterations of Dantzig's simplex method for solving the problem (4) when m = 4. The initial dictionary is

z = x_1 + x_2 + x_3 + x_4,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
y_3 = 7 - 2x_1 - 2x_2 - x_3,
y_4 = 15 - 2x_1 - 2x_2 - 2x_3 - x_4.
The second dictionary is

z = 1 - y_1 + x_2 + x_3 + x_4,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
y_3 = 5 + 2y_1 - 2x_2 - x_3,
y_4 = 13 + 2y_1 - 2x_2 - 2x_3 - x_4.

The third dictionary is

z = 2 + y_1 - y_2 + x_3 + x_4,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
y_3 = 3 - 2y_1 + 2y_2 - x_3,
y_4 = 11 - 2y_1 + 2y_2 - 2x_3 - x_4.

The 4-th dictionary is

z = 3 - x_1 - y_2 + x_3 + x_4,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
y_3 = 1 + 2x_1 + 2y_2 - x_3,
y_4 = 9 + 2x_1 + 2y_2 - 2x_3 - x_4.

The 5-th dictionary is

z = 4 + x_1 + y_2 - y_3 + x_4,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
x_3 = 1 + 2x_1 + 2y_2 - y_3,
y_4 = 7 - 2x_1 - 2y_2 + 2y_3 - x_4.

The 6-th dictionary is

z = 5 - y_1 + y_2 - y_3 + x_4,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
x_3 = 3 - 2y_1 + 2y_2 - y_3,
y_4 = 5 + 2y_1 - 2y_2 + 2y_3 - x_4.

The 7-th dictionary is

z = 6 + y_1 - x_2 - y_3 + x_4,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
x_3 = 5 + 2y_1 - 2x_2 - y_3,
y_4 = 3 - 2y_1 + 2x_2 + 2y_3 - x_4.
The 8-th dictionary is

z = 7 - x_1 - x_2 - y_3 + x_4,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
x_3 = 7 - 2x_1 - 2x_2 - y_3,
y_4 = 1 + 2x_1 + 2x_2 + 2y_3 - x_4.

The 9-th dictionary is

z = 8 + x_1 + x_2 + y_3 - y_4,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
x_3 = 7 - 2x_1 - 2x_2 - y_3,
x_4 = 1 + 2x_1 + 2x_2 + 2y_3 - y_4.

The 10-th dictionary is

z = 9 - y_1 + x_2 + y_3 - y_4,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
x_3 = 5 + 2y_1 - 2x_2 - y_3,
x_4 = 3 - 2y_1 + 2x_2 + 2y_3 - y_4.

The 11-th dictionary is

z = 10 + y_1 - y_2 + y_3 - y_4,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
x_3 = 3 - 2y_1 + 2y_2 - y_3,
x_4 = 5 + 2y_1 - 2y_2 + 2y_3 - y_4.

The 12-th dictionary is

z = 11 - x_1 - y_2 + y_3 - y_4,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
x_3 = 1 + 2x_1 + 2y_2 - y_3,
x_4 = 7 - 2x_1 - 2y_2 + 2y_3 - y_4.

The 13-th dictionary is

z = 12 + x_1 + y_2 - x_3 - y_4,
y_1 = 1 - x_1,
x_2 = 3 - 2x_1 - y_2,
y_3 = 1 + 2x_1 + 2y_2 - x_3,
x_4 = 9 + 2x_1 + 2y_2 - 2x_3 - y_4.
The 14-th dictionary is

z = 13 - y_1 + y_2 - x_3 - y_4,
x_1 = 1 - y_1,
x_2 = 1 + 2y_1 - y_2,
y_3 = 3 - 2y_1 + 2y_2 - x_3,
x_4 = 11 - 2y_1 + 2y_2 - 2x_3 - y_4.

The 15-th dictionary is

z = 14 + y_1 - x_2 - x_3 - y_4,
x_1 = 1 - y_1,
y_2 = 1 + 2y_1 - x_2,
y_3 = 5 + 2y_1 - 2x_2 - x_3,
x_4 = 13 + 2y_1 - 2x_2 - 2x_3 - y_4.

The 16-th dictionary is

z = 15 - x_1 - x_2 - x_3 - y_4,
y_1 = 1 - x_1,
y_2 = 3 - 2x_1 - x_2,
y_3 = 7 - 2x_1 - 2x_2 - x_3,
x_4 = 15 - 2x_1 - 2x_2 - 2x_3 - y_4.

This dictionary is optimal.

Acknowledgment

This research is supported in part by Grant-in-Aid for Science Research (A) 22438 of the Japan Society for the Promotion of Science.

References

[1] G. B. Dantzig, Linear Programming and Extensions, Princeton University Press, Princeton, New Jersey, 1963.

[2] T. Kitahara and S. Mizuno, A Bound for the Number of Different Basic Solutions Generated by the Simplex Method, technical paper, available at http://www.me.titech.ac.jp/ mizu lab/simplexmethod/, 2010.

[3] T. Kitahara and S. Mizuno, Klee-Minty's LP and Upper Bounds for Dantzig's Simplex Method, technical paper, available at http://www.me.titech.ac.jp/ mizu lab/simplexmethod/, 2010.
[4] V. Klee and G. J. Minty, How good is the simplex algorithm?, in O. Shisha, editor, Inequalities III, pp. 159-175, Academic Press, New York, NY, 1972.

[5] Y. Ye, The Simplex and Policy Iteration Methods are Strongly Polynomial for the Markov Decision Problem with a Fixed Discount Rate, technical paper, available at http://www.stanford.edu/ yyye/, 2010.