Computational Intractability — Revised 2011/6/6

Lecture 5
Professor: David Avis    Scribes: Ma Jiangbo, Atsuki Nagao

1 Duality

The purpose of this lecture is to introduce duality, an important concept in linear programming. One of its main uses is to give a certificate of optimality. We follow the approach of Chvátal (1983).

1.1 Certificates of optimality

Recall our previous example:

    max z = 5x_1 + 6x_2 + 9x_3
    s.t.    x_1 + 2x_2 + 3x_3 ≤ 5
            x_1 +  x_2 + 2x_3 ≤ 3
            x_1, x_2, x_3 ≥ 0                                        (1)

The problem has an initial dictionary:

    x_4 = 5 - x_1 - 2x_2 - 3x_3 ≥ 0
    x_5 = 3 - x_1 -  x_2 - 2x_3 ≥ 0
    z   =     5x_1 + 6x_2 + 9x_3                                     (2)

It has optimum solution x_1 = 1, x_2 = 2, x_3 = 0, z = 17, obtained from the final dictionary:

    x_1 = 1 -  x_3 + x_4 - 2x_5
    x_2 = 2 -  x_3 - x_4 +  x_5
    z   = 17 - 2x_3 - x_4 - 4x_5                                     (3)

Our original proof of optimality came from the fact that all coefficients in the final objective row are non-positive. In general c̄_j ≤ 0 for every j, as this is the stopping condition for the simplex method. Compare the objective row in (3) with that in the initial dictionary (2). Since they are equivalent, and since each slack variable appears only once in (2), we can obtain the final objective function by adding the equation for x_4 and four times the equation for x_5 to the initial objective function.
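
The certificate argument above is pure arithmetic, so it is easy to check mechanically. The following sketch (plain Python, using the data of example (1)) verifies that the multipliers y = (1, 4) produce a combined inequality that dominates the objective and bounds it by 17:

```python
# Data of the example LP (1): maximize c.x subject to A x <= b, x >= 0.
c = [5, 6, 9]
A = [[1, 2, 3],
     [1, 1, 2]]
b = [5, 3]

y = [1, 4]  # multipliers read off the final dictionary

# Combined constraint: (y_1*row1 + y_2*row2).x <= y_1*b_1 + y_2*b_2
combined = [y[0] * A[0][j] + y[1] * A[1][j] for j in range(3)]
bound = y[0] * b[0] + y[1] * b[1]

assert all(combined[j] >= c[j] for j in range(3))  # dominates the objective
assert bound == 17                                 # so z <= 17 for every feasible x

# The feasible point x = (1, 2, 0) attains the bound, hence it is optimal.
x = [1, 2, 0]
z = sum(c[j] * x[j] for j in range(3))
print(combined, bound, z)
```

Running it confirms the combined coefficients (5, 6, 11) and the matching primal value z = 17.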

Or we may work directly with the original inequalities in (1). If we add the first constraint to four times the second constraint, we get:

     x_1 + 2x_2 + 3x_3 ≤  5
    4x_1 + 4x_2 + 8x_3 ≤ 12
    ------------------------
    5x_1 + 6x_2 + 11x_3 ≤ 17                                         (4)

All of these inequalities are valid for every feasible x_1, x_2, x_3. This gives us a proof of optimality for the LP, since for every feasible x_1, x_2, x_3 we have

    z = 5x_1 + 6x_2 + 9x_3 ≤ 5x_1 + 6x_2 + 11x_3 ≤ 17               (5)

Let y_1 be the multiplier for the first constraint, and y_2 the multiplier for the second. We used y_1 = 1 and y_2 = 4. Observe that, using the coefficients c̄_j in the objective row of the final dictionary (3), we have

    y_1 = -c̄_4,    y_2 = -c̄_5.

1.2 Formulation of the dual

In the previous section we hinted that the certificate provided by the multipliers y_i can be obtained from the optimum dictionary. In this section we give an independent way of doing this. We may manipulate inequalities by one of the following two operations:

1. multiply by a non-negative number;
2. add inequalities together.

We will now formulate the properties that the y_i should satisfy. The idea is that they should give a way to combine the inequalities in the primal so as to give an upper bound on z. So we must have:

(i) y_1, y_2 ≥ 0, since otherwise they are not valid multipliers.

(ii) After combining the constraints, the result must bound the objective z, so:

    y_1 + y_2 ≥ 5,    2y_1 + y_2 ≥ 6,    3y_1 + 2y_2 ≥ 9.

Here are some multipliers satisfying (i) and (ii), and the upper bounds on z that they give:

    y_1 = 5,  y_2 = 0   →   z ≤ 25
    y_1 = 0,  y_2 = 6   →   z ≤ 18
    y_1 = 1,  y_2 = 4   →   z ≤ 17

The bound on z is given by 5y_1 + 3y_2. Since we want the lowest bound possible, we have one further condition:

(iii) minimize w = 5y_1 + 3y_2.

Putting all this together, we have formulated the dual problem, which is also a linear program:
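
Conditions (i)–(iii) are easy to explore computationally: any pair (y_1, y_2) satisfying (i) and (ii) certifies the upper bound 5y_1 + 3y_2 on z. A sketch that checks the three candidate multiplier pairs from the table above (plain Python; the helper names are our own):

```python
c = [5, 6, 9]
A = [[1, 2, 3], [1, 1, 2]]
b = [5, 3]

def is_valid_multiplier(y):
    """Conditions (i) and (ii): y >= 0 and A^T y >= c componentwise."""
    if any(yi < 0 for yi in y):
        return False
    return all(A[0][j] * y[0] + A[1][j] * y[1] >= c[j] for j in range(3))

def bound(y):
    """The upper bound w = b.y that a valid y certifies."""
    return b[0] * y[0] + b[1] * y[1]

candidates = [(5, 0), (0, 6), (1, 4)]
bounds = [bound(y) for y in candidates if is_valid_multiplier(y)]
print(bounds)   # the best (smallest) bound among these is 17
```

All three candidates pass the feasibility check, reproducing the bounds 25, 18, and 17 from the table.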

    min w = 5y_1 + 3y_2
    s.t.     y_1 +  y_2 ≥ 5
            2y_1 +  y_2 ≥ 6
            3y_1 + 2y_2 ≥ 9
            y_1, y_2 ≥ 0                                             (6)

This has optimum solution y_1 = 1, y_2 = 4, w = 17.

Using exactly the same logic, we may define a dual for every LP in standard form. The original LP is called the primal.

Primal:

    max z = Σ_{j=1}^n c_j x_j                                        (7)
    s.t.   Σ_{j=1}^n a_{ij} x_j ≤ b_i      (i = 1,...,m)             (8)
           x_j ≥ 0                         (j = 1,...,n)

Dual:

    min w = Σ_{i=1}^m b_i y_i                                        (9)
    s.t.   Σ_{i=1}^m a_{ij} y_i ≥ c_j      (j = 1,...,n)             (10)
           y_i ≥ 0                         (i = 1,...,m)

If x_1,...,x_n satisfy the constraints of the primal, they are called primal feasible. Similarly, y_1,...,y_m are dual feasible if they satisfy the constraints of the dual.

1.3 Duality theorems

By construction, the purpose of the dual is to provide an upper bound on the value of the primal solution. This idea is formalized in the weak duality theorem.

Theorem 1 (Weak Duality Theorem). Let x_1, x_2,...,x_n be primal feasible and let y_1, y_2,...,y_m be dual feasible. Then

    z = Σ_{j=1}^n c_j x_j ≤ Σ_{i=1}^m b_i y_i = w.                   (11)
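
Mechanically, passing from the primal (7)–(8) to the dual (9)–(10) just transposes the data: the dual has one variable per primal constraint, one constraint per primal variable, and b and c trade places. A small illustrative helper (plain Python; the function name `take_dual` is our own):

```python
def take_dual(c, A, b):
    """Dual of a standard-form LP: max c.x s.t. A x <= b, x >= 0.

    Returns (b, A_transpose, c): the dual is min b.y s.t. A^T y >= c, y >= 0.
    """
    m, n = len(A), len(A[0])
    A_T = [[A[i][j] for i in range(m)] for j in range(n)]
    return b, A_T, c

# The example LP (1) from the lecture:
obj, rows, rhs = take_dual([5, 6, 9], [[1, 2, 3], [1, 1, 2]], [5, 3])
print(obj)   # dual objective coefficients
print(rows)  # dual constraint rows (A transposed)
print(rhs)   # dual right-hand sides
```

The output reproduces exactly the data of the dual (6): objective (5, 3), constraint rows (1,1), (2,1), (3,2), and right-hand sides 5, 6, 9.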

Proof. Given feasible x_1,...,x_n and y_1,...,y_m, we have

    Σ_{j=1}^n a_{ij} x_j ≤ b_i    for each i = 1,...,m,

and since y_i ≥ 0,

    Σ_{j=1}^n a_{ij} x_j y_i ≤ b_i y_i.

Summing over all i:

    Σ_{i=1}^m Σ_{j=1}^n a_{ij} x_j y_i ≤ Σ_{i=1}^m b_i y_i = w.

Similarly,

    Σ_{i=1}^m a_{ij} y_i ≥ c_j    for each j = 1,...,n,

and since x_j ≥ 0,

    Σ_{i=1}^m a_{ij} y_i x_j ≥ c_j x_j.

Summing over j:

    Σ_{j=1}^n Σ_{i=1}^m a_{ij} y_i x_j ≥ Σ_{j=1}^n c_j x_j = z.

Combining, and noting that we can reverse the order of summation in any finite sum, we have

    z = Σ_{j=1}^n c_j x_j ≤ Σ_{i=1}^m Σ_{j=1}^n a_{ij} x_j y_i ≤ Σ_{i=1}^m b_i y_i = w.    (12)

It follows immediately that if equality holds throughout in (12), then the corresponding solutions are both optimal. The strong duality theorem states that this is always the case when an LP has an optimal solution.

Theorem 2 (Strong Duality Theorem). If a linear programming problem has an optimal solution, then so does its dual, and the respective optimal objective values are equal.

Proof. Let x* be an optimal solution with objective value z* = c^T x*. We will exhibit a feasible dual solution y* with b^T y* = c^T x*, so that optimality of both follows from the weak duality theorem. Consider the final optimal dictionary produced by the simplex method, and let c̄_j, j = 1, 2,...,n+m, be the coefficient of x_j in its objective row. We set

    y*_i = -c̄_{n+i},    i = 1,...,m,
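
The chain of inequalities in (12) is easy to evaluate numerically. The sketch below computes all three quantities for the example LP of Section 1.1 at the feasible pair x = (1, 2, 0), y = (1, 4) (plain Python):

```python
c = [5, 6, 9]
A = [[1, 2, 3], [1, 1, 2]]
b = [5, 3]
x = [1, 2, 0]   # primal feasible
y = [1, 4]      # dual feasible

z = sum(c[j] * x[j] for j in range(3))                                    # c.x
middle = sum(A[i][j] * x[j] * y[i] for i in range(2) for j in range(3))   # y.(A x)
w = sum(b[i] * y[i] for i in range(2))                                    # b.y

# Weak duality chain (12): z <= y.(A x) <= w
assert z <= middle <= w
print(z, middle, w)
```

Here all three values equal 17, i.e. equality holds throughout (12), which by the remark above certifies that both x and y are optimal.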

and show that the following are satisfied:

    y*_i ≥ 0                               (i = 1,...,m)             (13)
    Σ_{i=1}^m a_{ij} y*_i ≥ c_j            (j = 1,...,n)             (14)
    z* = Σ_{j=1}^n c_j x*_j = Σ_{i=1}^m b_i y*_i                     (15)

First, it is clear that inequality (13) is true, since c̄_{n+i} ≤ 0 is the stopping condition of the simplex method. Next we observe that the objective row in the final dictionary is equivalent to that in the initial dictionary. So

    z = Σ_{j=1}^n c_j x_j = z* + Σ_{k=1}^{n+m} c̄_k x_k               (16)

where c̄_k ≤ 0, and c̄_k = 0 for variables basic in the final dictionary. Using this, and substituting x_{n+i} = b_i - Σ_{j=1}^n a_{ij} x_j (from the initial dictionary) and c̄_{n+i} = -y*_i,

    z = Σ_{j=1}^n c_j x_j
      = z* + Σ_{j=1}^n c̄_j x_j + Σ_{i=1}^m c̄_{n+i} x_{n+i}
      = z* + Σ_{j=1}^n c̄_j x_j - Σ_{i=1}^m y*_i (b_i - Σ_{j=1}^n a_{ij} x_j)      (17)
      = (z* - Σ_{i=1}^m y*_i b_i) + Σ_{j=1}^n (c̄_j + Σ_{i=1}^m a_{ij} y*_i) x_j.  (18)

Note that to get equation (18) we reverse the order of summation of the finite double sum. Since (18) holds identically in x_1,...,x_n, we may conclude two things. First, since there is no constant term on the left-hand side, the constant on the right-hand side must be zero (consider setting x_j = 0 for each j). So

    z* = Σ_{i=1}^m b_i y*_i,                                         (19)

showing (15). Second, we may equate the coefficients of x_j (equivalently, set x_j = 1 and all other variables to zero) to see that

    c_j = c̄_j + Σ_{i=1}^m a_{ij} y*_i.

Noting again that c̄_j ≤ 0 for each j, we have

    Σ_{i=1}^m a_{ij} y*_i ≥ c_j    for each j = 1,...,n,             (20)

and so (14) holds.
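
For the example of Section 1.1, the recipe y*_i = -c̄_{n+i} can be read straight off the final objective row z = 17 - 2x_3 - x_4 - 4x_5. A quick check (plain Python):

```python
# Objective-row coefficients cbar_1,...,cbar_5 in the final dictionary (3);
# x_1 and x_2 are basic there, hence their coefficients are 0.
c_bar = [0, 0, -2, -1, -4]
n, m = 3, 2

# Strong-duality recipe: y*_i = -cbar_{n+i}
y_star = [-c_bar[n + i] for i in range(m)]

# (15): the dual objective value b.y* equals the primal optimum z* = 17
b = [5, 3]
w = sum(b[i] * y_star[i] for i in range(m))
print(y_star, w)
```

This recovers y* = (1, 4) with w = 17, matching the optimal dual solution of (6).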

1.4 Outcomes for primal/dual problems

The weak and strong duality theorems tell us which combinations of outcomes are possible for a primal/dual pair. This is summarized in the table below (× : cannot happen, ✓ : can happen):

                           Dual
                  optimal  infeasible  unbounded
    Primal
    optimal         ✓          ×           ×
    infeasible      ×          ✓           ✓
    unbounded       ×          ✓           ×

In particular we may conclude that:

    primal unbounded   ⇒  dual infeasible
    primal infeasible  ⇒  dual infeasible or unbounded

References

[1] V. Chvátal, Linear Programming (W. H. Freeman and Company, New York, 1983).