LECTURE 10 LECTURE OUTLINE

- Min Common / Max Crossing Th. III
- Nonlinear Farkas Lemma / Linear Constraints
- Linear Programming Duality
- Convex Programming Duality
- Optimality Conditions

Reading: Sections 4.5, 5.1, 5.2, 5.3.1, 5.3.2

Recall the MC/MC Theorem II: If $-\infty < w^*$ and $0 \in \mathrm{ri}(D)$, where
$$D = \{u \mid \text{there exists } w \in \mathbb{R} \text{ with } (u, w) \in \overline{M}\},$$
then $q^* = w^*$ and there exists $\mu$ such that $q(\mu) = q^*$.

[Figure: the min common / max crossing sets $M$, $\overline{M}$, the set $D$, the values $w^*$ and $q^*$, and a max crossing hyperplane with normal $(\mu, 1)$.]

All figures are courtesy of Athena Scientific, and are used with permission.

MC/MC TH. III - POLYHEDRAL

Consider the MC/MC problems, and assume that $-\infty < w^*$ and:

(1) $\overline{M}$ is a "horizontal translation" of $\tilde{M}$ by $-P$,
$$\overline{M} = \tilde{M} - \{(u, 0) \mid u \in P\},$$
where $P$ is polyhedral and $\tilde{M}$ is convex.

[Figure: the set $\tilde{M}$, the polyhedral set $P$, the translated set $\overline{M} = \tilde{M} - \{(u, 0) \mid u \in P\}$, and the crossing function $q(\mu)$ for a hyperplane with normal $(\mu, 1)$.]

(2) We have $\mathrm{ri}(\tilde{D}) \cap P \neq \emptyset$, where
$$\tilde{D} = \{u \mid \text{there exists } w \in \mathbb{R} \text{ with } (u, w) \in \tilde{M}\}.$$

Then $q^* = w^*$, there is a max crossing solution, and all max crossing solutions $\mu$ satisfy $\mu' d \le 0$ for all $d \in R_P$ (the recession cone of $P$).

Comparison with Th. II: Since $D = \tilde{D} - P$, the condition $0 \in \mathrm{ri}(D)$ of Theorem II is equivalent to $\mathrm{ri}(\tilde{D}) \cap \mathrm{ri}(P) \neq \emptyset$.

PROOF OF MC/MC TH. III

Consider the disjoint convex sets
$$C_1 = \{(u, v) \mid v > w \text{ for some } (u, w) \in \tilde{M}\}, \qquad C_2 = \{(u, w^*) \mid u \in P\}$$
[$u \in P$ and $(u, w) \in \tilde{M}$ with $w^* > w$ would contradict the definition of $w^*$].

[Figure: the sets $C_1$ and $C_2$, the set $P$, and a separating hyperplane with normal $(\mu, \beta)$.]

Since $C_2$ is polyhedral, there exists a separating hyperplane not containing $C_1$, i.e., a $(\mu, \beta) \neq (0, 0)$ such that
$$\beta w^* + \mu' z \le \beta v + \mu' x, \qquad \forall (x, v) \in C_1, \ \forall z \in P,$$
$$\inf_{(x,v) \in C_1} \{\beta v + \mu' x\} < \sup_{(x,v) \in C_1} \{\beta v + \mu' x\}.$$

Since $(0, 1)$ is a direction of recession of $C_1$, we see that $\beta \ge 0$. Because of the relative interior point assumption, $\beta \neq 0$, so we may assume that $\beta = 1$.

PROOF (CONTINUED)

Hence,
$$w^* + \mu' z \le \inf_{(u,v) \in C_1} \{v + \mu' u\}, \qquad \forall z \in P,$$
so that
$$w^* \le \inf_{(u,v) \in C_1,\, z \in P} \{v + \mu'(u - z)\} = \inf_{(u,v) \in \overline{M}} \{v + \mu' u\} = q(\mu).$$

Using $q^* \le w^*$ (weak duality), we have $q(\mu) = q^* = w^*$.

Proof that all max crossing solutions $\mu$ satisfy $\mu' d \le 0$ for all $d \in R_P$: this follows from
$$q(\mu) = \inf_{(u,v) \in C_1,\, z \in P} \{v + \mu'(u - z)\},$$
since $q(\mu) = -\infty$ if $\mu' d > 0$ for some $d \in R_P$. Q.E.D.

Geometric intuition: every $(-d, 0)$ with $d \in R_P$ is a direction of recession of $\overline{M}$.

MC/MC TH. III - A SPECIAL CASE

Consider the MC/MC framework, and assume:

(1) For a convex function $f : \mathbb{R}^m \to (-\infty, \infty]$, an $r \times m$ matrix $A$, and a vector $b \in \mathbb{R}^r$:
$$\overline{M} = \{(u, w) \mid Ax - b \le u \text{ for some } (x, w) \in \mathrm{epi}(f)\},$$
so $\overline{M} = \tilde{M} + \{(u, 0) \mid u \ge 0\}$ (a translation by the positive orthant), where
$$\tilde{M} = \{(Ax - b, w) \mid (x, w) \in \mathrm{epi}(f)\}.$$

[Figure: $\mathrm{epi}(f)$, the map $(x, w) \mapsto (Ax - b, w)$, the sets $\tilde{M}$ and $\overline{M} = \mathrm{epi}(p)$, and the crossing function $q(\mu)$, where $p(u) = \inf_{Ax - b \le u} f(x)$.]

(2) There is an $x \in \mathrm{ri}(\mathrm{dom}(f))$ such that $Ax - b \le 0$.

Then $q^* = w^*$ and there is a $\mu \ge 0$ with $q(\mu) = q^*$.

Also $\overline{M} = \mathrm{epi}(p)$, where
$$p(u) = \inf_{Ax - b \le u} f(x).$$
We have
$$w^* = p(0) = \inf_{Ax - b \le 0} f(x).$$

NONLINEAR FARKAS LEMMA - POLYHEDRAL ASSUMPTION

Let $X \subset \mathbb{R}^n$ be convex, let $f : X \to \mathbb{R}$ be convex, and let $g_j : \mathbb{R}^n \to \mathbb{R}$, $j = 1, \ldots, r$, be linear, so that $g(x) = Ax - b$ for some $A$ and $b$. Assume that
$$f(x) \ge 0, \qquad \forall x \in X \text{ with } Ax - b \le 0.$$
Let
$$Q^* = \{\mu \mid \mu \ge 0,\ f(x) + \mu'(Ax - b) \ge 0,\ \forall x \in X\}.$$
Assume that there exists a vector $\bar{x} \in \mathrm{ri}(X)$ such that $A\bar{x} - b \le 0$. Then $Q^*$ is nonempty.

Proof: As before, apply the special case of MC/MC Th. III of the preceding slide, using the fact $w^* \ge 0$, which is implied by the assumption.

[Figure: the set $\overline{M} = \{(u, w) \mid Ax - b \le u \text{ for some } (x, w) \in \mathrm{epi}(f)\}$, the points $(Ax - b, f(x))$ for $x \in X$, the min common point $(0, w^*)$, the set $D$, and a crossing hyperplane with normal $(\mu, 1)$.]

(LINEAR) FARKAS LEMMA

Let $A$ be an $m \times n$ matrix and $c \in \mathbb{R}^m$. The system $Ay = c$, $y \ge 0$ has a solution if and only if
$$A'x \le 0 \ \implies \ c'x \le 0. \qquad (*)$$

Alternative/equivalent statement: If $P = \mathrm{cone}\{a_1, \ldots, a_n\}$, where $a_1, \ldots, a_n$ are the columns of $A$, then $P = (P^*)^*$ (Polar Cone Theorem).

Proof: If $y \in \mathbb{R}^n$ is such that $Ay = c$, $y \ge 0$, then $y'A'x = c'x$ for all $x \in \mathbb{R}^m$, which implies Eq. (*).

Conversely, apply the Nonlinear Farkas Lemma with $f(x) = -c'x$, $g(x) = A'x$, and $X = \mathbb{R}^m$. Condition (*) implies the existence of $\mu \ge 0$ such that
$$-c'x + \mu' A' x \ge 0, \qquad \forall x \in \mathbb{R}^m,$$
or equivalently
$$(A\mu - c)' x \ge 0, \qquad \forall x \in \mathbb{R}^m,$$
i.e., $A\mu = c$.
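As a quick sanity check (not part of the original slides), the Farkas alternative can be tested numerically: whether $Ay = c$, $y \ge 0$ is solvable is a linear feasibility problem that scipy.optimize.linprog can decide. A minimal sketch with illustrative data:

```python
# Farkas lemma, numerically: exactly one of
#   (I)  Ay = c, y >= 0        (II) A'x <= 0, c'x > 0
# holds. Decide (I) by solving a feasibility LP with zero objective.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])      # columns a_1, a_2, a_3
c_in = np.array([2.0, 3.0])          # lies in cone{a_1, a_2, a_3}
c_out = np.array([-1.0, 0.0])        # lies outside the cone

for c in (c_in, c_out):
    res = linprog(np.zeros(3), A_eq=A, b_eq=c, bounds=[(0, None)] * 3)
    print(c, "system (I) solvable:", res.success)
# For c_out, x = (-1, 0) is a Farkas certificate: A'x <= 0 and c'x = 1 > 0.
```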

LINEAR PROGRAMMING DUALITY

Consider the linear program

minimize $c'x$
subject to $a_j' x \ge b_j$, $j = 1, \ldots, r$,

where $c \in \mathbb{R}^n$, $a_j \in \mathbb{R}^n$, and $b_j \in \mathbb{R}$, $j = 1, \ldots, r$.

The dual problem is

maximize $b'\mu$
subject to $\sum_{j=1}^r a_j \mu_j = c$, $\mu \ge 0$.

Linear Programming Duality Theorem:

(a) If either $f^*$ or $q^*$ is finite, then $f^* = q^*$ and both the primal and the dual problem have optimal solutions.
(b) If $f^* = -\infty$, then $q^* = -\infty$.
(c) If $q^* = \infty$, then $f^* = \infty$.

Proof: (b) and (c) follow from weak duality. For part (a): If $f^*$ is finite, there is a primal optimal solution $x^*$, by existence of solutions of quadratic programs. Use Farkas Lemma to construct a dual feasible $\mu^*$ such that $c'x^* = b'\mu^*$ (next slide).
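The theorem can be checked numerically on a small instance (illustrative data, not from the slides). Writing the primal constraints $a_j'x \ge b_j$ as $-Ax \le -b$ puts both problems in scipy.optimize.linprog form:

```python
# Verify f* = q* for  min c'x s.t. a_j'x >= b_j  and its dual
#   max b'mu s.t. sum_j a_j mu_j = c, mu >= 0.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])           # rows are a_j'
b = np.array([2.0, 0.5, 0.5])
c = np.array([1.0, 2.0])

# Primal: a_j'x >= b_j  <=>  -Ax <= -b, with x free.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2)
# Dual: max b'mu = -min(-b'mu), subject to A'mu = c, mu >= 0.
dual = linprog(-b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 3)

print(primal.fun, -dual.fun)         # both 2.5: f* = q*
```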

PROOF OF LP DUALITY (CONTINUED)

[Figure: the feasible set, the constraint normals $a_1$, $a_2$ at a primal optimal solution $x^*$, the cone $D$ (translated to $x^*$), and $c = \mu_1 a_1 + \mu_2 a_2$.]

Let $x^*$ be a primal optimal solution, and let $J = \{j \mid a_j' x^* = b_j\}$. Then $c'y \ge 0$ for all $y$ in the cone of "feasible directions"
$$D = \{y \mid a_j' y \ge 0, \ \forall j \in J\}.$$
By Farkas Lemma, $c$ can be expressed as
$$c = \sum_{j=1}^r \mu_j^* a_j, \qquad \mu_j^* \ge 0, \ \forall j \in J, \qquad \mu_j^* = 0, \ \forall j \notin J,$$
for some scalars $\mu_j^*$. Taking inner product with $x^*$, we obtain $c'x^* = b'\mu^*$, which in view of $q^* \le f^*$ shows that $q^* = f^*$ and that $\mu^*$ is optimal.

LINEAR PROGRAMMING OPT. CONDITIONS

A pair of vectors $(x^*, \mu^*)$ form a primal and dual optimal solution pair if and only if $x^*$ is primal-feasible, $\mu^*$ is dual-feasible, and
$$\mu_j^* (b_j - a_j' x^*) = 0, \qquad \forall j = 1, \ldots, r. \qquad (*)$$

Proof: If $x^*$ is primal-feasible and $\mu^*$ is dual-feasible, then
$$b'\mu^* = \sum_{j=1}^r b_j \mu_j^* + \Big(c - \sum_{j=1}^r a_j \mu_j^*\Big)' x^* = c'x^* + \sum_{j=1}^r \mu_j^* (b_j - a_j' x^*). \qquad (**)$$
So if Eq. (*) holds, we have $b'\mu^* = c'x^*$, and weak duality implies that $x^*$ is primal optimal and $\mu^*$ is dual optimal.

Conversely, if $(x^*, \mu^*)$ form a primal and dual optimal solution pair, then $x^*$ is primal-feasible, $\mu^*$ is dual-feasible, and by the duality theorem, we have $b'\mu^* = c'x^*$. From Eq. (**), we obtain Eq. (*).
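Continuing the illustrative LP instance above (again, not from the slides), condition (*) can be verified directly from the computed primal and dual solutions:

```python
# Check complementary slackness mu_j*(b_j - a_j'x*) = 0 for the LP above.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 0.5, 0.5])
c = np.array([1.0, 2.0])

x = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2).x
mu = linprog(-b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 3).x

slack = A @ x - b                    # a_j'x* - b_j, zero on active constraints
print(np.allclose(mu * slack, 0, atol=1e-7))   # True
```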

CONVEX PROGRAMMING

Consider the problem

minimize $f(x)$
subject to $x \in X$, $g_j(x) \le 0$, $j = 1, \ldots, r$,

where $X \subset \mathbb{R}^n$ is convex, and $f : X \to \mathbb{R}$ and $g_j : X \to \mathbb{R}$ are convex. Assume that $f^*$ is finite.

Recall the connection with the max crossing problem in the MC/MC framework where $M = \mathrm{epi}(p)$ with
$$p(u) = \inf_{x \in X,\ g(x) \le u} f(x).$$

Consider the Lagrangian function
$$L(x, \mu) = f(x) + \mu' g(x),$$
the dual function
$$q(\mu) = \begin{cases} \inf_{x \in X} L(x, \mu) & \text{if } \mu \ge 0, \\ -\infty & \text{otherwise,} \end{cases}$$
and the dual problem of maximizing $\inf_{x \in X} L(x, \mu)$ over $\mu \ge 0$.

STRONG DUALITY THEOREM

Assume that $f^*$ is finite, and that one of the following two conditions holds:

(1) There exists $x \in X$ such that $g(x) < 0$.
(2) The functions $g_j$, $j = 1, \ldots, r$, are affine, and there exists $x \in \mathrm{ri}(X)$ such that $g(x) \le 0$.

Then $q^* = f^*$ and the set of optimal solutions of the dual problem is nonempty. Under condition (1) this set is also compact.

Proof: Replace $f(x)$ by $f(x) - f^*$, so that $f(x) - f^* \ge 0$ for all $x \in X$ with $g(x) \le 0$. Apply the Nonlinear Farkas Lemma. Then, there exist $\mu_j^* \ge 0$ such that
$$f^* \le f(x) + \sum_{j=1}^r \mu_j^* g_j(x), \qquad \forall x \in X.$$
It follows that
$$f^* \le \inf_{x \in X} \{f(x) + \mu^{*\prime} g(x)\} \le \inf_{x \in X,\ g(x) \le 0} f(x) = f^*.$$
Thus equality holds throughout, and we have
$$f^* = \inf_{x \in X} \Big\{f(x) + \sum_{j=1}^r \mu_j^* g_j(x)\Big\} = q(\mu^*).$$
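A worked instance where condition (1) (the Slater condition) holds, not from the slides: minimize $x_1^2 + x_2^2$ over $X = \mathbb{R}^2$ subject to $g(x) = 1 - x_1 - x_2 \le 0$. Here $L(x, \mu)$ is minimized at $x = (\mu/2, \mu/2)$, giving $q(\mu) = \mu - \mu^2/2$, so $\mu^* = 1$ and $q^* = 1/2 = f^*$:

```python
# Strong duality under the Slater condition, on a closed-form example:
#   f(x) = x1^2 + x2^2,  g(x) = 1 - x1 - x2  (g < 0 at x = (1, 1)).
# Minimizing L(x, mu) over x gives x(mu) = (mu/2, mu/2) and
# q(mu) = mu - mu^2/2, maximized over mu >= 0 at mu* = 1.
import numpy as np

mu = np.linspace(0.0, 2.0, 2001)     # grid over the dual variable
q = mu - mu**2 / 2.0
print(mu[np.argmax(q)], q.max())     # mu* = 1.0, q* = 0.5

x_star = np.array([0.5, 0.5])        # primal optimum (x1 = x2 by symmetry)
print(x_star @ x_star)               # f* = 0.5 = q*
```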

QUADRATIC PROGRAMMING DUALITY

Consider the quadratic program

minimize $\tfrac{1}{2} x'Qx + c'x$
subject to $Ax \le b$,

where $Q$ is positive definite.

If $f^*$ is finite, then $f^* = q^*$ and there exist both primal and dual optimal solutions, since the constraints are linear.

Calculation of the dual function:
$$q(\mu) = \inf_{x \in \mathbb{R}^n} \big\{\tfrac{1}{2} x'Qx + c'x + \mu'(Ax - b)\big\}.$$
The infimum is attained for $x = -Q^{-1}(c + A'\mu)$, and, after substitution and calculation,
$$q(\mu) = -\tfrac{1}{2} \mu' A Q^{-1} A' \mu - \mu'(b + A Q^{-1} c) - \tfrac{1}{2} c' Q^{-1} c.$$
The dual problem, after a sign change, is

minimize $\tfrac{1}{2} \mu' P \mu + t'\mu$
subject to $\mu \ge 0$,

where $P = A Q^{-1} A'$ and $t = b + A Q^{-1} c$.
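A numerical sanity check of this formula (illustrative random data, not in the slides): for any $\mu \ge 0$, the closed form should agree with $L(x(\mu), \mu)$ evaluated at the minimizer $x(\mu) = -Q^{-1}(c + A'\mu)$:

```python
# Check q(mu) = -1/2 mu'AQ^{-1}A'mu - mu'(b + AQ^{-1}c) - 1/2 c'Q^{-1}c
# against a direct evaluation of the Lagrangian at its minimizer.
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 3
B = rng.standard_normal((n, n))
Q = B @ B.T + n * np.eye(n)          # positive definite
A = rng.standard_normal((r, n))
b = rng.standard_normal(r)
c = rng.standard_normal(n)
mu = rng.random(r)                   # arbitrary mu >= 0

Qinv = np.linalg.inv(Q)
q_closed = (-0.5 * mu @ A @ Qinv @ A.T @ mu
            - mu @ (b + A @ Qinv @ c)
            - 0.5 * c @ Qinv @ c)

x_mu = -Qinv @ (c + A.T @ mu)        # minimizer of L(., mu) over x
q_direct = 0.5 * x_mu @ Q @ x_mu + c @ x_mu + mu @ (A @ x_mu - b)
print(np.isclose(q_closed, q_direct))   # True
```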

OPTIMALITY CONDITIONS

We have $q^* = f^*$, and the vectors $x^*$ and $\mu^*$ are optimal solutions of the primal and dual problems, respectively, if and only if $x^*$ is feasible, $\mu^* \ge 0$, and
$$x^* \in \arg\min_{x \in X} L(x, \mu^*), \qquad \mu_j^* g_j(x^*) = 0, \quad \forall j. \qquad (1)$$

Proof: If $q^* = f^*$, and $x^*, \mu^*$ are optimal, then
$$f^* = q^* = q(\mu^*) = \inf_{x \in X} L(x, \mu^*) \le L(x^*, \mu^*) = f(x^*) + \sum_{j=1}^r \mu_j^* g_j(x^*) \le f(x^*),$$
where the last inequality follows from $\mu_j^* \ge 0$ and $g_j(x^*) \le 0$ for all $j$. Since $x^*$ is primal optimal, $f(x^*) = f^*$, so equality holds throughout above, and (1) holds.

Conversely, if $x^*, \mu^*$ are feasible and (1) holds,
$$q(\mu^*) = \inf_{x \in X} L(x, \mu^*) = L(x^*, \mu^*) = f(x^*) + \sum_{j=1}^r \mu_j^* g_j(x^*) = f(x^*),$$
so $q^* = f^*$, and $x^*, \mu^*$ are optimal. Q.E.D.

QUADRATIC PROGRAMMING OPT. COND.

For the quadratic program

minimize $\tfrac{1}{2} x'Qx + c'x$
subject to $Ax \le b$,

where $Q$ is positive definite, $(x^*, \mu^*)$ is a primal and dual optimal solution pair if and only if:

- Primal and dual feasibility holds: $Ax^* \le b$, $\mu^* \ge 0$.
- Lagrangian optimality holds [$x^*$ minimizes $L(x, \mu^*)$ over $x \in \mathbb{R}^n$]. This yields
$$x^* = -Q^{-1}(c + A'\mu^*).$$
- Complementary slackness holds [$(Ax^* - b)'\mu^* = 0$]. It can be written as
$$\mu_j^* > 0 \ \implies \ a_j' x^* = b_j, \qquad \forall j = 1, \ldots, r,$$
where $a_j'$ is the $j$th row of $A$, and $b_j$ is the $j$th component of $b$.
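A minimal end-to-end sketch (illustrative data and a generic bound-constrained solver, not from the slides): solve the dual of the preceding slide with scipy.optimize.minimize over $\mu \ge 0$, recover $x^*$ from Lagrangian optimality, and check feasibility and complementary slackness:

```python
# QP optimality conditions: min 1/2 x'Qx + c'x  s.t.  Ax <= b.
import numpy as np
from scipy.optimize import minimize

Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -5.0])
A = np.array([[1.0, 2.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([3.0, 0.0, 0.0])

Qinv = np.linalg.inv(Q)
P = A @ Qinv @ A.T                   # dual data, as on the previous slide
t = b + A @ Qinv @ c

def neg_q(mu):                       # dual problem after the sign change
    return 0.5 * mu @ P @ mu + t @ mu

mu_star = minimize(neg_q, np.ones(3), bounds=[(0, None)] * 3).x
x_star = -Qinv @ (c + A.T @ mu_star)             # Lagrangian optimality

print(np.all(A @ x_star <= b + 1e-6))            # primal feasibility: True
print(np.allclose(mu_star * (A @ x_star - b), 0, atol=1e-5))  # compl. slack.
```

On this instance the unconstrained minimizer $(1, 2.5)$ violates $x_1 + 2x_2 \le 3$, so that constraint is active at the optimum, with $\mu^* \approx (1.2, 0, 0)$ and $x^* \approx (0.4, 1.3)$.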

LINEAR EQUALITY CONSTRAINTS

The problem is

minimize $f(x)$
subject to $x \in X$, $g(x) \le 0$, $Ax = b$,

where $X$ is convex, $g(x) = \big(g_1(x), \ldots, g_r(x)\big)'$, and $f : X \to \mathbb{R}$ and $g_j : X \to \mathbb{R}$, $j = 1, \ldots, r$, are convex.

Convert the constraint $Ax = b$ to $Ax \le b$ and $-Ax \le -b$, with corresponding dual variables $\lambda^+ \ge 0$ and $\lambda^- \ge 0$.

The Lagrangian function is
$$f(x) + \mu' g(x) + (\lambda^+ - \lambda^-)'(Ax - b),$$
and by introducing a dual variable $\lambda = \lambda^+ - \lambda^-$, with no sign restriction, it can be written as
$$L(x, \mu, \lambda) = f(x) + \mu' g(x) + \lambda'(Ax - b).$$

The dual problem is

maximize $q(\mu, \lambda) \equiv \inf_{x \in X} L(x, \mu, \lambda)$
subject to $\mu \ge 0$, $\lambda \in \mathbb{R}^m$.

DUALITY AND OPTIMALITY COND.

Pure equality constraints:

(a) Assume that $f^*$ is finite and there exists $x \in \mathrm{ri}(X)$ such that $Ax = b$. Then $f^* = q^*$ and there exists a dual optimal solution.
(b) $f^* = q^*$, and $(x^*, \lambda^*)$ are a primal and dual optimal solution pair if and only if $x^*$ is feasible and
$$x^* \in \arg\min_{x \in X} L(x, \lambda^*).$$

Note: No complementary slackness for equality constraints.

Linear and nonlinear constraints:

(a) Assume that $f^*$ is finite, that there exists $x \in X$ such that $Ax = b$ and $g(x) < 0$, and that there exists $\bar{x} \in \mathrm{ri}(X)$ such that $A\bar{x} = b$. Then $q^* = f^*$ and there exists a dual optimal solution.
(b) $f^* = q^*$, and $(x^*, \mu^*, \lambda^*)$ are a primal and dual optimal solution pair if and only if $x^*$ is feasible, $\mu^* \ge 0$, and
$$x^* \in \arg\min_{x \in X} L(x, \mu^*, \lambda^*), \qquad \mu_j^* g_j(x^*) = 0, \quad \forall j.$$

MIT OpenCourseWare
http://ocw.mit.edu

6.253 Convex Analysis and Optimization
Spring 2012

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.