Convex analysis and profit/cost/support functions

Division of the Humanities and Social Sciences

KC Border
October 2004. Revised January 2009.

Let A be a subset of R^m. Convex analysts may give one of two definitions for the support function of A, as either an infimum or a supremum. Recall that the supremum of a set of real numbers is its least upper bound and the infimum is its greatest lower bound. If A has no upper bound, then by convention sup A = ∞, and if A has no lower bound, then inf A = −∞. For the empty set, sup ∅ = −∞ and inf ∅ = ∞; otherwise inf A ≤ sup A. (This makes a kind of sense: every real number is an upper bound for the empty set, since there is no member of the empty set that is greater than it, so the least upper bound must be −∞. Similarly, every real number is also a lower bound, so the infimum is ∞.) Thus support functions (as infima or suprema) may assume the values ∞ and −∞. By convention, 0 · ∞ = 0; if λ > 0 is a real number, then λ · ∞ = ∞ and λ · (−∞) = −∞; and if λ < 0 is a real number, then λ · ∞ = −∞ and λ · (−∞) = ∞. These conventions are used to simplify statements involving positive homogeneity.

Rather than choose one definition, I shall give the two definitions different names based on their economic interpretation.

Profit maximization

The profit function π_A of A is defined by

    π_A(p) = sup_{y ∈ A} p · y.

Cost minimization

The cost function c_A of A is defined by

    c_A(p) = inf_{y ∈ A} p · y.

Clearly π_A(p) = −c_A(−p) and c_A(p) = −π_A(−p). The profit function π_A is convex, lower semicontinuous, and positively homogeneous of degree 1; the cost function c_A is concave, upper semicontinuous, and positively homogeneous of degree 1.
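The following numerical sketch is not part of the notes; it simply illustrates the two definitions and the identity π_A(p) = −c_A(−p) for a small finite set A, where the supremum and infimum are attained. The particular set A, the price vector p, and the use of numpy are illustrative assumptions.

```python
import numpy as np

# A small finite "technology" A in R^2; for a finite set the sup and inf are attained.
A = np.array([[1.0, -2.0], [3.0, -5.0], [0.0, 0.0], [2.0, -3.5]])

def profit(p, A):
    """pi_A(p) = sup_{y in A} p . y  (a max here, since A is finite)."""
    return np.max(A @ p)

def cost(p, A):
    """c_A(p) = inf_{y in A} p . y  (a min here, since A is finite)."""
    return np.min(A @ p)

p = np.array([2.0, 1.0])
print(profit(p, A), cost(p, A))

# The duality noted in the text: pi_A(p) = -c_A(-p).
assert np.isclose(profit(p, A), -cost(-p, A))
```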

Positive homogeneity of π_A is obvious given the conventions on multiplication of infinities. To see that π_A is convex, let g_x be the linear (hence convex) function defined by g_x(p) = x · p. Then π_A(p) = sup_{x ∈ A} g_x(p). Since the pointwise supremum of a family of convex functions is convex, π_A is convex. Also, each g_x is continuous, hence lower semicontinuous, and the supremum of a family of lower semicontinuous functions is lower semicontinuous. See my notes on maximization.

The set {p ∈ R^m : π_A(p) < ∞} is a closed convex cone, called the effective domain of π_A and denoted dom π_A. The effective domain always includes the point 0, provided A is nonempty. By convention π_∅(p) = −∞ for all p, and we say that π_∅ is improper. If A = R^m, then 0 is the only point in the effective domain of π_A. It is easy to see that dom π_A is a cone, that is, if p ∈ dom π_A, then λp ∈ dom π_A for every λ ≥ 0. (Note that {0} is a (degenerate) cone.) It is also straightforward to show that dom π_A is convex: if π_A(p) < ∞ and π_A(q) < ∞, then for 0 ≤ λ ≤ 1, by convexity of π_A we have

    π_A(λp + (1 − λ)q) ≤ λ π_A(p) + (1 − λ) π_A(q) < ∞.

The closedness of dom π_A is more difficult.

Dually, positive homogeneity of c_A is obvious given the conventions on multiplication of infinities. Each g_x is also concave, and c_A(p) = inf_{x ∈ A} g_x(p); since the pointwise infimum of a family of concave functions is concave, c_A is concave. Each g_x is continuous, hence upper semicontinuous, and the infimum of a family of upper semicontinuous functions is upper semicontinuous. The set {p ∈ R^m : c_A(p) > −∞} is a closed convex cone, called the effective domain of c_A and denoted dom c_A; it too always includes 0 provided A is nonempty. By convention c_∅(p) = ∞ for all p, and we say that c_∅ is improper. If A = R^m, then 0 is the only point in the effective domain of c_A. As before, dom c_A is a cone, and it is convex: if c_A(p) > −∞ and c_A(q) > −∞, then for 0 ≤ λ ≤ 1, by concavity of c_A we have

    c_A(λp + (1 − λ)q) ≥ λ c_A(p) + (1 − λ) c_A(q) > −∞.

The closedness of dom c_A is likewise more difficult.

Recoverability

Separating Hyperplane Theorem. If A is a closed convex set and x does not belong to A, then there is a nonzero p satisfying p · x > π_A(p); equivalently (replacing p by −p), there is a nonzero p satisfying p · x < c_A(p).
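A quick sanity check, not from the notes: positive homogeneity and the convexity inequality for π_A can be verified numerically on a randomly generated finite set A. The sample size, random seed, and tolerance below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))            # a finite set in R^3 (illustrative)
profit = lambda p: np.max(A @ p)        # pi_A(p) = max_{y in A} p . y

for _ in range(1000):
    p, q = rng.normal(size=3), rng.normal(size=3)
    lam, t = rng.uniform(), rng.uniform(0.0, 10.0)
    # Positive homogeneity of degree 1: pi_A(t p) = t pi_A(p) for t >= 0.
    assert np.isclose(profit(t * p), t * profit(p))
    # Convexity: pi_A(lam p + (1 - lam) q) <= lam pi_A(p) + (1 - lam) pi_A(q).
    assert profit(lam * p + (1 - lam) * q) <= lam * profit(p) + (1 - lam) * profit(q) + 1e-9
```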

For a proof of the separating hyperplane theorem, see my notes. Note that, given our conventions, A may be empty. From this theorem we easily get the next proposition.

Proposition. The closed convex hull co A of A is given by

    co A = { y ∈ R^m : ( ∀ p ∈ R^m ) [ p · y ≤ π_A(p) ] }
         = { y ∈ R^m : ( ∀ p ∈ R^m ) [ p · y ≥ c_A(p) ] }.

Now let f be a continuous real-valued function defined on a closed convex cone D. We can extend f to all of R^m by setting it equal to ∞ outside of D if f is convex, or to −∞ outside of D if f is concave. If f is convex and positively homogeneous of degree 1, define

    A = { y ∈ R^m : ( ∀ p ∈ R^m ) [ p · y ≤ f(p) ] }.

Then A is closed and convex and f = π_A. If f is concave and positively homogeneous of degree 1, define

    A = { y ∈ R^m : ( ∀ p ∈ R^m ) [ p · y ≥ f(p) ] }.

Then A is closed and convex and f = c_A.

Extremizers are subgradients

If ỹ(p) maximizes p over A, that is, if ỹ(p) belongs to A and p · ỹ(p) ≥ p · y for all y ∈ A, then ỹ(p) is a subgradient of π_A at p. That is,

    π_A(p) + ỹ(p) · (q − p) ≤ π_A(q)   for all q ∈ R^m.

Dually, if ŷ(p) minimizes p over A, that is, if ŷ(p) belongs to A and p · ŷ(p) ≤ p · y for all y ∈ A, then ŷ(p) is a supergradient of c_A at p. That is,

    c_A(p) + ŷ(p) · (q − p) ≥ c_A(q)   for all q ∈ R^m.

To see this, note that for any q ∈ R^m, by definition we have q · ỹ(p) ≤ π_A(q). Now add π_A(p) − p · ỹ(p) = 0 to the left-hand side to get the subgradient inequality. The supergradient inequality follows in the same way from q · ŷ(p) ≥ c_A(q) and c_A(p) − p · ŷ(p) = 0.

Note that π_A(p) may be finite for a closed convex set A and yet there may be no maximizer. For instance, let A = { (x, y) ∈ R² : x < 0, y < 0, xy ≥ 1 }. Then for p = (1, 0) we have π_A(p) = 0, since (1, 0) · (−1/n, −n) = −1/n, but (1, 0) · (x, y) = x < 0 for each (x, y) ∈ A. Thus there is no maximizer in A. Similarly, c_A(p) may be finite with no minimizer: let A = { (x, y) ∈ R² : x > 0, y > 0, xy ≥ 1 }. Then for p = (1, 0) we have c_A(p) = 0, since (1, 0) · (1/n, n) = 1/n, but (1, 0) · (x, y) = x > 0 for each (x, y) ∈ A, so there is no minimizer in A.
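The no-maximizer example can be probed numerically. The short sketch below (illustrative, not from the notes) evaluates p · y along the points (−1/n, −n), which lie in A and whose values approach the supremum 0 from below without ever attaining it.

```python
import numpy as np

# A = {(x, y) : x < 0, y < 0, xy >= 1}, p = (1, 0): p . (x, y) = x < 0 on A, yet sup = 0.
p = np.array([1.0, 0.0])
for n in [1, 10, 100, 1000, 10**6]:
    y = np.array([-1.0 / n, -float(n)])          # a point of A, since (-1/n)(-n) = 1
    assert y[0] < 0 and y[1] < 0 and y[0] * y[1] >= 1 - 1e-12
    print(n, p @ y)                              # prints -1/n, tending to 0 from below
```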

It turns out that if there is no maximizer of p over A, then π_A has no subgradient at p. In fact, the following is true, but I won't present the proof, which relies on the Separating Hyperplane Theorem. (See my notes for a proof.)

Theorem. If A is closed and convex, then x is a subgradient of π_A at p if and only if x ∈ A and x maximizes p over A. Likewise, x is a supergradient of c_A at p if and only if x ∈ A and x minimizes p over A.

Comparative statics

Consequently, if A is closed and convex and ỹ(p) is the unique maximizer of p over A, then π_A is differentiable at p and

    ỹ(p) = π_A′(p).

Similarly, if ŷ(p) is the unique minimizer of p over A, then c_A is differentiable at p and

    ŷ(p) = c_A′(p).

To see that differentiability of π_A implies that the profit maximizer is unique and equal to the gradient, consider q of the form p ± λe_i, where e_i is the i-th unit coordinate vector and λ > 0. The subgradient inequality for q = p + λe_i is

    λ ỹ(p) · e_i ≤ π_A(p + λe_i) − π_A(p),

and for q = p − λe_i it is

    −λ ỹ(p) · e_i ≤ π_A(p − λe_i) − π_A(p).

Dividing these by λ and −λ respectively yields

    [ π_A(p) − π_A(p − λe_i) ] / λ ≤ ỹ_i(p) ≤ [ π_A(p + λe_i) − π_A(p) ] / λ.

Letting λ ↓ 0 yields ỹ_i(p) = D_i π_A(p). The same argument applied to the supergradient inequality gives

    [ c_A(p + λe_i) − c_A(p) ] / λ ≤ ŷ_i(p) ≤ [ c_A(p) − c_A(p − λe_i) ] / λ,

and letting λ ↓ 0 yields ŷ_i(p) = D_i c_A(p).

Thus if π_A is twice differentiable at p, that is, if the maximizer ỹ(p) is differentiable with respect to p, then

    D_j ỹ_i(p) = D_{ij} π_A(p),

and if c_A is twice differentiable at p, that is, if the minimizer ŷ(p) is differentiable with respect to p, then

    D_j ŷ_i(p) = D_{ij} c_A(p).
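Here is a numerical illustration (not from the notes) of the envelope relation ỹ(p) = π_A′(p) for a finite set A, at a price p where the maximizer is generically unique: central finite differences of π_A recover the maximizing point. The set A, the price p, the step size, and the tolerance are all assumptions of the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 3))                    # finite set; maximizer is generically unique
profit = lambda p: np.max(A @ p)                # pi_A(p)
maximizer = lambda p: A[np.argmax(A @ p)]       # y~(p), a profit-maximizing point

p = np.array([1.0, 0.5, -0.3])
h = 1e-6
grad = np.array([(profit(p + h * e) - profit(p - h * e)) / (2 * h) for e in np.eye(3)])

print(grad)
print(maximizer(p))
assert np.allclose(grad, maximizer(p), atol=1e-4)   # the envelope relation y~(p) = grad pi_A(p)
```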

Consequently, the matrix [ D_j ỹ_i(p) ] is positive semidefinite (it is the Hessian of the convex function π_A), while the matrix [ D_j ŷ_i(p) ] is negative semidefinite (the Hessian of the concave function c_A). In particular, D_i ỹ_i ≥ 0 and D_i ŷ_i ≤ 0.

Even without twice differentiability, from the subgradient inequality we have

    π_A(p) + ỹ(p) · (q − p) ≤ π_A(q)
    π_A(q) + ỹ(q) · (p − q) ≤ π_A(p),

so adding the two inequalities we get

    ( ỹ(p) − ỹ(q) ) · (p − q) ≥ 0.

Similarly, from the supergradient inequality,

    c_A(p) + ŷ(p) · (q − p) ≥ c_A(q)
    c_A(q) + ŷ(q) · (p − q) ≥ c_A(p),

so adding the two inequalities we get

    ( ŷ(p) − ŷ(q) ) · (p − q) ≤ 0.

Thus if q differs from p only in its i-th component, say q_i = p_i + Δp_i, then we have Δỹ_i Δp_i ≥ 0. Dividing by the positive quantity (Δp_i)² does not change this inequality, so Δỹ_i / Δp_i ≥ 0. In the same way, Δŷ_i Δp_i ≤ 0, so Δŷ_i / Δp_i ≤ 0.

Cyclical monotonicity and empirical restrictions

A real function g : X ⊂ R → R is increasing if x ≥ y implies g(x) ≥ g(y); that is, g(x) − g(y) and x − y have the same sign. An equivalent way to say this is

    ( g(x) − g(y) ) (x − y) ≥ 0   for all x, y,

which can be rewritten as

    g(x)(y − x) + g(y)(x − y) ≤ 0   for all x, y.

A real function g : X ⊂ R → R is decreasing if x ≥ y implies g(x) ≤ g(y); that is, g(x) − g(y) and x − y have the opposite sign. Equivalently,

    ( g(x) − g(y) ) (x − y) ≤ 0   for all x, y,

which can be rewritten as

    g(x)(y − x) + g(y)(x − y) ≥ 0   for all x, y.
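The inequality (ỹ(p) − ỹ(q)) · (p − q) ≥ 0 and its own-price consequence can be checked numerically for any finite A. The sketch below is not from the notes; the set A, the seed, the price perturbation, and the tolerances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(25, 4))                 # a finite set of production plans (illustrative)
ytilde = lambda p: A[np.argmax(A @ p)]       # a profit-maximizing point y~(p)

for _ in range(1000):
    p, q = rng.normal(size=4), rng.normal(size=4)
    # Monotonicity: (y~(p) - y~(q)) . (p - q) >= 0.
    assert (ytilde(p) - ytilde(q)) @ (p - q) >= -1e-9

# Own-price version: raising only price i cannot lower the i-th component of y~.
p = rng.normal(size=4)
for i in range(4):
    q = p.copy()
    q[i] += 0.5
    assert (ytilde(q)[i] - ytilde(p)[i]) * (q[i] - p[i]) >= -1e-9
```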

We can generalize the notions of increasing and decreasing to a function g from R^m into R^m like this:

Definition. A function g : X ⊂ R^m → R^m is monotone (increasing) if

    g(x) · (y − x) + g(y) · (x − y) ≤ 0   for all x, y ∈ X,

and monotone (decreasing) if the reverse inequality holds for all x, y ∈ X.

We have already seen that the (sub)gradient of π_A is monotone (increasing) and that the (super)gradient of c_A is monotone (decreasing). More is true.

Definition. A mapping g : X ⊂ R^m → R^m is cyclically monotone (increasing) if for every cycle x_0, x_1, …, x_n, x_{n+1} = x_0 in X we have

    g(x_0) · (x_1 − x_0) + ⋯ + g(x_n) · (x_{n+1} − x_n) ≤ 0,

and cyclically monotone (decreasing) if the reverse inequality holds for every such cycle.

If f : X ⊂ R^m → R is convex, and g : X ⊂ R^m → R^m is a selection from the subdifferential of f, that is, if g(x) is a subgradient of f at x for every x, then g is cyclically monotone (increasing). Likewise, if f is concave and g is a selection from the superdifferential of f, then g is cyclically monotone (decreasing).

Proof: Let x_0, x_1, …, x_n, x_{n+1} = x_0 be a cycle in X. From the subgradient inequality at x_i we have

    f(x_i) + g(x_i) · (x_{i+1} − x_i) ≤ f(x_{i+1}),

or

    g(x_i) · (x_{i+1} − x_i) ≤ f(x_{i+1}) − f(x_i),

for each i = 0, …, n. Summing gives

    Σ_{i=0}^{n} g(x_i) · (x_{i+1} − x_i) ≤ 0,

where the right-hand side takes into account f(x_{n+1}) = f(x_0). The concave case is the same with the inequalities reversed.
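A brief numerical illustration, not from the notes: for a finite A, the profit-maximizing selection ỹ is a selection from the subdifferential of π_A, so every cycle sum in the definition above should be nonpositive. The sample sizes, cycle lengths, and tolerance below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(25, 3))                 # a finite set (illustrative)
ytilde = lambda p: A[np.argmax(A @ p)]       # a selection from the subdifferential of pi_A

for _ in range(200):
    n = rng.integers(2, 8)
    xs = rng.normal(size=(n, 3))             # a cycle x_0, ..., x_{n-1}, then back to x_0
    total = sum(ytilde(xs[i]) @ (xs[(i + 1) % n] - xs[i]) for i in range(n))
    assert total <= 1e-9                     # cyclical monotonicity (increasing)
```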

Corollary. The profit maximizing points correspondence ỹ is cyclically monotone (increasing), and the cost minimizing points correspondence ŷ is cyclically monotone (decreasing).

Remarkably, the converse is true.

Theorem (Rockafellar). Let X be a convex set in R^m and let φ : X ⇉ R^m be a cyclically monotone (increasing) correspondence (that is, every selection from φ is a cyclically monotone (increasing) function). Then there is a lower semicontinuous convex function f : X → R such that φ(x) ⊂ ∂f(x) for every x. Dually, if φ is cyclically monotone (decreasing), then there is an upper semicontinuous concave function f : X → R such that φ(x) is contained in the superdifferential of f at x for every x.
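To give a feel for the construction behind Rockafellar's theorem, here is a sketch, not from the notes and not the general proof, restricted to a finite sample of points and to the particular cyclically monotone (increasing) map g(x) = x, the gradient of the convex function ‖x‖²/2. The candidate potential f(x) = max_j [ V_j + g(x_j) · (x − x_j) ], where V_j is the largest chain sum from a base point x_0 to x_j, is a maximum of affine functions, hence convex, and g(x_i) is a subgradient of f at every sample point. All names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(15, 2))        # sample points x_0, ..., x_14 (illustrative)
g = lambda x: x                     # gradient of ||x||^2 / 2, cyclically monotone (increasing)
G = np.array([g(x) for x in X])

# step[i, j] = g(x_i) . (x_j - x_i), the gain from appending the step x_i -> x_j to a chain.
step = G @ X.T - np.sum(G * X, axis=1, keepdims=True)

# V_j = largest chain sum from x_0 to x_j; Bellman-Ford style relaxation converges because
# cyclical monotonicity rules out positive cycles.
V = np.full(len(X), -np.inf)
V[0] = 0.0
for _ in range(len(X)):
    V = np.maximum(V, np.max(V[:, None] + step, axis=0))

# f is a maximum of affine functions, hence convex.
f = lambda x: np.max(V + G @ x - np.sum(G * X, axis=1))

# g(x_i) is a subgradient of f at each sample point x_i.
for i in range(len(X)):
    for j in range(len(X)):
        assert f(X[i]) + G[i] @ (X[j] - X[i]) <= f(X[j]) + 1e-9
```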
